Analyze data-transformation requirements for the target downstream system and implement the corresponding set of transformations on ingested data.
Analyze source data and implement the required ETL into Google Cloud Storage/BigQuery using GCP tools such as Dataproc and Dataflow.
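For illustration only, a transformation step of the kind described above might look like the following minimal Python sketch. The record fields and cleaning rules here are hypothetical assumptions, not part of the role description; in practice such logic would typically run inside a PySpark or Dataflow pipeline before loading into BigQuery.

```python
# Hypothetical example: normalize ingested records before loading them
# into a downstream target such as BigQuery. Field names and rules are
# illustrative assumptions only.

def transform_record(record: dict) -> dict:
    """Normalize one ingested record for the target schema."""
    return {
        "user_id": int(record["user_id"]),            # enforce integer type
        "email": record["email"].strip().lower(),     # trim and lowercase
        "country": record.get("country", "UNKNOWN").upper(),  # default + uppercase
    }

raw = [
    {"user_id": "42", "email": "  Alice@Example.COM ", "country": "us"},
    {"user_id": "7", "email": "bob@example.com"},
]
clean = [transform_record(r) for r in raw]
print(clean)
```

In a real pipeline, the same per-record function would be applied via a Spark `map`/UDF on Dataproc or a `ParDo` in Dataflow, with the output written to Cloud Storage or BigQuery.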
- Strong programming experience in Python, PySpark, and Unix shell scripting.
- Experience working with databases and SQL.
- Working knowledge of Google Cloud technologies.
- Exposure to data lake/big data technologies (batch processing/streaming) preferred.
- Experience working with agile methodologies preferred.
- Good written and verbal communication skills.