Job Description:

Analyze data-transformation requirements for the target downstream systems and implement the corresponding set of transformations on ingested data.

Analyze source data and implement the required ETL into Google Cloud Storage/BigQuery using GCP tools such as Dataproc and Dataflow.

Required Skills:

  • Good programming experience in Python, PySpark, and Unix shell scripting.
  • Experience working with databases and SQL.
  • Working knowledge of Google Cloud technologies.
  • Exposure to data lake/big data technologies (batch processing/streaming) preferred.
  • Experience working in agile methodologies preferred.
  • Good written and verbal communication skills.
Job Location: Pune
Experience (Years): 3-6 Years
