Senior GCP Data Engineer

acme services • India
Remote

Job Description


Job Title: GCP Data Engineer

Experience: 5 to 9 Years

Budget: 15 to 20 LPA

Location: Gurgaon (Hybrid)

Candidates willing to relocate are welcome, as interviews will be conducted virtually.


Skills: GCP, BigQuery, Dataproc, Dataflow, Python, SQL


Role Overview

We are looking for an experienced GCP Data Engineer with strong hands-on expertise in building, managing, and optimizing large-scale data pipelines on Google Cloud Platform. The ideal candidate should have solid experience with BigQuery, Dataproc, Dataflow, and end-to-end data engineering workflows.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines on GCP using BigQuery, Dataproc, Dataflow, Pub/Sub, and Cloud Storage.
  • Build and optimize data ingestion, ETL/ELT processes, and distributed data processing workflows.
  • Work with large datasets to ensure high performance, reliability, and data quality within data platforms.
  • Implement data transformation logic using Python, SQL, PySpark, or similar technologies (a minimal PySpark sketch of this kind of work follows this list).
  • Optimize BigQuery queries, partitioning, clustering, and cost management.
  • Collaborate with analytics, BI, and application teams to deliver data solutions aligned with business needs.
  • Ensure security, governance, and compliance across the GCP data ecosystem.
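
For a sense of the day-to-day work, here is a minimal, purely illustrative PySpark sketch: reading raw files from Cloud Storage on Dataproc, applying a simple transformation, and writing to a date-partitioned, clustered BigQuery table via the Spark-BigQuery connector. The bucket, project, dataset, and table names are hypothetical placeholders, not part of this posting.

```python
# Illustrative sketch only: bucket/project/dataset/table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest-example").getOrCreate()

# Read raw CSV files landed in a (hypothetical) Cloud Storage bucket.
orders = (
    spark.read.option("header", "true")
    .csv("gs://example-raw-bucket/orders/*.csv")
)

# Simple transformation: type the timestamp, derive a partition date, drop bad rows.
cleaned = (
    orders.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_id").isNotNull())
)

# Write to a date-partitioned, clustered BigQuery table using the
# spark-bigquery connector (assumes the connector is available on the cluster).
(
    cleaned.write.format("bigquery")
    .option("table", "example-project.analytics.orders")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .option("partitionField", "order_date")
    .option("clusteredFields", "customer_id")
    .mode("append")
    .save()
)
```

On Dataproc, a script like this would typically be submitted with `gcloud dataproc jobs submit pyspark`.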


Required Skills & Experience

  • 5–10 years of total experience, with at least 3 years in GCP data engineering.
  • Strong hands-on experience in BigQuery, Dataproc, Dataflow, and Cloud Composer/Airflow (a minimal orchestration sketch follows this list).
  • Strong SQL and Python skills; experience in Spark or PySpark preferred.
  • Good understanding of data warehousing concepts, ETL/ELT architectures, and distributed data processing.
  • Experience with CI/CD pipelines, Git, and DevOps practices in cloud environments.
  • Knowledge of Pub/Sub, Cloud Storage, and Cloud Functions is an added advantage.
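
Purely for illustration, a minimal Cloud Composer (Airflow) DAG sketch of the orchestration and BigQuery skills listed above. The project, dataset, and table identifiers are hypothetical placeholders; the operator used, `BigQueryInsertJobOperator`, comes from the standard Google provider package.

```python
# Illustrative sketch only; project/dataset/table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_rollup_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Scheduled BigQuery ELT step that aggregates the previous day's orders.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, customer_id, SUM(amount) AS total_amount
                    FROM `example-project.analytics.orders`
                    WHERE order_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
                    GROUP BY order_date, customer_id
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "orders_daily_rollup",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )
```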
