GCP Data Engineer

Coltech, United States
Remote

Job Description


I’m looking for an experienced GCP Data Engineer to join a high-impact data transformation programme on a 12-month contract.


Hands-on GCP experience required


💰 $100–$120/hour

🌍 Fully Remote (US timezone)

📅 12 months (strong likelihood of extension)


🔍 The Role


You’ll be hands-on building and optimising a modern data platform on Google Cloud Platform (GCP), working closely with architects and stakeholders to deliver scalable data solutions.


This role is ideal for someone who enjoys building robust pipelines, working with large datasets, and solving real-world data challenges in GCP.


🧠 Key Responsibilities


  • Build and maintain scalable data pipelines (batch & real-time)
  • Work with large datasets in BigQuery, optimising performance and cost
  • Develop data processing solutions using Dataflow (Apache Beam)
  • Integrate data using Pub/Sub and Cloud Storage
  • Support data modelling and transformation workflows
  • Collaborate with engineers, analysts, and business stakeholders
  • Follow best practices in data quality, testing, and deployment


☁️ Tech Stack (GCP-focused)


BigQuery · Dataflow (Apache Beam) · Pub/Sub · Cloud Storage · Cloud Composer / Dataform

  • Strong preference for candidates with hands-on GCP experience (not just general cloud exposure)


✅ Requirements


  • Proven experience as a Data Engineer working on GCP
  • Strong experience building data pipelines at scale
  • Experience with both batch and real-time processing
  • Solid SQL skills and experience with data warehousing (BigQuery)
  • Comfortable working in a fast-paced, collaborative environment


⭐ Nice to Have

  • Experience migrating data platforms to GCP
  • BigQuery cost optimisation experience
  • Exposure to CI/CD or infrastructure as code (e.g., Terraform)


📩 Interested?


Drop me a message or comment below, and I’ll reach out with more details.

#Hiring #DataEngineer #GCP #GoogleCloud #BigData #RemoteJobs #ContractJobs

