Design and build scalable data pipelines, support cloud data warehouse architecture, and ensure data quality.
Overview
We are looking for a Data Engineer to design and build scalable data pipelines while supporting the architecture and optimization of a modern cloud data warehouse. This role will focus on ingesting data from multiple sources, transforming it into reliable datasets, and ensuring it is efficiently structured for downstream analytics and reporting.
This is a 6-month, fully remote, rolling contract.
This role pays $70–$80 (W2) or $80–$90 (LLC).
Key Responsibilities
- Design and develop scalable ETL/ELT pipelines to ingest data from APIs, SaaS platforms, and internal systems
- Build and maintain data workflows for batch and near-real-time processing
- Develop data transformation logic using SQL and Python within the warehouse environment
- Optimize data warehouse performance, storage design, and query efficiency
- Implement data quality checks and monitoring across ingestion and transformation layers
- Collaborate with analytics teams to ensure datasets are reliable and properly structured for downstream use
- Maintain documentation and best practices for pipeline development and warehouse modeling
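To illustrate the data-quality-check responsibility above, here is a minimal sketch in plain Python. The function names, fields, and thresholds are hypothetical examples, not part of this role's actual codebase; in practice such checks would typically run inside the warehouse or an orchestrated pipeline.

```python
# Illustrative data-quality checks on an ingested batch.
# All names and values below are hypothetical.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_row_count(rows, minimum):
    """True if the batch meets a minimum row-count threshold."""
    return len(rows) >= minimum

batch = [
    {"id": 1, "amount": 19.99},
    {"id": 2, "amount": None},
]

# Flag the batch if it is too small or has missing amounts.
assert check_row_count(batch, minimum=1)
bad_rows = check_not_null(batch, "amount")  # → [1]
```

Checks like these would sit between the ingestion and transformation layers, failing or quarantining a batch before it reaches downstream analytics tables.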
Required Experience
- Strong experience building data pipelines using Python and SQL
- Hands-on experience with modern cloud data warehouses (Snowflake, BigQuery, or Redshift)
- Experience with workflow orchestration tools such as Airflow or Prefect
- Experience processing large datasets using Spark or distributed data frameworks
- Strong understanding of data modeling and warehouse design principles
- Experience integrating data from APIs, databases, and SaaS applications
Nice to Have
- Experience working in high-volume data environments
- Familiarity with dbt for transformation workflows
- Exposure to machine learning data pipelines
Tech Stack Example
- Python
- SQL
- Snowflake / BigQuery
- Airflow
- Spark / PySpark
- Cloud (AWS or GCP)