Job Title: Data Engineer – ETL Migration & Pipeline Development
Duration: 2-Year Contract
Location: 100% Remote
Client: SOMOS
We are seeking a Data Engineer contractor (100% remote) to support and execute the migration of ETL pipelines from Matillion to Apache Airflow. This role focuses on rebuilding pipelines, improving reliability, and enabling scalable, code-based data workflows.
This is a hands-on role requiring someone who can ramp quickly, work with existing ETL logic, and contribute within the first 1–2 weeks.
Responsibilities:
- Migrate existing ETL pipelines from Matillion to Apache Airflow while preserving logic and dependencies
- Develop and maintain Airflow DAGs with proper scheduling, retries, and failure handling
- Reverse engineer existing Matillion jobs and translate them into Python-based workflows
- Build and optimize data pipelines across systems such as S3, Snowflake, and relational databases
- Perform data validation and reconciliation between source and target systems
- Write and optimize SQL transformations for large-scale datasets
- Implement monitoring, alerting, and error handling for pipelines
- Collaborate with data, platform, and analytics teams to ensure smooth migration and deployment
- Document pipelines, workflows, and operational processes
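One of the responsibilities above — validating and reconciling data between source and target systems — is often approached with row-count and checksum comparisons. The sketch below is illustrative only: the function names and comparison strategy are assumptions, and a real migration would compare Matillion outputs against the rebuilt Airflow pipeline's outputs (e.g., in Snowflake) rather than in-memory tuples.

```python
import hashlib

def row_checksum(row):
    """Deterministic checksum of a row (a tuple of values)."""
    joined = "|".join("" if v is None else str(v) for v in row)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows):
    """Compare source and target extracts by row count and per-row checksum.

    Returns a summary of count deltas and rows missing on either side.
    """
    src = {row_checksum(r) for r in source_rows}
    tgt = {row_checksum(r) for r in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": len(src - tgt),
        "unexpected_in_target": len(tgt - src),
        "match": src == tgt and len(source_rows) == len(target_rows),
    }

# Example: one row drifted during migration (value "b" became "B")
source = [(1, "a"), (2, "b"), (3, "c")]
target = [(1, "a"), (2, "B"), (3, "c")]
print(reconcile(source, target))
```

A checksum set catches value-level drift that a bare row count would miss; for large tables the same idea is usually pushed down into SQL (e.g., hashing and aggregating in the warehouse) instead of pulling rows into Python.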
Required Qualifications:
- 3+ years of experience in Data Engineering or ETL development
- Strong hands-on experience with Apache Airflow or similar orchestration tools
- Proficiency in Python for building data pipelines and workflows
- Strong SQL skills and experience working with large datasets
- Experience with cloud platforms (AWS preferred: S3, RDS/Aurora, IAM)
- Experience with cloud data warehouses (Snowflake preferred or similar)
- Experience building and maintaining ETL/ELT pipelines
- Familiarity with Git and CI/CD workflows
Preferred Qualifications:
- Experience migrating ETL workflows from tools like Matillion, Informatica, Talend, or SSIS to Airflow
- Experience with Airflow in containerized environments (Docker, Kubernetes/EKS)
- Familiarity with data validation, reconciliation, and pipeline testing strategies
- Experience with monitoring tools such as Datadog or CloudWatch
- Understanding of data modeling or medallion architecture