Job Description
Factor Eleven's tech department is the in-house tech scale-up responsible for our SaaS product suite, which offers localized digital advertising to enterprises of all shapes and sizes. We power the engine Factor Eleven is built on and elevate the possibilities of our product daily. Together, we're fulfilling our ambition to be a top ad-tech provider by continuously raising the quality and expanding the capabilities of the entire platform, as well as of our engineering and product organization.
Join our amazing team in our mission to move localized digital advertising forward, and enjoy the freedom, camaraderie, and perks of our fully remote operations.
YOUR MISSION
We are looking for a Senior Data Engineer to shape and scale our open lakehouse data platform built on Snowflake (compute) and Apache Iceberg tables on AWS S3. You’ll own end-to-end design and evolution of robust, scalable pipelines that power analytics, ML, and customer-facing features across our SaaS digital advertising products.
You’ll collaborate with Product and Engineering to keep systems performant, reliable, and aligned with the business. You’ll ship hands-on, lead technical design, run code reviews, and mentor others in modern data engineering practices.
To thrive here, you need
- Deep expertise in open lakehouse architectures – Snowflake as compute + Apache Iceberg on cloud object storage
- Hands-on production experience with:
  - Iceberg catalog management (Apache Polaris, Glue, or Hive)
  - Time-travel / snapshot queries
  - Partition evolution & schema evolution safety
  - Snowflake Iceberg external tables, query tuning, clustering, cost control, RBAC/masking
- Strong production proficiency in dbt – authoring complex models, incremental logic, snapshots, exposures, custom tests, and CI/CD integration using dbt Core + Snowflake/Iceberg adapters
- Experience with the AWS cloud platform
- DataOps / IaC (Terraform, dbt Cloud)
- Real-time streaming (Apache Kafka/Flink, AWS Kinesis)
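To illustrate the time-travel bullet above: an Iceberg table keeps a log of immutable snapshots, and a time-travel query resolves a timestamp to the latest snapshot committed at or before it. A minimal Python sketch of that resolution logic (a toy model for illustration only; in production this lives inside engines such as Snowflake or libraries such as pyiceberg):

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass(frozen=True)
class Snapshot:
    snapshot_id: int
    timestamp_ms: int   # commit time of this table version
    manifest_list: str  # pointer to the data files for this version


def snapshot_as_of(snapshots: List[Snapshot], ts_ms: int) -> Optional[Snapshot]:
    """Return the latest snapshot committed at or before ts_ms (time travel)."""
    eligible = [s for s in snapshots if s.timestamp_ms <= ts_ms]
    return max(eligible, key=lambda s: s.timestamp_ms, default=None)


# Three commits to a hypothetical table (paths are invented)
log = [
    Snapshot(1, 1_000, "s3://bucket/meta/m1.avro"),
    Snapshot(2, 2_000, "s3://bucket/meta/m2.avro"),
    Snapshot(3, 3_000, "s3://bucket/meta/m3.avro"),
]

print(snapshot_as_of(log, 2_500).snapshot_id)  # → 2
```

A query "AS OF" a timestamp before the first commit resolves to no snapshot at all, which is why the helper returns `None` rather than failing.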
Your Responsibilities
- Design, build, and maintain scalable data pipelines, architectures, and platforms with a focus on reliability and efficiency
- Implement ETL/ELT processes with rigorous quality checks and governance to ensure data accuracy and consistency
- Mentor data engineers, share best practices, and foster a culture of learning and ownership
- Partner with Engineering, Product, and Business to translate requirements into high-impact data solutions
- Own project execution end-to-end: scoping, estimation, delivery, and communication
- Champion testing, documentation, and observability through design reviews and technical leadership
- Stay ahead of industry trends in cloud data, big data processing, and real-time analytics
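The quality-check responsibility above often starts as small gate functions run between load and publish. A hedged sketch of the idea (column and row names are illustrative, not our actual pipeline):

```python
from typing import Dict, List


def check_not_null(rows: List[dict], column: str) -> List[str]:
    """Return one error per row where `column` is missing or None."""
    return [
        f"row {i}: null in {column!r}"
        for i, row in enumerate(rows)
        if row.get(column) is None
    ]


def check_unique(rows: List[dict], column: str) -> List[str]:
    """Return an error for each value of `column` that appears more than once."""
    counts: Dict[object, int] = {}
    for row in rows:
        counts[row.get(column)] = counts.get(row.get(column), 0) + 1
    return [f"duplicate {column!r}={v!r}" for v, n in counts.items() if n > 1]


rows = [
    {"id": 1, "spend": 10.0},
    {"id": 2, "spend": None},
    {"id": 2, "spend": 7.5},
]
errors = check_not_null(rows, "spend") + check_unique(rows, "id")
# A real pipeline would fail or quarantine the batch when errors is non-empty.
```

In practice the same checks are usually declared as dbt tests (`not_null`, `unique`) rather than hand-written, but the gating logic is the same.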
- 5+ years in data engineering, with hands-on production experience building open lakehouses using Snowflake + Apache Iceberg
- Strong production track record with dbt – complex models, dependencies, incremental logic, custom tests, CI/CD
- Advanced SQL + Python; you build idempotent, observable, schema-safe pipelines
- Deep knowledge of data modelling trade-offs, distributed systems, and big data frameworks
- Excellent communicator – you distil complex topics for technical and non-technical audiences with empathy
- Proven collaborator with strong problem-solving, mentoring, and project management skills
- (Bonus) Built and maintained a production-grade open lakehouse from scratch (Iceberg + catalog + compute)
- (Bonus) Familiar with DataOps, IaC, or real-time streaming pipelines
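"Idempotent" in the SQL + Python bullet above means that re-running a load with the same batch leaves the target unchanged: merge on a key instead of blindly appending. A minimal sketch of the property (an in-memory toy; table and key names are invented, and real pipelines would express this as a `MERGE` statement or a dbt incremental model):

```python
from typing import Dict, List


def merge_upsert(target: Dict[int, dict], batch: List[dict], key: str = "id") -> Dict[int, dict]:
    """Merge batch rows into target keyed by `key`. Re-running the same
    batch is a no-op, which makes retries and backfills safe."""
    merged = dict(target)  # copy so the input is never mutated
    for row in batch:
        merged[row[key]] = row
    return merged


target: Dict[int, dict] = {}
batch = [{"id": 1, "clicks": 3}, {"id": 2, "clicks": 5}]

once = merge_upsert(target, batch)
twice = merge_upsert(once, batch)  # replaying the batch changes nothing
print(once == twice)  # → True
```

A plain append-based load would double the rows on retry; keying the write is what makes the replay safe.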