Seeking a Senior Data Engineer to build and evolve cloud-native data lake architectures on AWS. Responsibilities include developing ETL/ELT pipelines, implementing real-time analytics, and ensuring data reliability. Requires strong Python skills, AWS experience, and familiarity with modern data engineering tools.
Job Description
Senior Data Engineer – Product-Focused Scale-Up – Germany (Remote)
Up to €85,000
Germany – Full Remote
German-speaking (C1)
We're pleased to be supporting an exciting German scale-up as they look to make their first dedicated Data Engineer hire. Until now, data engineering has been handled by the backend engineering team. With ambitious plans to scale, you'll be crucial to building out cloud-native data lake architectures, ETL pipelines, and real-time analytics.
What you’ll work on
- You’ll contribute to the design, build-out, and ongoing evolution of a cloud-native data lake on AWS, using services such as S3, Glue, and Athena alongside Apache Iceberg.
- You’ll create and maintain robust ETL/ELT workflows using Apache Airflow, dbt, and Apache Spark, handling everything from real-time event ingestion to curated datasets following a medallion-style architecture.
- You’ll own the implementation of a real-time game analytics and event tracking platform across our gaming ecosystem, capturing player activity, system performance, and commercial KPIs.
- You’ll define and enforce data reliability standards through automated validation (dbt tests, Great Expectations) and build monitoring to ensure availability and correctness.
- You’ll partner closely with backend, frontend, statistics, mathematics, and finance teams to deliver fast, reliable analytics and actionable reporting.
- You’ll improve performance and cost efficiency across storage and query layers, and automate infrastructure and data workflows using Infrastructure as Code principles.
What you bring
- A degree in computer science or a comparable technical background.
- Roughly 2–4+ years of hands-on experience in data engineering or a closely related role.
- Strong Python skills, particularly with pandas, pyarrow, and boto3.
- Solid experience working with cloud platforms, ideally AWS, and modern data lake stacks.
- Prior exposure to messaging systems (e.g. RabbitMQ) and workflow orchestration tools such as Airflow.
- Practical knowledge of columnar data formats like Parquet and table formats such as Apache Iceberg or Delta Lake.
- Experience with dbt, Spark, or comparable transformation frameworks; familiarity with BI tools like Tableau or Metabase is a plus.
- A clear understanding of data modeling, partitioning, and performance tuning in analytical systems.
- A proactive attitude toward AI-driven systems and a willingness to take ownership of internal AI solutions and their ongoing improvement.
- Bonus points if you’ve worked in gaming and understand game-specific data, events, and metrics.
- Fluent German (C1 level) and solid English.
📩 Apply directly, or get in touch for a confidential chat.
Sorry, no visa sponsorship is available for this role. Fully remote within Germany.
#DataEngineer #DataEngineering #RemoteJobs #GermanyJobs #Hiring #AWS #DataLake