Staff/Principal Data Engineer - Cyber Security SaaS (Remote Europe)

Wave Talent β€’ European Union
Remote
This position is no longer accepting applications.
AI Summary

Wave Talent is seeking a Staff/Principal Data Engineer for a fully remote role in Europe. You will architect and build large-scale data systems for a well-funded cyber security SaaS startup, focusing on AI-driven vulnerability detection. This role requires deep expertise in data pipelines, cloud services, and lakehouse technologies.

Key Highlights
Architect and build production-grade data pipelines for massive data volumes.
Design and implement data lake architecture and real-time ingestion systems.
Work at the intersection of cyber security, AI, and distributed systems.
Technical Skills Required
Python, Kafka, Spark, AWS S3, AWS EMR, Apache Iceberg, Delta Lake, Hudi, Parquet, ORC, PostgreSQL, AWS RDS
Benefits & Perks
Salary up to ~Β£135k / €155k
Equity up to 0.35% (valued at ~$450k)
Fully remote anywhere in Europe
Quarterly team get-togethers

Job Description


πŸ’» Job Title: Data Engineer (Staff/Principal)

πŸ’° Salary: up to ~Β£135k / €155k

πŸ“ˆ Equity: up to 0.35% (valued at ~$450k)

πŸ“ Location: fully remote anywhere in Europe (quarterly team get togethers)

πŸ” Company: Cyber Security SaaS start-up building AI agents to identify and resolve cloud vulnerabilities

πŸ‘₯ Team: ~30

πŸ’Έ Funding: $30m+ (Series A)


We’re looking for a Staff/Principal level Data Engineer to take ownership of the large-scale data systems that power AI-driven vulnerability and threat detection, at a well-funded start-up operating at the intersection of cyber security, AI and distributed systems.


Joining as one of the early team members, you’ll architect and build production-grade data pipelines capable of handling massive data volumes, designed with performance, reliability and cost efficiency in mind. You’ll play a critical part in designing the data lake architecture, real-time ingestion pipelines and transformation systems that enable our platform to process security and telemetry data at scale.
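To give a concrete flavour of the kind of pipeline this involves, here is a minimal sketch, assuming Spark Structured Streaming reading security events from Kafka and appending them to an Apache Iceberg table on S3; the broker, topic, bucket, schema and table names are illustrative rather than taken from the role:

```python
# A minimal, illustrative streaming ingestion job: Kafka -> Spark -> Iceberg.
# Assumes the spark-sql-kafka and iceberg-spark-runtime packages are on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

# Spark session with a Hadoop-style Iceberg catalog backed by S3 (names are hypothetical).
spark = (
    SparkSession.builder
    .appName("telemetry-ingest")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

# Assumed shape of an incoming security event.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("host", StringType()),
    StructField("finding", StringType()),
    StructField("observed_at", TimestampType()),
])

# Read raw events from a Kafka topic (broker address and topic are placeholders).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "security-telemetry")
    .load()
)

# Kafka values arrive as bytes: decode, parse the JSON payload, flatten the struct.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
       .select("e.*")
)

# Append continuously to an Iceberg table (typically created up front),
# with checkpointing for fault-tolerant restarts.
query = (
    events.writeStream
    .format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/telemetry")
    .toTable("lake.security.telemetry_events")
)
query.awaitTermination()
```

An Iceberg sink is one common choice for this pattern because it offers atomic appends, schema evolution and snapshot-based time travel on plain object storage, which matters at telemetry volumes.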


This role is ideal for an engineer who excels at designing and building large-scale data platforms, approaches problems with a systems-architecture mindset and enjoys tackling the deep technical challenges of processing complex, high-volume datasets for enterprise-scale environments.


βœ… Must have requirements:

  • Strong software engineering background and mastery of Python
  • Proven experience designing, building and operating pipelines that process gigabytes to terabytes of data per day in production environments.
  • Extensive experience with complex systems architecture and high proficiency with Kafka and Spark
  • Deep expertise with AWS data services including S3, EMR and building data lakes at scale
  • Production experience with Apache Iceberg or similar lakehouse technologies (e.g. Delta, Hudi) and experience designing efficient, large-scale data storage systems (see the table-layout sketch after this list).
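To make the storage-design point concrete, a small hypothetical sketch, reusing the Iceberg-enabled Spark session from the ingestion example above; the table name, schema and partition transforms are illustrative only:

```python
from pyspark.sql import SparkSession

# Reuses the Iceberg-enabled session configured in the ingestion sketch above.
spark = SparkSession.builder.getOrCreate()

# Iceberg's hidden partitioning keeps large tables cheap to query and compact:
# daily partitions suit time-range scans; bucketing bounds file counts per host.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.security.findings (
        event_id    STRING,
        host        STRING,
        severity    STRING,
        observed_at TIMESTAMP
    )
    USING iceberg
    PARTITIONED BY (days(observed_at), bucket(16, host))
""")
```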


πŸ‘ Bonus points for:

  • Experience with Temporal.io for workflow orchestration (see the sketch after this list)
  • Familiarity with columnar file formats like Parquet and ORC, and table formats such as Hudi, for optimised data storage
  • Experience optimising relational databases (e.g. PostgreSQL, RDS)
  • Previous experience at big tech companies moving massive data volumes, as well as at rapidly scaling start-ups
  • Domain expertise in AI and/or Cyber Security, ideally with an understanding of vulnerability management
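As an illustration of the orchestration point, a minimal sketch using the open-source temporalio Python SDK; the workflow, activity and task-queue names are hypothetical:

```python
import asyncio
from datetime import timedelta

from temporalio import activity, workflow
from temporalio.client import Client
from temporalio.worker import Worker


@activity.defn
async def scan_assets(account_id: str) -> list[str]:
    # Hypothetical activity: enumerate cloud assets to scan for vulnerabilities.
    return [f"{account_id}/asset-1", f"{account_id}/asset-2"]


@workflow.defn
class VulnerabilityScanWorkflow:
    @workflow.run
    async def run(self, account_id: str) -> int:
        # Temporal gives the activity call retries and durable state for free.
        assets = await workflow.execute_activity(
            scan_assets,
            account_id,
            start_to_close_timeout=timedelta(minutes=5),
        )
        return len(assets)


async def main() -> None:
    # Connect to a local Temporal server and run a worker for this task queue.
    client = await Client.connect("localhost:7233")
    worker = Worker(
        client,
        task_queue="vuln-scans",
        workflows=[VulnerabilityScanWorkflow],
        activities=[scan_assets],
    )
    await worker.run()


if __name__ == "__main__":
    asyncio.run(main())
```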
