Staff/Principal Data Engineer - Cyber Security SaaS (Remote Europe)
Wave Talent is seeking a Staff/Principal Data Engineer for a fully remote role in Europe. You will architect and build large-scale data systems for a well-funded cyber security SaaS startup, focusing on AI-driven vulnerability detection. This role requires deep expertise in data pipelines, cloud services, and lakehouse technologies.
Job Description
- Job Title: Data Engineer (Staff/Principal)
- Salary: up to ~£135k / €155k
- Equity: up to 0.35% (valued at ~$450k)
- Location: fully remote anywhere in Europe (quarterly team get-togethers)
- Company: Cyber Security SaaS start-up building AI agents to identify and resolve cloud vulnerabilities
- Team: ~30
- Funding: $30m+ (Series A)
We're looking for a Staff/Principal-level Data Engineer to take ownership of the large-scale data systems that power AI-driven vulnerability and threat detection, at a well-funded start-up operating at the intersection of cyber security, AI and distributed systems.
Joining as one of the early team members, you'll architect and build production-grade data pipelines capable of handling massive data volumes with high performance, reliability and cost efficiency in mind. You'll play a critical part in designing the data lake architecture, real-time ingestion pipelines and transformation systems that enable our platform to process security and telemetry data at scale.
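For a concrete flavour of that work, here is a minimal sketch of such an ingestion path, assuming a Spark Structured Streaming job that reads telemetry from Kafka and appends to an Iceberg table on S3. The bucket, topic, catalog and schema names are illustrative only, and the job assumes the Iceberg Spark runtime is on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Spark session configured for an Iceberg catalog backed by S3
# (catalog name and warehouse path are illustrative).
spark = (
    SparkSession.builder
    .appName("telemetry-ingest")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

# Schema for raw telemetry events arriving on Kafka (hypothetical fields).
event_schema = StructType([
    StructField("asset_id", StringType()),
    StructField("finding", StringType()),
    StructField("observed_at", TimestampType()),
])

# Read the raw stream, parse the JSON payloads, and flatten to columns.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "telemetry.events")
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append continuously to an Iceberg table, checkpointing progress to S3.
query = (
    events.writeStream
    .format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/telemetry")
    .toTable("lake.security.telemetry_events")
)
query.awaitTermination()
```

The checkpoint location is what lets the stream restart after a failure without reprocessing offsets it has already committed.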
This role is ideal for an engineer who excels at designing and building large-scale data platforms, approaches problems with a systems-architecture mindset, and enjoys tackling the deep technical challenges of processing complex, high-volume datasets for enterprise-scale environments.
Must-have requirements:
- Strong software engineering background and mastery of Python
- Proven experience designing, building and operating pipelines that process gigabytes to terabytes of data per day in production environments
- Extensive experience with complex systems architecture and high proficiency with Kafka and Spark
- Deep expertise with AWS data services (e.g. S3, EMR) and experience building data lakes at scale
- Production experience with Apache Iceberg or similar lakehouse technologies (e.g. Delta, Hudi), plus experience designing efficient, large-scale data storage systems (see the partitioning sketch after this list)
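On that storage-design point, a brief sketch of what efficient large-scale layout tends to mean in Iceberg: hidden partitioning chosen to match the dominant query patterns. Table and column names are hypothetical, reusing the Spark session from the earlier sketch:

```python
# Hypothetical Iceberg table for vulnerability findings; the partition
# transforms are the design decision that keeps scans cheap at scale.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.security.findings (
        asset_id    STRING,
        severity    STRING,
        detail      STRING,
        observed_at TIMESTAMP
    )
    USING iceberg
    PARTITIONED BY (days(observed_at), severity)
""")
```

Because Iceberg's partitioning is hidden, queries that filter on observed_at prune files automatically, without callers needing to know the physical layout.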
Bonus points for:
- Experience with Temporal.io for workflow orchestration (a short sketch follows this list)
- Familiarity with columnar file formats such as Parquet or ORC for optimised data storage
- Experience optimising relational databases (e.g. PostgreSQL on RDS)
- Previous experience at big tech or other companies moving massive data volumes, as well as at rapidly scaling start-ups
- Domain expertise in AI and/or cyber security, ideally with an understanding of vulnerability management
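Since Temporal.io is called out above, here is a minimal sketch of workflow orchestration in its Python SDK, e.g. driving a recurring table-maintenance job; the activity body and all names are hypothetical:

```python
from datetime import timedelta

from temporalio import activity, workflow


@activity.defn
async def compact_table(table: str) -> str:
    # Hypothetical maintenance step, e.g. triggering Iceberg compaction.
    return f"compacted {table}"


@workflow.defn
class TableMaintenanceWorkflow:
    @workflow.run
    async def run(self, table: str) -> str:
        # Temporal durably records this step and retries it on failure.
        return await workflow.execute_activity(
            compact_table,
            table,
            start_to_close_timeout=timedelta(minutes=30),
        )
```

A worker process would register the workflow and activity against a task queue; Temporal then ensures the run survives worker restarts rather than silently dropping the job.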