Data Engineer for Systematic Trading

tribus, Hong Kong SAR
Visa Sponsorship & Relocation

Job Description


Data Engineer

Systematic Trading

Hong Kong (Relocation & Sponsorship available)


We’re working with a top-tier global systematic trading firm building out a central data platform used directly by multiple live trading desks.


This role sits right at the intersection of data engineering, cloud infrastructure, and quantitative research. You’ll be building and evolving large-scale data pipelines that ingest, process, and serve high-volume market and trading data under real production constraints.


What you’ll be doing

  • Designing and building lakehouse-style data platforms supporting research, risk, and live trading use cases
  • Developing Python-based pipelines for batch and streaming workloads
  • Working with Spark for large-scale processing and Kafka / Flink for real-time ingestion and stream processing
  • Modelling and optimising data for high-concurrency analytical access (OLAP)
  • Operating in a cloud-native AWS environment, with a strong focus on performance, data quality, and reliability
  • Partnering closely with quants and traders to deliver production-grade data solutions


What they’re looking for

  • 3+ years building large-scale data platforms in production
  • Strong Python and SQL (C++/Rust a plus, but not essential)
  • Hands-on experience with Spark and at least one streaming technology (Kafka and/or Flink)
  • Comfort working close to infrastructure and cloud services
  • Autonomous, structured problem-solver who enjoys high-impact work
  • Finance or trading exposure is helpful, but strong engineers from adjacent domains will be considered


This is a high-impact role with real ownership, strong engineering standards, and direct visibility into how data drives trading outcomes.


Please APPLY and I'll be in touch with more details.

