Job Description
Primary Title: Data Engineer
About The Opportunity
A remote, US-focused role within the cloud infrastructure and enterprise networking analytics sector. The team builds scalable data platforms that ingest, process, and analyze large volumes of telemetry and operational data to power observability, security, and performance use cases for enterprise customers.
We are hiring a hands-on Data Engineer who will design and operate data pipelines, integrate network telemetry sources, and build reliable analytics backends that support real-time and batch workloads.
Role & Responsibilities
- Design, implement, and maintain scalable ETL/streaming pipelines to ingest network telemetry and application logs (batch and real-time).
- Build and optimize data models, tables, and queries for analytics and ML feature stores to support observability and security use cases.
- Integrate message brokers and streaming systems (Kafka) and ensure high-throughput, low-latency processing with Spark or similar engines.
- Collaborate with network engineers and SREs to onboard telemetry sources (NetFlow/sFlow, SNMP, syslog) and ensure data quality and schema evolution.
- Implement CI/CD for data pipelines, automate deployments, and maintain monitoring, alerting, and cost controls in cloud environments.
- Troubleshoot production incidents, perform performance tuning, and document runbooks and data contracts.
Technical Skills Required
- Python
- SQL
- Apache Kafka
- Apache Spark
- AWS
- Linux
- NetFlow
- SNMP
- Terraform
Nice to Have
- 5+ years of professional experience building data pipelines, analytics platforms, or network telemetry solutions.
- Proven experience operating production streaming and batch systems in a cloud environment (AWS preferred).
- Bachelor's degree in Computer Science, Engineering, or equivalent practical experience.
Benefits & Perks
- Fully remote, US-based role with flexible hours and asynchronous collaboration.
- Competitive compensation, health benefits, and annual professional development budget.
- High-autonomy engineering culture focused on ownership, learning, and measurable impact.