Senior Cloud Data Architect

Frederick Fox, United States
Remote
AI Summary

Frederick Fox is hiring a Senior Cloud Data Architect to design and build cloud-native data platforms for complex healthcare data environments. This role requires hands-on ownership of data pipelines at scale and expertise in Apache Airflow. The ideal candidate will have strong Python and SQL skills and experience with AWS.

Key Highlights
Architect and operate scalable data platforms and ETL/ELT pipelines in AWS
Own Apache Airflow end-to-end: DAG design, orchestration, deployment, monitoring, and troubleshooting
Build and optimize pipelines using Python, SQL, and cloud data stores
Ensure data quality, performance, security, and regulatory compliance (HIPAA, CMS)
Technical Skills Required
Apache Airflow, Python, SQL, AWS
Benefits & Perks
Unlimited PTO
Fully remote work
Nice to Have
Background in healthcare, insurance, or TPA data (EDI is a strong plus)

Job Description


Senior Cloud Data Architect (fully remote, unlimited PTO)

This role requires direct, end-to-end production ownership of Apache Airflow. Experience limited to writing DAGs within a managed or lightly used environment will not be a fit.


We’re hiring a Senior Cloud Data Architect to design, build, and operate cloud-native data platforms supporting complex healthcare data environments. This is a hands-on senior role for someone who has personally owned data pipelines at scale.


What you’ll do

  • Architect and operate scalable data platforms and ETL/ELT pipelines in AWS
  • Own Apache Airflow end-to-end: DAG design, orchestration, deployment, monitoring, and troubleshooting
  • Build and optimize pipelines using Python, SQL, and cloud data stores
  • Ensure data quality, performance, security, and regulatory compliance (HIPAA, CMS)
  • Work hands-on with claims, eligibility, EDI, and healthcare data integrations


What we’re looking for

  • 10+ years in data engineering, cloud engineering, or data architecture
  • Expert-level, hands-on Airflow ownership in production environments
  • Strong Python and SQL skills
  • Deep experience with AWS (data services, infrastructure, security)
  • Background in healthcare, insurance, or TPA data (EDI is a strong plus)

