Job Description
InnoVet Health, a small and growing business that provides health IT professional services to the Department of Veterans Affairs (VA), is looking for an experienced Sr. Azure Data Engineer proficient in Databricks, Delta Lake, and medallion/lakehouse architectures to support healthcare analytics, interoperability, and data integration needs. You will build scalable data pipelines, ensure data quality, and support advanced analytics across clinical, operational, and regulatory datasets. Your work will directly impact VA healthcare delivery by building an analytical infrastructure that enables AI/ML and advanced analytics to deliver short-term and longitudinal insights to healthcare professionals serving Veterans. You will work in an agile environment alongside VA and contractor stakeholders. The position is full-time, flexible, and fully remote (no relocation required). Pay, benefits, and growth potential are competitive.
Responsibilities
- Gather and translate business, technical, and functional requirements into data architecture and pipeline design decisions.
- Design and develop Azure Data Factory and Databricks-based ETL/ELT pipelines using PySpark, Delta Lake, and the medallion/lakehouse architecture.
- Ingest and transform healthcare data (clinical, claims, FHIR, HL7, EHR, ADT, PGHD) from diverse sources.
- Build secure, scalable solutions using Azure Data Lake Storage, ADF, Event Hubs, and related services, with attention to latency and reliability requirements.
- Implement data quality, lineage, and governance using Microsoft Purview.
- Optimize Databricks jobs (performance tuning, cluster sizing, Z-ordering, partitioning).
- Enforce HIPAA-aligned security practices: RBAC, Key Vault, private endpoints, PHI protection.
- Collaborate with data scientists, analysts, and clinical informatics teams.
- Stay up to date with emerging technologies and trends in data engineering and healthcare data management.
- Present and discuss results with IT and business stakeholders.
- Participate in company growth and other responsibilities, as assigned.
Qualifications
- Bachelor’s or Master’s degree in computer science, data analytics, or a related field.
- 6+ years of data engineering experience, including 4+ years hands-on with Azure and 2+ years hands-on with Databricks.
- Strong skills in PySpark, Delta Lake, SQL, and distributed data processing.
- Experience with healthcare data standards (FHIR, HL7, X12/EDI, CCD, claims data, PGHD).
- Strong understanding of HIPAA, PHI handling, and secure data architecture.
- Experience with ADF, ADLS Gen2, Azure Functions, and event-driven ingestion.
- Strong understanding of data modeling for analytics (dimensional + lakehouse).
- Excellent problem-solving, collaboration, and communication skills.
- U.S. citizenship or permanent residency (green card) required due to government contract work.
- No 1099, corp-to-corp, international outsourcing, or staffing agency arrangements.
Preferred
- Experience with Federal EHR (VistA and Oracle Health) data.
- Experience with Azure Event Hubs, Stream Analytics, AWS Kinesis, or similar data streaming platforms.