Design and implement robust Data Platform solutions for a healthcare data-focused enterprise. Collaborate with cross-functional agile teams to drive data architecture decisions and implement best practices. Develop and maintain data models, ensuring they align with business objectives and data privacy regulations.
Must-have:
- 7+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products
- Expert-level experience working in Databricks and AWS
- Advanced-level experience working with both relational and non-relational databases such as SQL Server and PostgreSQL
- Experience building and managing solutions on AWS
- Advanced experience building and deploying infrastructure as code (IaC) using Terraform, Asset Bundles, and GitHub
- Advanced experience building data models and data warehouses and designing data lakes for enterprise (and product) use
- Familiarity with designing and building APIs, ETL, and data ingestion processes, and with the tools used to support enterprise solutions
- Experience in performance tuning, query optimization, security, monitoring, and release management.
- Experience working with and managing large, disparate, identified, and de-identified data sets from multiple data sources
Plus:
- Bachelor's or master's degree in computer science, data engineering, or a related field
- Experience managing and standardizing clinical data from structured and unstructured sources
- Health and Life Insurance business experience
- Knowledge of healthcare standards including FHIR, C-CDA, and traditional HL7
- Knowledge of clinical standards/ontologies including ICD-10, SNOMED, NDC, LOINC, and RxNorm
- Associate- or Professional-level solution architecture certification in Azure and/or AWS
- Experience in Snowflake
- Experience in Spark
- Experience with Salesforce Integration
Day-to-Day:
Insight Global is seeking a Sr. Data Engineer to join our actuarial consulting client 100% remotely. As a Sr. Data Engineer on the client's Data Platform, you will be responsible for designing and implementing robust Data Platform solutions that meet business objectives while ensuring compliance with industry-leading data privacy standards. You will collaborate closely with cross-functional agile teams to drive data architecture decisions, implement best practices, and contribute to the success of our client's projects. Responsibilities include:
- Data Platform: Creating Databricks data warehouse and lakehouse solutions for a healthcare-data-focused enterprise
- Data Governance: Configuring and maintaining Unity Catalog to enable enterprise data lineage and data quality
- Data Security: Building out Data Security protocols and best practices including the management of identified and de-identified (PHI/PII) solutions
- External Data Products: Building data solutions for clients while upholding the best standards for reliability, quality, and performance
- ETL: Building solutions with Delta Live Tables and automating transformations (see the illustrative sketch after this list)
- Medallion Architecture: Building out performant enterprise-level medallion architecture(s)
- Streaming and Batch Processing: Building fit-for-purpose near real-time streaming and batch solutions
- Large Data Management: Building out performant and efficient enterprise solutions for internal and external users for both structured and unstructured healthcare data
- Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles
- Costs: Working with the business to build cost-effective and cost-transparent data solutions
- Pipeline/ETL Management: You will help architect, build, and maintain robust and scalable data pipelines, monitoring and optimizing their performance
- Experience working with migration tools (e.g., AWS DMS, AWS Glue, Fivetran, integrate.io)
- Identify and implement improvements to enhance data processing efficiency
- Experience with building out effective pipeline monitoring solutions
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based ‘big data’ technologies.
- Data Modeling: Lead the design, implementation, and maintenance of standards-based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Develop and maintain data models, ensuring they align with business objectives and data privacy regulations
- Collaboration: Partner internally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
- Effectively work with Data, Development, Analyst, Data Science, and Business team members to gather requirements and to propose and build solutions.
- Communicate complex technical concepts to non-technical stakeholders and provide guidance on best practices.
- Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems
- Processes and Tools: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics.
- Create data tools for clinical, analytics, and data science team members that help them build and optimize our product into an innovative industry leader.
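To give candidates a concrete feel for the ETL and Medallion Architecture items above, here is a minimal, hypothetical sketch using the Delta Live Tables Python API with Auto Loader ingestion. The table names, source path, and columns (claim_id, received_at) are illustrative placeholders, not details of the client's actual pipeline.

```python
# Hypothetical Delta Live Tables pipeline sketch (bronze -> silver -> gold).
# `spark` is provided by the Delta Live Tables runtime; names and paths are placeholders.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw claim records ingested incrementally with Auto Loader.")
def bronze_claims():
    return (
        spark.readStream.format("cloudFiles")   # Auto Loader
        .option("cloudFiles.format", "json")
        .load("/Volumes/raw/claims/")           # placeholder landing path
    )

@dlt.table(comment="Silver: de-duplicated claims with a basic quality expectation.")
@dlt.expect_or_drop("valid_claim_id", "claim_id IS NOT NULL")
def silver_claims():
    return dlt.read_stream("bronze_claims").dropDuplicates(["claim_id"])

@dlt.table(comment="Gold: daily claim counts for downstream analytics.")
def gold_daily_claim_counts():
    return (
        dlt.read("silver_claims")
        .groupBy(F.to_date("received_at").alias("claim_date"))
        .agg(F.count("*").alias("claim_count"))
    )
```

In practice, each layer would also carry the data lineage, quality expectations, and access controls described under the Data Governance and Data Security responsibilities above, managed through Unity Catalog.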