Senior Data Engineer - Property & Casualty Insurance

James Search Group, United States
Remote
AI Summary

Design, build, and maintain upstream components of a modern data platform. Develop scalable data pipelines, implement data ingestion frameworks, and ensure data quality and governance. Collaborate with analytics engineers and business partners to deliver clean, structured upstream data.

Key Highlights
Design, build, and optimize scalable data pipelines for batch and real-time processing
Implement data ingestion frameworks including CDC from core systems, APIs, and third-party platforms
Develop and optimize Apache Spark jobs on Databricks, leveraging Delta Lake, DLT pipelines, and lakehouse architectures
Technical Skills Required
Python, SQL, Apache Spark, Databricks, Delta Lake, Duck Creek, Salesforce, Workday, AWS services (S3, Glue, Lambda), DevOps tools (GitHub, CI/CD)
Benefits & Perks
Competitive base salary ($130,000 – $175,000)
Performance-based bonus
Comprehensive benefits package
401(k) with company match
Generous PTO and wellness programs

Job Description


Senior Data Engineer (P&C Insurance)

Fully Remote | Full-Time

Base Salary: $130,000 – $175,000


Partnering with a Leading P&C Insurance Carrier

James Search Group is proud to partner with a highly rated Property & Casualty insurance carrier that is making significant investments in data engineering, analytics, and machine learning. The company is building a centralized Data & ML/AI organization that unites experts across data architecture, engineering, analytics, governance, and modeling—creating a unique opportunity for growth, collaboration, and innovation.


We are seeking a Senior Data Engineer to design, build, and maintain the upstream components of a modern data platform. From ingestion and real-time streaming to data quality frameworks, you’ll play a pivotal role in shaping the technical foundation of the company’s next-generation data environment. This is a hands-on role where you’ll be coding daily, mentoring peers, and collaborating closely with analytics engineers, data scientists, and business partners.


Office Locations (Optional Hybrid):

This is a remote-first position, but you can also work from one of the carrier’s multiple U.S. office locations.


What You’ll Do:

  • Design, build, and optimize scalable data pipelines for batch and real-time processing.
  • Implement data ingestion frameworks including CDC from core systems, APIs, and third-party platforms (Salesforce, Workday, Duck Creek, etc.).
  • Develop and optimize Apache Spark jobs on Databricks, leveraging Delta Lake, DLT pipelines, and lakehouse architectures.
  • Ensure data quality, lineage, and governance using Unity Catalog, CI/CD, and role-based access/security controls.
  • Partner with analytics engineers (dbt) to deliver clean, structured upstream data.
  • Mentor peers, contribute to architecture decisions, and foster a culture of craftsmanship.
  • Leverage AWS services (S3, Glue, Lambda, etc.) and DevOps tools (GitHub, CI/CD) for scalable, production-grade deployments.


What You Bring:

  • 5+ years of professional experience in data engineering, ideally within insurance or financial services.
  • Strong proficiency in Python, SQL, and Spark for building and optimizing pipelines.
  • Hands-on expertise with Databricks (Unity Catalog, Delta Lake, DLT pipelines) and/or Azure Data Services.
  • Strong knowledge of AWS data services, with the ability to adapt across cloud platforms (Azure, GCP).
  • Familiarity with modern data architectures (medallion, lakehouse, streaming).
  • Experience with GitHub, CI/CD pipelines, and testing frameworks.
  • A problem-solving mindset: balancing pragmatism with scalability, and a passion for working in collaborative teams.


What’s In It for You:

  • Competitive base salary ($130K–$175K)
  • Performance-based bonus
  • Comprehensive benefits package
  • 401(k) with company match
  • Generous PTO and wellness programs


This is a rare opportunity to join a ground-floor data transformation at a forward-thinking P&C insurer, working with the latest tools and approaches to build something lasting.


If you’re excited to design and deliver production-grade data systems that directly empower analytics, AI, and business outcomes, we’d love to connect.

