Design, build, and maintain the upstream components of a modern data platform for a leading P&C insurance carrier. The position involves coding daily, mentoring peers, and collaborating with analytics engineers and business partners. Key requirements include strong proficiency in Python, SQL, and Spark, with hands-on expertise in Databricks and/or Azure Data Services.
Senior Data Engineer (P&C Insurance)
Fully Remote | Full-Time
Base Salary: $120,000 – $170,000
Partnering with a Leading P&C Insurance Carrier
James Search Group is proud to partner with a highly rated Property & Casualty insurance carrier that is making significant investments in data engineering, analytics, and machine learning. The company is building a centralized Data & ML/AI organization that unites experts across data architecture, engineering, analytics, governance, and modeling—creating a unique opportunity for growth, collaboration, and innovation.
We are seeking a Senior Data Engineer to design, build, and maintain the upstream components of a modern data platform. From ingestion and real-time streaming to data quality frameworks, you’ll play a pivotal role in shaping the technical foundation of the company’s next-generation data environment. This is a hands-on role where you’ll be coding daily, mentoring peers, and collaborating closely with analytics engineers, data scientists, and business partners.
Office Locations (Optional Hybrid):
This is a remote-first position, but you can also work from one of the carrier’s multiple U.S. office locations.
What You’ll Do:
- Design, build, and optimize scalable data pipelines for batch and real-time processing.
- Implement data ingestion frameworks including CDC from core systems, APIs, and third-party platforms (Salesforce, Workday, Duck Creek, etc.).
- Develop and optimize Apache Spark jobs on Databricks, leveraging Delta Lake, DLT pipelines, and lakehouse architectures.
- Ensure data quality, lineage, and governance using Unity Catalog, CI/CD, and role-based access/security controls.
- Partner with analytics engineers (dbt) to deliver clean, structured upstream data.
- Mentor peers, contribute to architecture decisions, and foster a culture of craftsmanship.
- Leverage AWS services (S3, Glue, Lambda, etc.) and DevOps tools (GitHub, CI/CD) for scalable, production-grade deployments.
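To make the CDC ingestion bullet above concrete: a minimal sketch of the change-data-capture merge pattern in plain Python (purely illustrative; in the actual role this logic would typically be a Delta Lake MERGE inside a Databricks/Spark job, and the field names here are hypothetical):

```python
# Minimal change-data-capture (CDC) merge: apply insert/update/delete
# events to a keyed target table. Illustrative only -- production CDC
# would run as a Spark/Delta Lake MERGE, not in-memory Python.

def apply_cdc(target: dict, changes: list) -> dict:
    """Apply a batch of change events to a target keyed by record id."""
    for event in changes:
        key, op = event["id"], event["op"]
        if op in ("insert", "update"):
            target[key] = event["data"]   # upsert the new row image
        elif op == "delete":
            target.pop(key, None)         # idempotent delete
    return target

table = {1: {"policy": "AUTO-100", "premium": 950}}
events = [
    {"id": 1, "op": "update", "data": {"policy": "AUTO-100", "premium": 990}},
    {"id": 2, "op": "insert", "data": {"policy": "HOME-200", "premium": 1200}},
    {"id": 3, "op": "delete", "data": None},
]
result = apply_cdc(table, events)
```

The same upsert/delete semantics carry over to Delta Lake's `MERGE INTO`, just expressed declaratively over tables instead of dicts.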
What You Bring:
- 3-5+ years of professional experience in data engineering, ideally within insurance or financial services.
- Strong proficiency in Python, SQL, and Spark for building and optimizing pipelines.
- Hands-on expertise with Databricks (Unity Catalog, Delta Lake, DLT pipelines) and/or Azure Data Services.
- Strong knowledge of AWS data services, with the ability to adapt across cloud platforms (Azure, GCP).
- Familiarity with modern data architectures (medallion, lakehouse, streaming).
- Experience with GitHub, CI/CD pipelines, and testing frameworks.
- A problem-solving mindset that balances pragmatism with scalability, and a passion for collaborative teamwork.
What’s In It for You:
- Competitive base salary ($120K–$170K)
- Performance-based bonus
- Comprehensive benefits package
- 401(k) with company match
- Generous PTO and wellness programs
This is a rare opportunity to join a ground-floor data transformation at a forward-thinking P&C insurer, working with the latest tools and approaches to build something lasting.
If you’re excited to design and deliver production-grade data systems that directly empower analytics, AI, and business outcomes—we’d love to connect.