We are looking for a Data Engineer and System Lead to build and evolve reliable, scalable data products that enable high-impact insights, accelerate decision-making, and improve patient outcomes. The role involves designing, building, and maintaining end-to-end data pipelines and integrations, and ensuring operational excellence of the platform and its data products. The ideal candidate has a strong engineering and ownership mindset, excellent communication skills, and experience in data engineering, analytics engineering, or a similar role.
Job Description
We are looking for a "Data Engineer and System Lead" for the Medicine & Analytics, Quality & Regulatory department of our pharmaceutical client.
As a Data Engineer, you will build and evolve reliable, scalable data products that enable high‑impact insights, accelerate decision‑making, and improve patient outcomes.
In addition, you will ensure operational excellence of the platform and data products by safeguarding system stability, coordinating releases, and deploying changes across environments (DEV/QA/PROD). With a strong engineering and ownership mindset, you’ll drive change, embrace continuous improvement, and thrive in a fast‑moving environment where priorities evolve day by day.
Tasks & Responsibilities 🚀
Data Engineering & Delivery
- Design, build, and maintain end‑to‑end data pipelines and integrations to support Medicine & Analytics use cases (batch and event‑driven patterns where relevant).
- Develop, operate, and optimize integrations using SnapLogic and AWS services such as S3, AWS Lambda, and AWS Glue to ensure robust ingestion, transformation, and orchestration.
- Implement and maintain analytics‑ready data models in Snowflake, ensuring performance, scalability, and cost‑efficient design.
- Build transformation logic and analytics layers using dbt, including modular modeling, testing, documentation, and deployment best practices.
- Contribute to and enforce data governance standards by leveraging tools such as Collibra, ensuring metadata quality, lineage, ownership, and consistent definitions.
- Partner with Data Quality stakeholders to implement and monitor quality controls using Ataccama, including rules, profiling, exception handling, and remediation workflows.
- Support data lifecycle processes and operationalization of data products using Innovator (as applicable in the ecosystem) to align delivery with platform and product standards.
- Proactively identify opportunities to simplify architecture, automate repetitive work, and reduce operational effort (observability, alerting, self‑healing patterns).
Operations & Release Management
- Own day‑to‑day system stability for Quality & Regulatory data products and integrations, ensuring availability, performance, and reliability in production.
- Plan, coordinate, and execute deployments and releases across environments (DEV → QA → PROD), including cutover activities, release notes, and stakeholder communication.
- Define and perform end‑to‑end validation and testing in QA (functional, regression, data quality, and reconciliation checks) and ensure readiness for production promotion.
- Manage change control activities in alignment with regulated expectations (e.g., documentation, traceability, approvals, segregation of duties, and audit readiness).
- Lead incident and problem management for the domain.
Technical Skills Required
- Degree in Computer Science, Engineering, Data/Information Systems, or a related field, with several years of relevant experience in data engineering, analytics engineering, or similar roles.
- Hands‑on experience building integrations and pipelines using tools such as SnapLogic (or comparable iPaaS) and cloud services — specifically AWS S3, Lambda, and Glue.
- Strong experience with Snowflake including data modeling, performance tuning, and secure data access patterns.
- Proven experience with dbt (models, tests, macros, documentation, environments, CI/CD integration).
- Familiarity with data governance and metadata management, ideally with Collibra; understanding of lineage, stewardship, and data catalog practices.
- Experience implementing data quality controls and monitoring, ideally with Ataccama (or equivalent tooling and approaches).
- Understanding of regulated environments and quality systems (e.g., GxP principles), including documentation, validation mindset, and audit readiness.
- Solid knowledge of software engineering fundamentals: Python/SQL, Git, coding standards, automated testing, and production support practices.
- Excellent communication skills in English (Spanish is a strong plus), enabling clear interaction with technical and non‑technical stakeholders.
- 3+ years of experience in a similar role.
Benefits & Perks
- Flexible working hours: 08:00/09:00 to 17:00/18:00, Monday to Friday.
- 100% remote with occasional visits to the office.
- Salary package based on your profile.
- Permanent Contract
- Learning & Development