Data Engineer

Marcura, United Kingdom
Remote

Job Description


About Marcura:

Marcura is a global leader in maritime technology and operations, supporting nearly one‑third of the world's seaborne commodity trade. Our trusted platforms, which span software, data intelligence and payments, sit at the centre of digital transformation across the maritime industry. We are now seeking a Data Engineer to join our high‑impact team and contribute to the success of one of the sector's most forward‑looking organisations.

About the Role:

You will bring domain expertise in data engineering to the team, including the ETL process using modern tools and methodologies. You will play a key role in building scalable data structures, with a specific focus on implementing Data Vault 2.0 to ensure a flexible and auditable data foundation.

Roles and Responsibilities:

1. Data engineering best practices

  • You will contribute to the data team's ability to adhere to data engineering best practices across pipeline design, data quality monitoring, storage, versioning, security, testing, documentation, cost, and error handling.

2. Data transformation in DBT

  • Ensure that the daily DBT build is successful, including full test coverage of existing models.
  • Create new data models in collaboration with the data analysts, utilizing Data Vault 2.0 principles where appropriate to handle complex data relationships and historical tracking.
  • Add new tests to enhance data quality and maintain the integrity of the data warehouse.
  • Incorporate new data sources into the warehouse architecture.

3. Data extraction

  • Develop and maintain our data pipelines in Stitch, Fivetran, Segment, and Apache Airflow (Google Cloud Composer).
  • Evaluate when it's appropriate to use managed tools versus building custom data pipelines in Cloud Composer.
  • Ensure that data extraction jobs run successfully daily.
  • Collaborate with engineers from MarTrust to add new data sets to our data extraction jobs.
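As a loose illustration of the daily-run monitoring described above (the function and source names here are hypothetical; in practice this logic would live in Airflow/Cloud Composer alerting), a freshness check over extraction jobs might be sketched as:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical rule: each source must have loaded within the last 24 hours
# for the daily extraction jobs to be considered healthy.
FRESHNESS_WINDOW = timedelta(hours=24)

def stale_sources(last_loaded, now=None):
    """Return the names of sources whose last successful load is too old.

    last_loaded: dict mapping source name -> datetime of last successful load.
    """
    now = now or datetime.now(timezone.utc)
    return sorted(
        name for name, loaded_at in last_loaded.items()
        if now - loaded_at > FRESHNESS_WINDOW
    )

# Example: one healthy source, one stale one (illustrative names only).
now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
loads = {
    "stitch_orders": datetime(2024, 1, 2, 3, 0, tzinfo=timezone.utc),
    "fivetran_crm": datetime(2023, 12, 30, 3, 0, tzinfo=timezone.utc),
}
print(stale_sources(loads, now=now))  # ['fivetran_crm']
```

A check like this is tool-agnostic: the same rule applies whether a source is loaded by Stitch, Fivetran, Segment, or a custom Cloud Composer DAG.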

4. Data warehousing in BigQuery

  • Ensure that the data in our data warehouse is kept secure and that daily jobs in BigQuery run successfully.
  • Support the evolution of our BigQuery schema to accommodate Data Vault 2.0 structures (Hubs, Links, and Satellites) for long-term scalability.
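Data Vault 2.0 Hubs, Links, and Satellites are conventionally keyed by a deterministic hash of the normalised business key(s). A minimal sketch of that convention (MD5 is the common choice in Data Vault 2.0; the business-key values below are purely illustrative):

```python
import hashlib

def hash_key(*business_keys):
    """Data Vault 2.0-style hash key: trim and upper-case each business key,
    join with a delimiter, then hash. Deterministic, so the same business
    key always lands on the same Hub row."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# A Hub row carries the hash key plus the raw business key;
# a Link hashes the combination of its parent Hubs' business keys.
vessel_hk = hash_key("IMO9321483")
voyage_link_hk = hash_key("IMO9321483", "VOY-2024-001")

# Normalisation makes the key stable across source-system formatting quirks.
print(vessel_hk == hash_key(" imo9321483 "))  # True
```

Because the key is computed, not looked up, loads into Hubs, Links, and Satellites can run independently and in parallel, which is one of the main scalability arguments for the pattern.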

5. Data Governance and Security

  • Data Quality (DQ): Implement and monitor automated data quality checks and observability to ensure the accuracy and reliability of downstream reporting.
  • Access Control: Manage and enforce granular access control policies (IAM) within BigQuery and GCP to ensure data is only accessible to authorized users.
  • Governance: Ensure all data processes comply with security standards and data privacy regulations, maintaining clear documentation of lineage and metadata.
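As an illustrative sketch of the kind of automated data quality check described above, independent of any particular DQ framework (function and column names are hypothetical):

```python
def dq_report(rows, required, unique_key):
    """Two basic data quality checks over a batch of rows:
    count rows missing any required field, and count duplicate keys."""
    missing = sum(
        1 for row in rows
        if any(row.get(col) in (None, "") for col in required)
    )
    seen, duplicates = set(), 0
    for row in rows:
        key = row.get(unique_key)
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {"rows": len(rows), "missing_required": missing, "duplicate_keys": duplicates}

rows = [
    {"id": 1, "port": "Rotterdam"},
    {"id": 2, "port": ""},
    {"id": 2, "port": "Singapore"},
]
print(dq_report(rows, required=["port"], unique_key="id"))
# {'rows': 3, 'missing_required': 1, 'duplicate_keys': 1}
```

In a warehouse setting the equivalent checks would typically run as DBT tests (`not_null`, `unique`) or as observability rules, with the resulting counts feeding alerting rather than being printed.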

Requirements


  • Data Modeling: Solid understanding and hands-on experience with Data Vault 2.0 methodologies.
  • GCP Infrastructure: Experience with Google BigQuery and Cloud Composer (Apache Airflow).
  • Modern Data Stack: Proficiency in DBT for data transformation and data quality testing.
  • Governance & Security: Practical experience managing data access controls, security best practices, and DQ frameworks.
  • Pipeline Tools: Experience with managed ELT services like Fivetran, Stitch, or Segment.
  • Remote Work: Ability to work effectively in a fully remote, distributed team environment.

Benefits

Competitive Salary and Bonus: We reward your expertise and contributions.

Inclusive Onboarding Experience: Our onboarding program is designed to set you up for success right from day one.

Marcura Wellness Zone: We value your work-life balance and well-being.

Global Opportunities: Be part of an ambitious, expanding company with a local touch.

Diverse, Supportive Work Culture: We're committed to inclusion, diversity, and a sense of belonging for all team members.

