Senior Data Engineer

BrainRocket Malta
Job Description


❗️IMPORTANT: This position is office-based at our office in Birkirkara, Malta.

❗️We can provide relocation assistance if you are moving from outside the city or country.


We are seeking a highly skilled Data Engineer with expertise in designing, managing, and optimizing data pipelines using Apache Airflow, Snowflake, and Apache Kafka.

This individual will play a pivotal role in architecting robust, scalable, and efficient data solutions, ensuring the integrity, reliability, and accessibility of our data infrastructure.


Responsibilities:

  • Develop and implement data models to support business requirements, optimizing for performance and scalability;
  • Design, build, and maintain scalable data pipelines using Apache Airflow;
  • Implement and maintain Kafka-based streaming data pipelines for real-time data processing and integration with various systems;
  • Integrate with third-party databases and APIs;
  • Establish monitoring, alerting, and maintenance procedures to ensure the health and reliability of data pipelines;
  • Collaborate with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements.
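For candidates curious what "pipelines as DAGs" means in practice: Airflow expresses a pipeline as a directed acyclic graph of dependent tasks. As a standard-library-only sketch of that ordering idea (the task names below are invented for illustration, not part of this role's actual stack):

```python
# Minimal sketch of DAG-style task ordering using only the standard
# library (no Airflow dependency). Task names are hypothetical.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on.
pipeline = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_joined": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_joined"},
}

# A scheduler runs tasks in an order where every dependency
# finishes before its dependents start.
run_order = list(TopologicalSorter(pipeline).static_order())
print(run_order)  # extracts first, then transform, then load
```

Airflow adds scheduling, retries, and monitoring on top of exactly this dependency model.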


Requirements:

  • Proficiency in Python and SQL, with experience in data manipulation and transformation;
  • Solid knowledge of data warehousing and data modelling techniques;
  • Experience in designing, building, and maintaining complex data pipelines using Airflow;
  • Proven track record in data engineering roles, with a focus on designing and implementing scalable data solutions using Snowflake or Redshift;
  • In-depth understanding and practical experience in implementing Kafka-based streaming architectures for real-time data processing.
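To make the first requirement concrete, here is a small illustration of SQL-driven transformation from Python, using the standard library's sqlite3 module. The schema and data are invented purely for the example; production work in this role would target Snowflake at far larger scale.

```python
# Toy SQL transformation from Python (stdlib sqlite3; invented schema).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# Aggregate per user -- the kind of modelling step a warehouse
# pipeline performs on real event data.
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM events "
    "GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [(1, 15.0), (2, 7.5)]
```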


We offer excellent benefits, including but not limited to:

✔️24 days of vacation per year;

✔️Private Health Insurance;

✔️Daily Breakfasts and Friday lunches in the office;

✔️Snacks in the Office;

✔️Transport Allowance of up to 150 EUR, paid quarterly (public transport or taxi);

✔️Gym Subsidies for office building gym;

✔️Tennis or padel training sessions reimbursement, up to 100 EUR monthly;

✔️Discount for Premium subscription for Cloudigo;

✔️Discounts at partners of the company;

✔️Annual salary reviews;

✔️Birthday gifts;

✔️Option for parking;

✔️Regular office & team building events.


Bold moves start here. Make yours. Apply today!

