Senior Data Engineer - Apache Airflow, Snowflake, and Apache Kafka

BrainRocket Spain
Visa sponsorship & relocation

Job Description


❗️Please note that this role is office-based in Valencia, Spain (Carrer de Catarroja, 13, 46940 Manises).

❗️We can provide relocation assistance if you are based outside the city or country.


We are seeking a highly skilled Data Engineer with expertise in designing, managing, and optimizing data pipelines using Apache Airflow, Snowflake, and Apache Kafka.

This individual will play a pivotal role in architecting robust, scalable, and efficient data solutions, ensuring the integrity, reliability, and accessibility of our data infrastructure.


Responsibilities:

  • Develop and implement data models to support business requirements, optimizing for performance and scalability;
  • Design, build, and maintain scalable data pipelines using Apache Airflow (a minimal sketch of such a pipeline follows this list);
  • Implement and maintain Kafka-based streaming data pipelines for real-time data processing and integration with various systems;
  • Integrate with third-party databases and APIs;
  • Establish monitoring, alerting, and maintenance procedures to ensure the health and reliability of data pipelines;
  • Collaborate with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements.
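
To make the Airflow responsibility above concrete, here is a minimal sketch of the kind of scheduled pipeline the role involves. It is illustrative only, not an actual BrainRocket pipeline: the DAG id, task bodies, and the Snowflake target are hypothetical, and it assumes Airflow 2.4+ with the classic PythonOperator style.

    # Minimal sketch of a daily extract-and-load pipeline (hypothetical names).
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_orders():
        # Placeholder: pull raw records from a third-party API.
        return [{"order_id": 1, "amount": 42.0}]


    def load_to_snowflake(**context):
        # Placeholder: read the upstream output from XCom; real code would
        # load it into Snowflake via a hook or connector.
        rows = context["ti"].xcom_pull(task_ids="extract_orders")
        print(f"would load {len(rows)} rows")


    with DAG(
        dag_id="orders_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={
            "retries": 2,                        # simple reliability baseline
            "retry_delay": timedelta(minutes=5),
        },
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
        extract >> load                          # extract runs before load

The retry defaults are one simple hook for the monitoring-and-reliability point above; a production DAG would typically add failure callbacks and alerting on top.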


Requirements:

  • Proficiency in Python and SQL, and experience with data manipulation and transformation;
  • Solid knowledge of data warehousing and data modelling techniques;
  • Experience in designing, building, and maintaining complex data pipelines using Airflow;
  • Proven track record in data engineering roles, with a focus on designing and implementing scalable data solutions using Snowflake or Redshift;
  • In-depth understanding and practical experience in implementing Kafka-based streaming architectures for real-time data processing (a brief consumer sketch follows this list).
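
For the Kafka requirement above, a consumer loop like the one below is representative. It is a minimal sketch only: the broker address, topic, consumer group, and process() handler are hypothetical, and it assumes the confluent-kafka Python client, one common choice for this kind of work.

    # Minimal sketch of a Kafka consumer for real-time ingestion (hypothetical names).
    from confluent_kafka import Consumer


    def process(payload: bytes) -> None:
        # Placeholder for transforming the record and loading it downstream.
        print(payload.decode("utf-8", errors="replace"))


    consumer = Consumer({
        "bootstrap.servers": "broker:9092",   # hypothetical broker
        "group.id": "ingest-events",          # hypothetical consumer group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["events"])            # hypothetical topic

    try:
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue                      # nothing arrived within the timeout
            if msg.error():
                print(f"Kafka error: {msg.error()}")  # surface to monitoring
                continue
            process(msg.value())              # hand raw bytes to downstream logic
    finally:
        consumer.close()                      # commit offsets and leave the group

With default auto-commit, this loop gives at-least-once processing; exactly-once delivery into a warehouse needs idempotent writes or explicit offset management on top.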


We offer excellent benefits, including but not limited to:

🏥 Six additional days of undocumented sick leave;

🏥 Medical Insurance;

🥳 Birthdays, milestones and employee anniversaries celebrations;

🏢 Modern offices with snacks and all the essentials;

🎉 Social Club and more than 50 events per year;

🍳 Partial coverage of breakfasts and lunches;

💻 Learning and development opportunities and interesting, challenging tasks;

✈️ Relocation package (tickets, staying in a hotel for up to 2 weeks, and visa relocation support for our employees and their family members);

📚 Opportunity to develop language skills, with partial compensation for the cost of English classes;

📈 Competitive remuneration level with annual review;

🤝 Teambuilding activities.


Bold moves start here. Make yours. Apply today!

