🔹 Data Architect (Azure, Databricks, Data Platform & Products) 🔹
🌍 We welcome international candidates based in LATAM who are open to relocating to Barcelona
📌 About the role
We are looking for a Data Architect to join a modern, cloud-based data platform team within an international environment.
In this role, you will contribute to the design, evolution, and implementation of a scalable data platform on Azure, while also working on data products aligned with business domains.
You will operate at the intersection of data engineering and architecture, translating business needs into robust, scalable, and well-governed data solutions. Depending on your experience, you will either drive architectural decisions end-to-end or contribute to the design and implementation of data products within a broader architecture.
You will collaborate with cross-functional teams (Data, Product, Business) and play a key role in shaping how data is structured, governed, and consumed across the organization.
If you enjoy working with modern data platforms, designing data models, and building impactful data solutions, this could be a great fit.
💻 What you'll do
🔹 Design and evolve data architecture and data models within a Lakehouse environment (Azure + Databricks)
🔹 Translate business requirements into scalable data products and domain-oriented models
🔹 Work with Medallion Architecture (Bronze, Silver, Gold) and modern data modeling patterns (Kimball, SCDs, hierarchical data)
🔹 Contribute to the design and implementation of data pipelines (batch & streaming) using tools like Data Factory and Airflow
🔹 Ensure performance, scalability, and cost optimization of data workloads
🔹 Support and/or lead the evolution of the data platform (Delta Lake, Unity Catalog, governance frameworks)
🔹 Define and implement data governance standards, including data quality, metadata, lineage, and access control
🔹 Collaborate with stakeholders to define data contracts, SLAs, and data quality expectations
🔹 Contribute to data cataloging and metadata management (DataHub, Atlan, OpenMetadata, etc.)
🔹 Work closely with engineering, product, and business teams to enable self-service analytics
💡 Must Have
🔹 4+ years of experience in Data Engineering / Analytics Engineering / Data Architecture
🔹 Strong expertise in SQL and solid understanding of data modeling principles
🔹 Hands-on experience with: Azure (Data Factory, Storage, cloud data services), Databricks (Spark, Delta Lake), Apache Airflow
🔹 Strong understanding of: Lakehouse architecture and Medallion design, data modeling patterns (dimensional modeling, SCDs, domain-driven approaches), ETL / ELT pipelines and data transformation
🔹 Experience working with cloud-based data platforms
🔹 Ability to think in terms of data products and data domains, not only datasets
🔹 Strong problem-solving skills and an ownership mindset
🔹 Ability to collaborate with both technical and business stakeholders
🔹 Fluent English
✨ Nice to Have
🔹 Experience with Unity Catalog, data governance, and access control (RBAC, IAM)
🔹 Exposure to data quality frameworks (Great Expectations, Soda)
🔹 Experience with data catalog tools (DataHub, Atlan, OpenMetadata, Alation)
🔹 Knowledge of Python / PySpark
🔹 Experience with Salesforce, SAP, or other operational systems
🔹 Background in AdTech or digital environments
🔹 Experience with BI tools (Power BI, Tableau, Looker)
🔹 Understanding of metadata-driven architectures and self-service analytics
📍 This role is based in Barcelona. Candidates must be willing to relocate.
🚀 Why join this project?
🤝 People first: diverse and inclusive culture in an international environment.
🌍 Modern cloud platforms and large-scale, global projects.
🌟 High team stability and collaborative culture.
📚 €1,200 per year training budget and continuous learning opportunities.
💰 Flexible compensation model.
🩺 Private health insurance and benefits package.
⚡ Flexible working hours and hybrid model.
🏋️ Wellhub: fitness, wellness, and mental health support.
⚽ Football and paddle tennis teams sponsored by Capitole.
🥳 Team buildings, global events, and strong tech communities.
✨ Want to know more about us? Click here and discover all the details.
👀 Curious about our culture? Check out what people are saying about us on Glassdoor.
💬 We know that not every candidate will meet 100% of the requirements. If your profile doesn't match perfectly but you believe you can add value, we'd still love to hear from you!
🚀 Ready for the challenge? Apply now and be part of a global team driving cloud innovation and security.
Empowering People, Unlocking Innovation.
Information Security Notice
- The employee will have access to confidential information related to Capitole and the assigned project.
- Compliance with internal security and information protection policies is mandatory.
- NDA signature required.