Job Description
Job Title: Data Architect
Experience: 13 to 15 Years
Contract Duration: 6 Months
Work Mode: Remote
Work Timing: US business hours (overlap window ending 10:30 PM IST)
Location: Remote / Open
Role Overview
We are seeking a highly experienced Data Architect with 7–10 years of relevant, proven architecture ownership who can hit the ground running from day one. This role demands an execution-focused architect who is equally comfortable designing scalable data platforms and driving implementation independently in complex, ambiguous environments. The ideal candidate will bring strong leadership, deep technical expertise, and a bias for action.
Mandatory Skills — Technical
•      ETL/ELT and SQL coding using cloud-based database solutions such as Azure SQL, Synapse, Redshift, Snowflake, or similar.
•      Design and develop star-schema data models and ETL/ELT jobs to support use cases across typical business domains.
•      Apply best-practice techniques across data modelling, table-driven control of transformation jobs, and dynamic ETL/ELT jobs that scale to expanding use cases.
•      Establish and maintain reporting dashboard solutions for enterprise-wide and executive analytics, including maintenance of back-end transformation layers and data pipelines.
•      Proficiency in programming languages such as Python, Java, or similar.
Key Responsibilities
•      Own end-to-end data architecture design from conceptualization to deployment.
•      Lead and execute the design of scalable, performant data platforms on cloud infrastructure.
•      Develop and maintain enterprise data models, including star and snowflake schemas.
•      Build and optimize ETL/ELT pipelines for high-volume, complex data environments.
•      Collaborate with business stakeholders to translate requirements into robust data solutions.
•      Drive best practices in data modelling, pipeline orchestration, and data quality assurance.
•      Establish governance standards for data assets, metadata, and documentation.
•      Support and mentor junior data engineers and analysts as needed.
•      Provide executive-level reporting and analytical dashboards for business insights.
•      Proactively identify and resolve data architecture bottlenecks and inefficiencies.
Required Qualifications
•      13–15 years of overall experience in data engineering, data warehousing, or data architecture.
•      7–10 years of proven hands-on data architecture ownership in enterprise environments.
•      Hands-on expertise with cloud data platforms: Azure Synapse, Azure SQL, Snowflake, Amazon Redshift, or equivalent.
•      Strong SQL coding skills for complex transformations, performance tuning, and schema design.
•      Demonstrated experience with ETL/ELT tools and frameworks.
•      Proficiency in Python and/or Java for data pipeline development.
•      Experience with enterprise reporting and BI tools (e.g., Power BI, Tableau, or similar).
•      Deep understanding of dimensional modelling, data vault, and relational design patterns.
•      Ability to work independently in ambiguous, fast-paced environments with minimal supervision.
Preferred Qualifications
•      Experience with data orchestration tools such as Apache Airflow, Azure Data Factory, or AWS Glue.
•      Exposure to real-time/streaming data architectures (Kafka, Spark Streaming, etc.).
•      Familiarity with DevOps / DataOps practices including CI/CD pipelines for data.
•      Experience working with cross-functional, globally distributed teams.
•      Cloud certifications (Azure, AWS, GCP) are a plus.
Engagement Details
Contract Type: Contract — 6 Months
Work Hours: US Business Hours (workday ends at 10:30 PM IST to allow overlap for calls)
Work Mode: Fully Remote
Start: Immediate / ASAP
Note: Candidates must be available to attend calls/meetings during US business hours. The work schedule is structured to accommodate a meaningful overlap window ending at 10:30 PM IST.