Join a high-impact data initiative as a Senior Data Architect, designing and implementing scalable, cloud-native data architectures and collaborating with data engineering and business stakeholder teams.
The Details
❌ Not open to third-party agencies
📍 Fully remote within the US
📅 6-month contract with strong extension potential
✅ Must be authorized to work in the US
About the Role
We are looking for a Senior Data Architect to join a high-impact data initiative on a contract basis. This is a hands-on architecture role working within a modern cloud-native data environment, with real influence over design decisions and data strategy.
If you're an experienced Data Architect who thrives in fast-moving environments and wants to work on genuinely interesting problems across cloud platforms, semantic modeling, and AI/LLM integration — this is worth your time.
What You'll Be Doing
- Designing and implementing scalable, cloud-native data architectures across enterprise environments
- Leading Lakehouse and Medallion architecture design using Databricks and Delta Lake
- Building and optimizing semantic models to support business intelligence and advanced analytics
- Integrating AI and LLM capabilities into existing data pipelines and architecture
- Collaborating with data engineering, analytics, and business stakeholder teams
- Defining data modeling standards, best practices, and governance frameworks
- Advising on platform selection and cloud data strategy across Azure, AWS, or GCP
What We're Looking For
Required:
- 7+ years of experience in data architecture or senior data engineering
- Strong hands-on experience with at least one major cloud platform — Azure (Synapse, Fabric, ADF), AWS (Redshift, Glue, Lake Formation), or GCP (BigQuery, Dataflow)
- Proven experience with Databricks and Delta Lake in production environments
- Semantic modeling expertise — dbt metrics layer, LookML, Microsoft Analysis Services, or Power BI semantic models
- Solid understanding of Lakehouse and Medallion architecture patterns
- Experience with data pipeline and orchestration tools — Apache Airflow, dbt Core, or Kafka
- Strong Python or SQL skills
- Excellent communication skills with the ability to present to senior stakeholders
Nice to Have:
- Hands-on AI/LLM experience — RAG pipelines, Azure OpenAI, LangChain, or vector databases (Pinecone, Weaviate)
- Relevant certifications — Databricks Certified, Azure DP-203, AWS Data Analytics
- Prior consulting experience
- Experience with Microsoft Fabric