Join Nova Kore as a Senior Data Engineer to build and scale Snowflake-native data and ML pipelines, leveraging Cortex's emerging AI/ML capabilities. You will design, build, and maintain DBT models, macros, and tests, and collaborate with data scientists and ML engineers to productionise Cortex workloads in Snowflake.
Job Description
I’m helping Nova Kore find a top candidate to join their team in a flexible capacity for the role of Senior Data Engineer - DBT, Snowflake & Cortex CLI.
You will architect AI-driven data pipelines, shaping the future of Snowflake ML.
Compensation:
USD 35–70 per hour
Location:
Remote: India
Mission of Nova Kore:
"Connecting people and companies to create meaningful, transformative career and business opportunities."
What makes you a strong candidate:
- You have 3+ years of experience with testing, Snowflake, and performance tuning.
- You are proficient in SQL, Python, and data engineering, including orchestration tools such as Prefect and Dagster.
- You have conversational English.
Responsibilities and more:
Senior Data Engineer / Analytics Engineer (India-based). We are partnering with a cutting-edge AI research lab to hire a Senior Data/Analytics Engineer with expertise across DBT and Snowflake’s Cortex CLI. In this role, you will build and scale Snowflake-native data and ML pipelines, leveraging Cortex’s emerging AI/ML capabilities while maintaining production-grade DBT transformations. You will work closely with data engineering, analytics, and ML teams to prototype, operationalise, and optimise AI-driven workflows, defining best practices for Snowflake-native feature engineering and model lifecycle management. This is a high-impact role within a modern, fully cloud-native data stack.
Responsibilities
• Design, build, and maintain DBT models, macros, and tests following modular data modeling and semantic best practices.
• Integrate DBT workflows with Snowflake Cortex CLI, enabling:
* Feature engineering pipelines.
* Model training and inference tasks.
* Automated pipeline orchestration.
* Monitoring and evaluation of Cortex-driven ML models.
• Establish best practices for DBT–Cortex architecture and usage patterns.
• Collaborate with data scientists and ML engineers to productionise Cortex workloads in Snowflake.
• Build and optimise CI/CD pipelines for DBT (GitHub Actions, GitLab, Azure DevOps).
• Tune Snowflake compute and queries for performance and cost efficiency.
• Troubleshoot issues across DBT artifacts, Snowflake objects, lineage, and data quality.
• Provide guidance on DBT project governance, structure, documentation, and testing frameworks.
Required Qualifications
• 3+ years of experience with DBT Core or DBT Cloud, including macros, packages, testing, and deployments.
• Strong expertise with Snowflake (warehouses, tasks, streams, materialised views, performance tuning).
• Hands-on experience with Snowflake Cortex CLI, or strong ability to learn it quickly.
• Strong SQL skills; working familiarity with Python for scripting and DBT automation.
• Experience integrating DBT with orchestration tools (Airflow, Dagster, Prefect, etc.).
• Solid understanding of modern data engineering, ELT patterns, and version-controlled analytics development.
Nice-to-Have Skills
• Prior experience operationalising ML workflows inside Snowflake.
• Familiarity with Snowpark, Python UDFs/UDTFs.
• Experience building semantic layers using DBT metrics.
• Knowledge of MLOps/DataOps best practices.
• Exposure to LLM workflows, vector search, and unstructured data pipelines.
Why Join
• You will be an hourly contractor through Mercor, working 20–40 hours per week with flexibility.
• Direct opportunity to build next-generation Snowflake AI/ML systems with Cortex.
• High-impact ownership of DBT and Snowflake architecture across production pipelines.
• Work alongside top-tier ML engineers, data scientists, and research teams.
• Fully remote, high-autonomy environment focused on innovation, velocity, and engineering excellence.