Senior DataOps Engineer (Snowflake & BigQuery)

KPI Partners, United States
Remote
AI Summary

We are seeking a highly technical Senior DataOps Engineer to ensure the end-to-end reliability, performance, security, and cost-efficiency of our modern data ecosystem. The ideal candidate will have hands-on experience administering Snowflake and/or BigQuery, and a deep understanding of query optimization, warehouse sizing, and cost controls.

Key Highlights
End-to-end reliability and performance of data ecosystem
Snowflake and BigQuery administration
Data integration and pipeline operations
Technical Skills Required
Snowflake, BigQuery, Fivetran, DBT, Apache Airflow, Data Security, Data Governance
Benefits & Perks
100% Remote work
Contract position for 12 months

Job Description


Title: Senior DataOps Engineer (Snowflake & BigQuery)

Location: 100% Remote – PST Hours (8 AM – 5 PM PST)

Job Type: Contract – 12 Months

Key Skills: Snowflake, Fivetran, DBT, Apache Airflow

Nice to Have: Data Security, Data Governance


About KPI Partners

KPI Partners is a five-time Gartner-recognized data, analytics, and AI consulting company. We are leaders in data engineering on Azure, AWS, Google Cloud, Snowflake, and Databricks. Founded in 2006, KPI has over 400 consultants and has successfully delivered over 1,000 projects for our clients.


About the Role:

We are seeking a highly technical, hands-on Senior DataOps Engineer responsible for the end-to-end reliability, performance, security, and cost-efficiency of our modern data ecosystem. This role serves as the technical guardian of our Snowflake and BigQuery platforms, ensuring seamless data ingestion, robust orchestration, and optimized transformations at scale.

The ideal candidate is not just a user of data platforms and tools, but a deep technical operator and tuner: someone who understands the underlying architectures and proactively optimizes for performance, scalability, and cloud cost efficiency.


Key Responsibilities

1. Snowflake & BigQuery Administration

• Performance Optimization: Proactively monitor and tune query performance by identifying bottlenecks and optimizing clustering strategies, search optimization, materialized views, and warehouse configurations.

• Security & Governance: Design, implement, and maintain complex Role-Based Access Control (RBAC) models to enforce data security, governance, and least-privilege access.

• Cost Management: Implement resource monitors, warehouse auto-scaling policies, and consumption tracking to optimize cloud spend and maximize ROI.
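To make the cost-management expectation concrete: a Snowflake resource monitor fires actions at percentage thresholds of a credit quota. Below is a minimal Python sketch of that threshold logic only; the function name, quota figures, and thresholds are illustrative assumptions, not part of any Snowflake client library.

```python
# Illustrative sketch: mirrors the NOTIFY/SUSPEND semantics of a
# Snowflake resource monitor in plain Python. All names and thresholds
# here are hypothetical, chosen for the example.

def check_warehouse_spend(credits_used, monthly_quota,
                          notify_at=0.75, suspend_at=1.0):
    """Return the action a resource monitor would take at this usage level."""
    ratio = credits_used / monthly_quota
    if ratio >= suspend_at:
        return "SUSPEND"   # warehouse has exhausted its monthly quota
    if ratio >= notify_at:
        return "NOTIFY"    # early warning before the quota is reached
    return "OK"
```

In practice this logic would live in the platform itself, e.g. a `CREATE RESOURCE MONITOR ... TRIGGERS ON 75 PERCENT DO NOTIFY ON 100 PERCENT DO SUSPEND` statement, rather than in application code; the sketch only shows the threshold semantics an administrator tunes.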

2. Data Integration & Pipeline Operations

• Data Ingestion: Provision, manage, and monitor high-volume data ingestion pipelines using Fivetran, Informatica, and Rite Sync, ensuring reliability and data freshness.

• Orchestration: Maintain, monitor, and troubleshoot complex workflows and DAGs in Apache Airflow to ensure timely and dependable data delivery.

• Transformations: Support and optimize DBT models, ensuring efficient transformations, data quality checks, testing, and documentation.
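A recurring task behind these bullets is checking data freshness against an SLA window: each ingested table has a last-successful-sync timestamp, and anything older than the window should raise an alert. A minimal sketch of that check follows; the table names and the six-hour SLA in the example are hypothetical.

```python
from datetime import datetime, timedelta

def stale_tables(last_synced, max_age_hours, now=None):
    """Return, sorted, the tables whose last successful sync is older
    than the SLA window (i.e. candidates for a freshness alert)."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(hours=max_age_hours)
    return sorted(name for name, ts in last_synced.items() if ts < cutoff)

# Example run history: one table synced an hour ago, one yesterday.
checked_at = datetime(2024, 1, 2, 12, 0)
syncs = {
    "orders": datetime(2024, 1, 2, 11, 0),
    "customers": datetime(2024, 1, 1, 9, 0),
}
```

With a six-hour SLA, `stale_tables(syncs, 6, now=checked_at)` returns `["customers"]`. Managed connectors expose the same idea natively (e.g. Fivetran sync status, dbt source freshness checks); the sketch shows only the comparison an operator would wire into alerting.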

3. Operational Excellence & Reliability

• Monitoring & Alerting: Implement advanced monitoring and alerting for pipeline failures, data latency, SLA breaches, and platform health.

• Continuous Improvement: Regularly assess and fine-tune the ingestion, orchestration, and transformation layers to improve reliability, reduce latency, and enhance performance.

• Technical Collaboration: Act as a senior technical peer, contributing to architecture reviews, design discussions, and deep technical problem-solving with cross-functional teams.
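Much of the alerting described above reduces to simple rules over pipeline run history, such as paging only after several consecutive failures rather than on every transient error. A hedged sketch of that rule; the pipeline names and the three-failure threshold are made up for illustration.

```python
# Illustrative alerting rule: page on-call only when a pipeline's most
# recent runs form an unbroken streak of failures. Pipeline names and
# the default threshold are hypothetical.

def pipelines_to_page(run_history, threshold=3):
    """Flag pipelines whose newest runs end in `threshold` or more
    consecutive failures (newest status last in each list)."""
    flagged = []
    for name, runs in run_history.items():
        streak = 0
        for status in reversed(runs):
            if status != "failed":
                break
            streak += 1
        if streak >= threshold:
            flagged.append(name)
    return sorted(flagged)
```

Debouncing alerts this way keeps on-call noise down while still catching genuinely broken DAGs; the threshold is the knob an operator tunes per pipeline criticality.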


Required Qualifications

• Strong hands-on experience administering Snowflake and/or BigQuery in production environments

• Deep understanding of query optimization, warehouse sizing, and cost controls

• Proven experience with Apache Airflow, DBT, and modern ELT/ETL tools

• Experience operating and monitoring data ingestion platforms such as Fivetran and Informatica

• Strong knowledge of data security, RBAC, and governance models

• Ability to troubleshoot complex data pipeline and platform issues end-to-end


Preferred Qualifications

• Experience supporting large-scale, high-volume analytical workloads

• Familiarity with cloud cost optimization strategies in AWS, GCP, or Azure

• Strong SQL expertise and understanding of distributed data architectures

• Experience working in highly collaborative, fast-paced data engineering teams


