Seeking a Senior Data Engineer to own and maintain the data infrastructure, focusing on reliability and incremental improvements. Responsibilities include Snowflake and dbt management, semantic layer development, and AI-augmented engineering. Requires 5+ years of experience and fluency in Romanian and English.
Job Description
We are looking for a Senior Data Engineer to take ownership of our data infrastructure, with a strong focus on maintenance, reliability, and incremental improvements across existing and new data projects.
The role is fully remote within Romania. Fluency in Romanian and English (C1) is required.
This is a full-time role, with ongoing responsibilities for pipeline maintenance and close integration into the core product and data team. You will inherit a mature data stack and be responsible for keeping it secure, scalable, and well-governed, while also contributing to new initiatives as they arise.
Key Responsibilities
Data Platform Ownership & Maintenance
- Own the day-to-day reliability and maintenance of data pipelines and the analytics stack.
- Ensure data freshness, correctness, and availability for downstream consumers.
- Proactively identify and resolve pipeline failures, performance issues, and data quality risks.
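The freshness-monitoring duty above could be automated along these lines. A minimal sketch, assuming per-table freshness SLAs; the table names and thresholds are hypothetical, and in practice the load timestamps would come from warehouse metadata rather than a hard-coded dict:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLAs per table (names and windows are illustrative).
FRESHNESS_SLA = {
    "raw_events": timedelta(hours=2),
    "dim_customers": timedelta(hours=24),
}

def stale_tables(last_loaded: dict, now: datetime) -> list:
    """Return tables whose most recent load exceeds their freshness SLA."""
    return sorted(
        table
        for table, loaded_at in last_loaded.items()
        if now - loaded_at > FRESHNESS_SLA.get(table, timedelta(hours=24))
    )

# Example: raw_events was loaded 3 hours ago, breaching its 2-hour SLA.
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
loads = {
    "raw_events": now - timedelta(hours=3),
    "dim_customers": now - timedelta(hours=6),
}
print(stale_tables(loads, now))  # ['raw_events']
```

A check like this would typically run on a schedule and page on a non-empty result.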
Snowflake & Data Warehouse Management
- Own ingestion workflows (e.g. AWS S3 to Snowflake), including COPY commands, transformations, and monitoring.
- Support cost optimization and performance tuning within the warehouse.
Transformation & Modeling Layer
- Maintain and extend dbt models, including:
- Modeling standards and best practices
- Testing and validation logic
- Deployment workflows
- Own the semantic layer (Cube):
- Maintain metric definitions used by APIs and downstream tools
- Manage version control and deployments via GitHub
- Ensure consistency between warehouse models and exposed metrics
External Data & Tooling
- Own orchestration and configuration of external data tools (e.g. Fivetran, Hex):
- Connector setup and monitoring
- Sync schedules and dependencies
- Impact analysis of schema or source changes
- Act as technical owner for tool integrations, including API keys, service accounts, and vendor interactions.
Analytics & Product Data Support
- Support and maintain Metabase:
- Ownership of key collections and saved queries
- Data model alignment between Snowflake and end-user dashboards
- Collaborate on product analytics:
- Review and maintain Amplitude instrumentation plans
- Ensure event definitions are consistent and correctly synced into the warehouse
AI-Augmented Engineering
- Use LLMs and AI coding assistants as standard tools in your daily development workflow.
- Design, build, and maintain agentic workflows that automate repetitive data engineering tasks such as pipeline monitoring, anomaly detection, schema change analysis, and documentation generation.
- Evaluate and integrate new AI-powered tools into the data stack where they improve velocity, quality, or reliability.
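The schema-change analysis mentioned above can be reduced to a small diff over column metadata. A minimal sketch; the column names and types are hypothetical, and real inputs would come from information_schema queries or connector metadata rather than literals:

```python
def schema_diff(old: dict, new: dict) -> dict:
    """Compare two column->type mappings and classify the changes.

    Removed or retyped columns are the ones most likely to break
    downstream models; added columns are usually safe.
    """
    return {
        "added": sorted(set(new) - set(old)),
        "removed": sorted(set(old) - set(new)),
        "retyped": sorted(
            col for col in set(old) & set(new) if old[col] != new[col]
        ),
    }

# Illustrative before/after snapshots of a source table.
old = {"id": "NUMBER", "email": "VARCHAR", "signup_date": "DATE"}
new = {"id": "NUMBER", "email": "VARCHAR", "signup_ts": "TIMESTAMP_NTZ"}
diff = schema_diff(old, new)
print(diff)  # {'added': ['signup_ts'], 'removed': ['signup_date'], 'retyped': []}
```

An agentic workflow could run such a diff on every sync and open a ticket (or draft a dbt model change) when "removed" or "retyped" is non-empty.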
Collaboration & Documentation
- Work closely with product, analytics, and engineering teams as a fully integrated team member.
- Maintain clear documentation for pipelines, models, permissions, and operational processes.
- Provide guidance and best practices to analysts and stakeholders consuming data.
Required Experience & Skills
- 5+ years of experience as a Data Engineer, with senior-level ownership of production data systems
- Strong hands-on experience with:
- Snowflake (performance, security, cost awareness)
- dbt (modeling, testing, deployments)
- Cloud data ingestion patterns (especially AWS S3 → warehouse)
- Experience with modern analytics stacks and tools such as:
- Fivetran or similar ELT tools
- Semantic layers (e.g. Cube or equivalent)
- BI tools (Metabase, Looker, etc.)
- Demonstrated experience using LLMs as part of a daily engineering workflow: not just casual use, but integrated into how you build, test, and ship.
- Solid understanding of data modeling, metrics governance, and analytics enablement
- Comfortable working in Git-based workflows and production environments
- Strong ownership mindset and ability to work independently in a fully remote setup
Nice to Have
- Experience with product analytics platforms (e.g. Amplitude)
- Familiarity with cost optimization and warehouse scaling strategies
Suitable candidates will be contacted within 10 days. We run a fast, lean recruitment process.