Senior Data Engineer and Developer

Afterman Software • United States
Remote
Job Description


About Our Team

The Afterman Software team is made up of problem solvers hungry to perfect new products and systems. We work 100% remotely, with team members across the US. Although we are in separate places, we still make an effort to know one another and have fun! We collaborate, help and mentor each other, and check in frequently on our progress and blockers. We work together primarily with Slack, GitHub, JIRA, Teams, and Zoom. We are a boutique consultancy specializing in solving the toughest problems on high-scale, high-throughput systems across a variety of industries, including finance, healthcare, eCommerce, retail, insurance, supply chain, logistics, and eLearning.


What We’re Looking For

We are seeking a talented and experienced Data Engineer and Developer to play a pivotal role in delivering impactful insights by designing and building data pipelines, integrating multiple platforms (Snowflake, Salesforce, nCino), developing data solutions, and solving challenging data problems to support our rapidly growing business.


Responsibilities:

  • Design, develop, and maintain robust and scalable data pipelines using Azure Data Factory, DBT, and Microsoft Fabric to stage and integrate data into our Enterprise Data Warehouse.
  • Develop integration solutions between Snowflake, Salesforce, Workday, and nCino, ensuring data consistency, reliability, and compliance with business needs.
  • Proactively identify opportunities to reduce data flow complexity and risk while proposing innovative, best-in-class solutions.
  • Implement data engineering solutions following guiding principles and best practices when detailed requirements are unavailable.
  • Monitor and maintain production ETL processes and resolve support issues, ensuring operational excellence and performance.
  • Develop objects (tables, views, stored procedures, triggers) in Snowflake along with tasks to support data processes.
  • Work with BAs and the business to refine requirements and deliver against them within the committed timeframe.
  • Manage and track work items in Azure DevOps (ADO); communicate status and updates to management, stakeholders, and peers to deliver work products on time.

Qualifications:

  • 5–7 years of experience as a data engineer supporting enterprise data warehouse solutions.
  • Hands-on experience with Azure Data Factory, DBT, and Microsoft Fabric for developing modern ETL/ELT pipelines.
  • Experience with Talend and Power BI.
  • Experience at banking institutions or within the financial services industry.
  • Hands-on experience with Snowflake for data warehousing and analytics.
  • Proven experience integrating enterprise systems, particularly Salesforce and nCino.
  • Proficient in SQL, PowerShell, and Python for data transformation and scripting.
  • Demonstrated critical thinking mindset with the ability to independently drive end-to-end solutioning, even in the absence of detailed requirements.
  • Strong knowledge of data governance/controls, data architecture, and data modeling principles.
  • Excellent communication and collaboration skills to translate business requirements into scalable technical solutions.
  • Strong analytical and problem-solving abilities, with a focus on alignment with best practices and standards.
  • Ability to work across multiple concurrent projects in a fast-paced environment.
  • Experience creating scalable ETL solutions using tools such as ADF, DBT, SSIS, or Microsoft Fabric.
  • Familiarity with Agile development methodologies and CI/CD pipelines.

Compensation:

  • $120k–$155k/yr
