The Data Engineer role involves designing, building, and maintaining data pipelines, dimensional models, and ETL/ELT processes. The role requires expertise in data quality, security, and governance, as well as experience with cloud-based data platforms and tools. The ideal candidate will have 5+ years of experience as a Data Engineer and/or Data Analyst.
Job Description
Role: Data Engineer (REMOTE)
! STRICTLY DO NOT APPLY IF YOU'RE NOT IN CANADA AND HAVE NO WORK AUTHORIZATION !
! PLEASE READ THE ENTIRE MESSAGE BELOW !
(NOTE: We are considering Canadian Citizens and PR holders. For candidates who are on Work Permits and awaiting a PR invitation, we don’t provide a T4 because this is a contract job and the rate we provide is for incorporation. Hence, PR points do not apply, for example if you’re applying for PR under the CEC class.)
Client: Government of Alberta
Incorporation Rate: $70 CAD/hr (Canada or Provincial incorporation is a MUST)
Respond by: April 6, 2026
Contract period: May 25, 2026 to April 30, 2027, with a possible 24-month extension.
Anticipated Interview Dates
• Approximately two weeks after the posting closes (estimate).
Scoring Methodology:
Qualifications - 20%
Other Mandatory Requirements - 20%
Interview - 50%
Pricing - 10%
Other Mandatory Requirements
Two (2) project examples must be provided for each proposed resource, which exemplify and demonstrate the proposed resource’s expertise in the selected service area (Data Engineering). Project examples need to be added to the bottom of the resume.
Questions 1 through 5 must be answered for each project example. The Evaluation Team must be able to determine which project any given answer relates to. Where the answer to a Question is the same for both projects, this must be clearly stated.
1. Provide an overview of the project/assignment the proposed resource or the proposed resource’s team was engaged in that demonstrates expertise in the selected service area and role. The overview should clearly describe the data problem being addressed and the proposed resource’s responsibilities from a Data Engineering perspective.
2. Describe the sector(s) (i.e. public, private or other) the project/assignment served, including any data sensitivity, regulatory, or privacy considerations relevant to the work.
3. Identify the project/assignment size in dollar value (i.e. less than $100,000, less than $500,000, less than $1,000,000 or greater than $1,000,000).
4. Describe the approach for the design, development, mitigation of risk and delivery of the project/assignment, including any special considerations with respect to methodology or processes. In the context of Data Engineering, include how data pipelines, data quality, performance, reliability, and operational considerations were addressed. In providing a response, consider quality assurance and communication across the cross-functional team.
5. Provide a list of specific skills, tools and/or technology used within the project/assignment, particularly those related to Data Engineering. Clearly identify the tools and technologies the proposed resource personally worked with.
SUBMISSION MUST INCLUDE:
• RESUME
• ALL REQUIRED EXPERIENCE MUST BE DESCRIBED IN RESUME UNDER THE JOB/PROJECT WHERE EXPERIENCE WAS ATTAINED.
• EACH JOB/PROJECT MUST CONTAIN THE TERM OF THE JOB/PROJECT IN THE FORMAT MMM/YYYY to MMM/YYYY.
• THREE REFERENCES, FOR WHOM SIMILAR WORK HAS BEEN PERFORMED, MUST BE PROVIDED. THE MOST RECENT REFERENCE SHOULD BE LISTED FIRST. REFERENCE CHECKS MAY OR MAY NOT BE COMPLETED TO ASSIST WITH SCORING THE PROPOSED RESOURCE.
SKILL MATRIX:
MUST HAVE WORK EXPERIENCE:
3+ years of experience ensuring data quality, security, and governance.
5+ years of experience as a Data Engineer and/or Data Analyst.
3+ years of experience designing efficient dimensional models (star and snowflake schemas) for warehousing and analytics.
3+ years of experience developing and maintaining reports, dashboards, and visualizations using Power BI, DAX, Tableau, or Python libraries.
5+ years of experience manipulating and extracting data from diverse on-premises and cloud-based sources.
3+ years of experience performing migrations across on-premises, cloud, and cross-database environments.
2+ years of experience using Git, collaborative workflows, CI/CD pipelines, containerization (Docker/Kubernetes), and Infrastructure as Code (Terraform, ARM, CloudFormation) to deploy and migrate data solutions.
3+ years of experience with SSIS, Azure Data Factory (ADF), and using APIs for extracting and integrating data across multiple platforms and applications.
2+ years of experience in application development, with knowledge of object-oriented and functional programming/scripting languages.
1+ years of experience in the Government of Alberta environment or an environment of equivalent size and complexity.
2+ years of experience with databases and data integration, including PostgreSQL, MongoDB, and Azure Cosmos DB, and data integration tools such as Synapse Pipelines, Fabric Data Factory, Informatica, Talend, dbt, and Airbyte.
1+ years of exposure to AI/ML tools and workflows relevant to data engineering, such as integrating AI-driven analytics or automation within cloud platforms like Databricks and Azure.
Job Description:
Scope of Services:
Services and project deliverables should evolve as the work progresses, in response to emerging user and business needs, as well as design and technical opportunities. However, the following must be delivered (iteratively) over the course of the project:
Data Engineering:
• Design, build, and maintain data pipelines on-premises and in the cloud (Azure, GCP, AWS) to ingest, transform, and store large datasets. Ensure pipelines are reliable and support multiple business use cases.
• Create and optimize dimensional models (star/snowflake) to improve query performance and reporting. Ensure models are consistent, scalable, and easy for analysts to use.
• Integrate data from SQL, NoSQL, APIs, and files while maintaining accuracy and completeness. Apply validation checks and monitoring to ensure high-quality data.
• Improve ETL/ELT processes for efficiency and scalability. Redesign workflows to remove bottlenecks and handle large, disconnected datasets.
• Build and maintain end-to-end ETL/ELT pipelines with SSIS and Azure Data Factory. Implement error handling, logging, and scheduling for dependable operations.
• Automate deployment, testing, and monitoring of ETL workflows through CI/CD pipelines. Integrate releases into regular deployment cycles for faster, safer updates.
• Manage data lakes and warehouses with proper governance. Apply security best practices, including access controls and encryption.
• Partner with engineers, analysts, and stakeholders to translate requirements into solutions. Prepare curated data marts and fact/dimension tables to support self-service analytics.
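As an illustrative sketch only (not part of the posting), the validation checks described above might look like the following. The function and field names are hypothetical; a real pipeline would typically express this inside ADF, SSIS, or a dbt test rather than standalone Python:

```python
# Minimal sketch of a row-level data-quality gate applied before loading.
# "id" and "amount" are hypothetical required fields, not from the posting.

def validate_rows(rows, required_fields):
    """Split rows into valid and rejected based on required, non-empty fields."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(field) not in (None, "") for field in required_fields):
            valid.append(row)
        else:
            rejected.append(row)  # quarantined for logging and follow-up
    return valid, rejected

sample = [
    {"id": "1", "amount": "250.00"},
    {"id": "2", "amount": ""},  # missing amount, so it is rejected
]
valid, rejected = validate_rows(sample, ["id", "amount"])
print(len(valid), len(rejected))  # → 1 1
```

In practice the rejected rows would be written to a quarantine table and surfaced through pipeline monitoring, rather than silently dropped.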
Data Analytics:
• Analyze datasets to identify trends, patterns, and anomalies. Use statistical methods, DAX, Python, and R to generate insights that inform business strategies.
• Develop interactive dashboards and reports in Power BI using DAX for calculated columns and measures. Track key performance metrics, share service dashboards, and present results effectively.
• Build predictive or descriptive models using statistical, Python, or R-based machine learning methods. Design and integrate data models to improve service delivery.
• Present findings to non-technical audiences in clear, actionable terms. Translate complex data into business-focused insights and recommendations.
• Deliver analytics solutions iteratively in an Agile environment. Mentor teams to enhance analytics fluency and support self-service capabilities.
• Provide data-driven evidence to guide corporate priorities. Ensure strategies and initiatives are backed by strong analysis, visualizations, and models.
Equipment Requirements:
• The resource must provide their own computer/laptop and related equipment. The computer's operating system must be a modern version of Windows or macOS that is compatible with Azure Virtual Desktop (AVD) and other related software for remote access; Windows is preferred due to better compatibility. AVD and related software will be installed on the resource's computer.
Working Hours:
• Standard hours of work are 08:15 – 16:30 Alberta time, Monday through Friday, excluding holidays observed by the Province.
• Work must be done from within Canada, due to network and data security issues.
• It is anticipated the role will be 100% remote; however, in the event of an on-site meeting, the GoA does not pay for travel to attend on-site meetings, nor any expenses related to relocation, commuting, housing/accommodation, or food/drink.
Notes on Location:
• Resource will work remotely.