Seeking an AWS Data Engineering Architect to design and implement scalable data solutions on AWS for a global Fortune 500 food services company. Requires 7+ years of experience, strong Python and SQL skills, and expertise in AWS services like S3, DynamoDB, Lambda, and Glue. Responsibilities include building data pipelines, designing cloud-native architectures, and ensuring data security.
Job Description
[W2 ONLY]
[REMOTE]
[NO IMMIGRATION SPONSORSHIP]
AWS Data Engineer - Architect / Charlotte, NC or REMOTE / CTH
About our Customer & Role:
Our direct customer, a global Fortune 500 company and a leader in the food services industry, is looking for a Data Engineering Architect who will be responsible for designing and implementing robust, scalable, and high-performing data solutions on AWS, and for ensuring the cloud data infrastructure meets the needs of a growing organization.
KEY SKILLS: Python, Angular/TypeScript, AWS services (S3, DynamoDB, Athena, Lambda, and Glue), PostgreSQL, and Snowflake.
Qualifications:
- 7+ years of experience in data architecture, engineering, or similar roles.
- Very strong programming skills in Python.
- Experience in an ETL or data engineering role, building and implementing data pipelines.
- Strong understanding of design best practices for OLTP systems, ODS reporting needs, and dimensional database practices.
- Hands-on experience with AWS Lambda, AWS Glue, and other AWS services.
- Proficient in SQL, with the ability to write efficient queries.
- Experience with API-driven data access (API development experience a plus).
- Solid experience with database technologies (SQL, NoSQL) and data modeling.
- Understanding of serverless architecture benefits and challenges.
- Experience working in agile development environments.
- AWS certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect) are highly desirable.
Preferred Skills:
- Experience with modern data stack technologies (e.g., DBT, Snowflake, Databricks).
- Familiarity with machine learning pipelines and AI-driven analytics.
- Background in DevOps practices and Infrastructure as Code (IaC) using tools like Terraform or AWS CloudFormation.
- Knowledge of CI/CD pipelines for data workflows.
Responsibilities:
- Define, build, test, and implement scalable data pipelines.
- Design and implement cloud-native data architectures on AWS, including data lakes, data warehouses, and real-time data processing pipelines.
- Perform data analysis to troubleshoot and help resolve data-related issues.
- Collaborate with development, analytics, and reporting teams to develop data models that feed business intelligence tools.
- Design and build API integrations to support the needs of analysts and reporting systems.
- Develop, deploy, and manage AWS Lambda functions and AWS Glue jobs written in Python.
- Ensure efficient and scalable serverless operations.
- Debug and troubleshoot Lambda functions and Glue jobs.
- Collaborate with other AWS service teams to design and implement robust solutions.
- Optimize data storage, retrieval, and pipeline performance for large-scale distributed systems.
- Ensure data security, compliance, and privacy policies are integrated into solutions.
- Develop and maintain technical documentation and architecture diagrams.
- Stay current with AWS updates and industry trends to continuously evolve the data architecture.
- Mentor and provide technical guidance to junior team members and stakeholders.