Design and implement scalable data pipelines using AWS services. Build end-to-end AI models and agents from data preprocessing through deployment and monitoring. Lead technical delivery with client stakeholders and mentor engineering teams.
Job Description
Duration: 6+ month contract
Compensation: $75-95/hr
Location: 100% remote - EST hours
Key Responsibilities
- Design scalable data pipelines using AWS services including S3, Glue, Lambda, Kinesis, Redshift, Step Functions
- Build end-to-end AI models and agents from data preprocessing through deployment and monitoring
- Implement large-scale data processing solutions using platforms such as Databricks and Snowflake
- Apply Generative AI architectures including large language model integration and retrieval-augmented generation
- Establish CI/CD pipelines for data systems within DevOps environments
- Lead technical delivery with client stakeholders and mentor engineering teams
Requirements
- 7+ years data engineering experience
- Experience with Amazon Web Services (AWS) data architecture
- Experience building AI models and GenAI architectures
- Strong Python programming skills
Preferred Skills
- AWS Certified Solutions Architect or equivalent certification
- Databricks certification
- AWS SageMaker experience
Technical Skills Required
Airflow; AWS (Glue, Kinesis, Lambda, Redshift, S3, Step Functions); CI/CD pipelines; Databricks; DevOps practices; Generative AI (LLMs, RAG); Python; Snowflake; Spark; SQL
Job ID: 26-00412