Senior ETL AWS Lead Engineer

BURGEON IT SERVICES • United States
AI Summary

We are seeking a Senior ETL AWS Lead Engineer to design, build, and optimize large-scale data pipelines and data platforms on AWS using PySpark and cloud-native services. The ideal candidate has strong skills in architecture, engineering best practices, and hands-on data pipeline development.

Key Highlights
Design and build large-scale data pipelines and data platforms on AWS
Optimize data pipelines and ETL workflows
Lead performance engineering and security compliance
Technical Skills Required
PySpark, SparkSQL, AWS services (Kinesis, DMS, S3, RDS, Redshift, DynamoDB, Glue, EMR, Athena, SageMaker, Bedrock, EC2, Lambda, ECS, IAM, KMS, SSE), Python, REST APIs, API Gateway, Node.js, GitHub, Linux/Unix scripting, JSON, Parquet, Avro, Docker, Kubernetes, Delta Lake, data quality frameworks
Benefits & Perks
Hybrid work arrangement
Relocation accepted

Job Description


Job Title: ETL AWS Lead Engineer (Technology Lead – Data on Cloud)

Location: Richardson, TX (Hybrid; relocation accepted)

Duration: 12+ months (W2 only)

Experience: 15+ Years

Certification: AWS Required

Please share resumes with pranay@burgeonits.com

We are seeking a senior ETL AWS Lead Engineer to design, build, and optimize large-scale data pipelines and data platforms on AWS using PySpark and cloud-native services. The candidate must be strong in architecture, engineering best practices, and hands-on data pipeline development.

Required Skills:

  • 5+ years ETL & big data engineering using PySpark & SparkSQL
  • Strong SQL with query tuning (Oracle, SQL Server, Teradata)
  • AWS services experience across:
      • Ingestion: Kinesis, DMS
      • Storage: S3, RDS, Redshift, DynamoDB
      • Analytics/ML: Glue, EMR, Athena, SageMaker, Bedrock
      • Compute: EC2, Lambda, ECS
      • Security: IAM, KMS, SSE
  • Python, REST APIs (API Gateway, Node.js)
  • CI/CD using GitHub, Linux/Unix scripting
  • File formats: JSON, Parquet, Avro
  • Exposure to Docker/Kubernetes, Delta Lake, data quality frameworks

Nice to Have:

  • Team leadership & mentoring experience
  • Agile delivery and cross-team collaboration
  • Strong communication & stakeholder management

Preferred:

  • BI tools (QuickSight, Tableau)
  • Data governance/security/compliance knowledge
  • Experience with performance & observability for data pipelines

Key Responsibilities:

  • Build and optimize AWS data pipelines and ETL workflows
  • Design AWS-based data lake/data warehouse models
  • Lead performance engineering & security compliance
  • Mentor team on AWS & ETL best practices
  • Support testing, RCA, and technical documentation

