Job Description
STRATEGIC STAFFING SOLUTIONS HAS AN OPENING!
This is a contract opportunity with our company that must be worked on a W2 only; no C2C eligibility for this position. Visa sponsorship is available! The details are below.
“Beware of scams. S3 never asks for money during its onboarding process.”
Job Title: Senior Database Engineer
Contract Length: 18+ months
Schedule: Hybrid (some on-site work)
Location: Charlotte, NC / Iselin, NJ 08830
Pay: $60/hr on W2
We are seeking a Senior-level Database Engineer to design, build, and optimize large-scale data pipelines within a high-volume enterprise data environment. This role supports critical applications tied to fraud and claims analysis, working across legacy and modern cloud platforms.
The environment is undergoing a major transformation from Teradata to Google Cloud Platform (GCP), requiring hands-on engineering expertise in both existing and target-state architectures.
Key Responsibilities
- Design, develop, and maintain scalable ETL/data pipeline solutions
- Work with large-scale datasets (hundreds of terabytes across hundreds of tables)
- Support migration efforts from Teradata to GCP (BigQuery-based ecosystem)
- Build and optimize pipelines using PySpark and ETL frameworks
- Collaborate with stakeholders across fraud and analytics teams to support data needs
- Ensure performance, reliability, and data quality across pipeline workflows
- Troubleshoot and resolve production issues in distributed data environments
- Work within scheduling and orchestration tools to manage pipeline execution
Required Qualifications
- 5+ years of data engineering or software engineering experience
- Strong expertise in:
  - SQL
  - ETL development
  - PySpark
- Hands-on experience with:
  - Autosys (job scheduling)
  - Ab Initio
- Experience building and maintaining large-scale data pipelines
- Ability to work in hybrid environments (on-prem + cloud)
Preferred Qualifications
- Experience with Google Cloud Platform (GCP), especially BigQuery
- Prior experience with Teradata
- Familiarity with Hadoop ecosystem
- Exposure to tools such as Dremio and distributed storage systems
- Cloud certifications (GCP preferred)
Technical Environment
- Current: Teradata-based platform
- Target: GCP (BigQuery ecosystem)
- Tools & Technologies:
  - PySpark
  - Hadoop
  - Ab Initio
  - Autosys
  - Dremio
  - S3-compatible storage systems