Databricks Platform Engineer

Signify Technology • United States
Remote • Relocation
AI Summary

Design and implement technical solutions on the Lakehouse platform (Databricks) to support the Data Products strategy and the Data & AI ecosystem. Collaborate with stakeholders and teams to ensure alignment with business objectives and to deliver secure, scalable, and reliable solutions.

Key Highlights
Collaborate with stakeholders and teams
Design and implement technical solutions on Databricks
Support Data Products strategy and Data & AI ecosystem
Technical Skills Required
Databricks, PySpark, Python, SQL, Git, ServiceNow, Dynatrace, Stonebranch, Control-M, Airflow, Great Expectations, Monte Carlo
Benefits & Perks
Remote work
Relocation to Abu Dhabi available

Job Description
Job title: Databricks Platform Engineer

Job type: B2B

Contract Length: 12 months

Role Location: Remote / Relocation to Abu Dhabi available

Client Industry: IT Consultancy / Financial Services

Role And Responsibilities

  • Collaborate with stakeholders during requirements clarification and sprint planning to ensure alignment with business objectives.
  • Design and implement technical solutions on the Lakehouse platform (Databricks), including:
      • Prototyping new Databricks capabilities.
      • Exposing these capabilities to support the Data Products strategy and the Data & AI ecosystem.

  • Integrate data platforms with enterprise tools, including:
      • Incident and monitoring systems (e.g., ServiceNow).
      • Identity management solutions.
      • Data observability tools (e.g., Dynatrace).
  • Develop and maintain unit and integration tests to ensure quality and resilience.
  • Support QA teams during acceptance testing.
  • Act as a third-line engineer for production incidents, ensuring system stability and uptime.
  • Collaborate with cloud and infrastructure teams to deliver secure, scalable, and reliable solutions.


Job Requirements

  • Expert knowledge of Databricks.
  • Proficient in PySpark for distributed computing.
  • Proficient in Python for library development.
  • Advanced SQL skills for complex query optimisation (e.g., Oracle, MS SQL).
  • Experience with Git for version control.
  • Familiarity with monitoring tools (e.g., ServiceNow, Prometheus, Grafana).
  • Knowledge of scheduling tools (e.g., Stonebranch, Control-M, Airflow).
  • Proficiency in data quality frameworks (e.g., Great Expectations, ideally Monte Carlo).
  • Solid understanding of cloud infrastructure fundamentals (DNS, certificates, identity, load balancing).
  • Note: The primary focus is building platform capabilities rather than writing ETL pipelines.
  • Agile Practices: Comfortable with sprint planning, stand-ups, and retrospectives.
  • Collaboration Tools: Skilled in Azure DevOps for project management.
  • Problem-Solving: Strong debugging and troubleshooting skills for complex data engineering issues.
  • Communication: Exceptional written and verbal skills, able to explain technical concepts to non-technical stakeholders.
