At Zimmer Biomet, we believe in pushing the boundaries of innovation and driving our mission forward. As a global medical technology leader for nearly 100 years, we enhance a patient’s mobility with a Zimmer Biomet product or technology every 8 seconds.
As a Zimmer Biomet team member, you will share in our commitment to providing mobility and renewed life to people around the world. We focus on development opportunities, robust employee resource groups (ERGs), a flexible working environment, location-specific competitive total rewards, wellness incentives, and a culture of recognition and performance awards. We are committed to creating an inclusive environment where every team member feels respected, empowered, and recognized.
What You Can Expect
The Senior Data Platform Engineer will develop data platforms that serve our development teams and data products, improving the quality of care and life for orthopaedic patients worldwide. Our Connected Health AI and Data Science Team’s platform is evolving from batch processing to accommodate real-time and Generative AI solutions. The Senior Data Platform Engineer will play a key role in driving this growth and adoption.
How You'll Create Impact
* Collaborate with machine learning scientists and engineers to develop data-serving platforms on Microsoft Azure that process real-time and batch data from diverse sources;
* Integrate third-party platforms and software to streamline data onboarding and manipulation;
* Identify system improvements to enhance backend efficiency and user experience;
* Develop reusable architectures and infrastructure using Infrastructure as Code for current and future products;
* Ensure regulatory compliance through automation;
* Contribute to shaping the technological roadmap of the Connected Health team.
What Makes You Stand Out
Proficiency in:
* Apache Spark for data ingestion and transformation;
* SQL databases;
* Infrastructure as Code tools such as Terraform, Pulumi, or Bicep;
* Git and CI/CD pipelines.
Your Background
Experience with:
* Working in cross-disciplinary teams, supported by strong communication skills;
* Programming in Python and related data engineering packages;
* Machine learning techniques and applications;
* Apache Spark and Apache Airflow for ETL pipelines;
* Developing cloud-native applications;
* Monitoring data pipelines, applications, and infrastructure with tools such as New Relic;
* Infrastructure concepts such as VMs and networking;
* Communicating technical analyses effectively;
* Learning new technologies and methodologies.
Beneficial but not essential experience in:
* Healthcare data management;
* Deploying applications on Kubernetes or managed container environments;
* Azure data tools like Synapse or Fabric;
* Delivering software or AI/ML solutions in regulated environments.
This role is home-based with a remote-first culture. The team meets biweekly in Central London, with flexible arrangements. Occasional evening meetings and travel to the U.S. and Europe may be required.