GCP Data Engineer | £60,000 basic + package | Remote-first | London
About the Client
This is a chance to join a global business on the ground floor of a brand-new R&D function focused on the cutting edge of Synthetic Data Generation. Be one of the first on board and help shape the future of this exciting initiative!
About the Role
We're looking for a skilled and enthusiastic GCP Data Engineer to join this pioneering team. Initially, you'll focus on enabling data access for Data Scientists and building a local BigQuery data warehouse. As the function evolves, you'll move into the MLOps space, working with tools like Vertex AI. This role is perfect for someone eager to learn, grow, and contribute to a truly innovative project.
Responsibilities:
* Provide data access and support to the Data Science team.
* Design and build a BigQuery data warehouse.
* Develop and maintain data pipelines using Airflow.
* Contribute to the development of MLOps processes and tools.
* Work with Vertex AI and other GCP technologies.
* Utilize Terraform to manage and automate infrastructure.
What you need:
* Proven experience with GCP, particularly BigQuery.
* Strong data engineering skills, including data warehousing and ETL.
* Experience with data pipeline development, ideally with Airflow.
* Experience with infrastructure as code, using tools like Terraform.
* An interest in MLOps and Synthetic Data Generation.
* A desire to learn and grow with an innovative team.
To Apply:
If you're a motivated GCP Data Engineer who's ready to embark on an exciting journey, apply now!