We are looking for a Senior Data Engineer to collaborate closely with our business and engineering teams to design, implement, and maintain data engineering solutions while ensuring security and data best practices. You will need to be able to work independently, and you will be joining and mentoring a team of 5 excellent engineers. Experience working within the Insurance/Insurtech/Payments sector would be highly advantageous.
Duties & Responsibilities:
* Design, build, and maintain scalable data pipelines for batch and real-time processing, handling large volumes of structured and unstructured data.
* Develop, enhance, and optimise ELT processes to ingest, transform, enrich and publish data.
* Build high-quality, trusted, secure data products for consumption in reporting, analytics and data science.
* Optimise data pipelines and queries for better performance and cost-efficiency.
* Integrate data pipelines with monitoring and observability tooling to proactively detect and resolve issues before they impact business operations.
* Design and build data models for lakehouse storage and analytics.
* Implement and maintain CI/CD pipelines for data engineering.
* Collaborate with business teams, product owners, analysts and data scientists to understand their needs and ensure availability of relevant data.
* Capture and maintain data documentation enabling self-service communities to understand, assess and utilise reusable data assets.
* Research and evaluate cutting-edge data technologies, tools, and practices to improve data engineering processes.
Experience Required:
* Developing data processing pipelines in Python and SQL for Databricks, including many of the following technologies: Spark, Delta Lake, Delta Live Tables, PyTest, Great Expectations (or similar) and Jobs.
* Developing data pipelines for batch and stream processing and analytics.
* Building data pipelines for enterprise data and at least one other business domain.
* Building and orchestrating data pipelines and analytical processing for streaming data with technologies such as Kafka, AWS Kinesis or Azure Stream Analytics.
* Knowledge of data governance, privacy regulations (e.g. GDPR), and security best practices.
* Delivering data engineering solutions and designs for cloud-native data platforms (AWS/Azure/Databricks).
* Building DevOps pipelines for data engineering solutions using tools such as Terraform and GitHub.
* Working within highly agile, multidisciplinary teams using Scrum or Kanban.
* Mentoring data engineers and carrying out PR reviews.