A leading UK logistics company seeks a senior data engineer to lead the design, development, and maintenance of robust data pipelines and infrastructure within their Microsoft Azure environment. Your expertise will ensure the availability, reliability, and scalability of their data platform, empowering the organisation with timely and accurate information.
You will collaborate with the data science and analytics teams to implement data models, create data integration pipelines, and optimise data workflows. This role requires a strong background in data engineering, with a solid understanding of data management principles and techniques. You will support the development of junior data engineers, providing guidance and mentorship to help them build their skills.
* Lead support for the strategic data platform, ensuring it is robust, reliable, and cost-effective, and that it delivers high-quality data.
* Design, develop, and maintain data integration processes, including building and running scalable, efficient data pipelines that manage and transform large datasets.
* Optimise data storage and retrieval processes to meet performance and scalability requirements.
* Ensure data quality and consistency by implementing data validation and cleansing techniques.
* Collaborate with stakeholders to define requirements and integrate new data sources.
* Monitor and troubleshoot data pipeline issues, identify and resolve bottlenecks, and implement performance optimisations.
* Collaborate with the business intelligence team to define and implement data models for analysis and reporting purposes.
* Maintain best practices, governance, and security in data projects.
* Stay up-to-date with the latest trends and technologies in the field of data engineering, and make recommendations for improvement.
* Provide guidance and mentorship to junior data engineers, supporting their development and skill-building.
Key requirements include:
* Bachelor's degree in computer science, engineering, mathematics, or a related field.
* Strong programming skills in Python, Java, or Scala, with proficiency in database management systems and excellent SQL skills.
* Proven experience in designing and implementing data pipelines, ETL processes, data warehousing, and scaled data consumption patterns.
* Knowledge of data integration, data modelling concepts, and familiarity with cloud data platforms and storage technologies, ideally within Azure.
* Proficient in Azure data services, including Synapse Analytics, Azure SQL, Data Factory, Data Lake, Databricks, and Cosmos DB.
* Experience in Agile development, branch-based source control/DevOps concepts, and gathering and analysing system requirements.
* Excellent problem-solving, troubleshooting, communication, and collaboration skills.
* Attention to detail and a commitment to delivering high-quality work.
This is a fully remote, 12-month fixed-term contract, offering the opportunity to work in a complex data environment and contribute to a large-scale data and digital modernisation programme.