We offer:
• Up to £80K base salary
• 10% bonus
• 6% pension contribution
• Private medical insurance
• 25 days annual leave
• Access to our comprehensive flexible benefits, including discounts on big brands, wellness and employee assistance programmes, gymflex, buy-and-sell annual leave, and travel and dental insurance
• Work. Life. Smarter. — our commitment to a flexible and hybrid working culture

Are you passionate about data and looking for an opportunity to work on a greenfield project that will transform the existing legacy platform of a leading communications infrastructure and media services company? If so, we have an exciting role for you.

We are Arqiva, a company that provides critical data, network, and communications services to broadcasters, utilities, and other customers in the UK and around the world. We are looking for a Data Engineering Lead to join our team and deliver a major transformation project: building a modern lakehouse solution using cutting-edge technologies such as Spark, Databricks, AWS, Kinesis, Airflow, Qlik, Great Expectations, GitHub, CodePipeline, Terraform, and Python.

As a Data Engineering Lead at Arqiva, you will:
• Work closely with the data engineering manager, business analysts, data architects, and other stakeholders to understand the business requirements and design the data lakehouse solution.
• Lead and mentor a team of data engineers and developers in building, testing, deploying, and maintaining scalable, reliable, and secure data pipelines and platforms, applying software engineering best practices such as code quality, testing, documentation, version control, continuous integration, and continuous delivery.
• Implement data ingestion, processing, storage, and delivery systems using Spark, Databricks, AWS, Kinesis, Kafka, Airflow, Glue, Qlik, Great Expectations, GitHub, CodePipeline, Terraform, and Python.
• Optimize the performance, reliability, and cost-efficiency of the data lakehouse solution.
• Troubleshoot and resolve data-related issues, ensuring data quality and integrity using Great Expectations or similar tools.
• Research and evaluate new technologies and trends in data engineering and propose innovative solutions to improve our data capabilities.

To be successful in this role, you will need:
• A bachelor's degree or higher in computer science, engineering, mathematics, statistics, or a related field.
• At least 5 years of proven experience in data engineering or a similar role.
• Strong knowledge of and experience with Spark, GitHub, CI/CD, Python, AWS, and related technologies.
• A strong big data development background, with experience building scalable, distributed, and fault-tolerant systems using microservices architecture and API strategy development.
• Excellent SQL skills and experience working with a range of relational and non-relational databases.
• Experience with Databricks, Glue, Kafka, Airflow, Great Expectations, Qlik, Terraform, or other related technologies is desirable.
• Excellent verbal and written communication skills, with the ability to present complex technical concepts clearly and concisely.
• Strong problem-solving and analytical skills.
• The ability to work independently and collaboratively in a fast-paced environment.