As the Data Platform Engineer in the Cyber Data Platform team, you will contribute to the development and maintenance of a robust data platform that enables users to develop advanced analytics, machine learning, and GenAI solutions to strengthen our security defences. You will collaborate closely with cross-functional data teams to ensure the platform is scalable, secure, and aligned with our cybersecurity goals.

We know life looks a little different for each of us. That’s why at Tesco, we always welcome chats about flexible working. Some people are at the start of their careers, some want the freedom to do the things they love. Others are going through life-changing moments like becoming a carer, nearing retirement, adapting to parenthood, or something else. So, talk to us throughout your application about how we can support you.

Data Architecture: Support the design, build, and management of real-time, near-real-time, and batch data architectures that enable threat detection, incident response, and reporting through advanced analytics, machine learning, and GenAI capabilities.

Data Integration and Transformation: Implement and manage automated frameworks for integrating data from various security sources into the security data lake, as well as the transformation frameworks that move data between the raw, trusted, and curated data layers.

Coding and Documentation: Help raise the quality bar of the team's codebase by producing high-quality code and documentation covering the data platform architecture, processes, and best practices.

Automation and DevOps: Implement automation and DevOps practices to streamline the deployment, configuration, and management of data platform components.

Collaboration and Communication: Collaborate and communicate effectively with members of the team, actively participate in resolving issues related to data platform operations, and provide support as needed.

- Proficiency in SQL and programming languages such as Python
- Working knowledge of ETL and ELT frameworks and orchestration tools such as Airflow and dbt
- Experience working with cloud platforms, e.g. Azure
- Knowledge of data lake and data warehouse concepts and distributed systems such as Kafka and Spark
- Familiarity with Kubernetes, CI/CD, and Terraform
- Knowledge of cybersecurity principles and practices