Role summary: This is an exceptional opportunity to apply and develop your skills within our high-performing Data Engineering team, working on exciting and unique client projects in the nuclear industry. Our mission is to inspire the next generation of problem solvers. As a Data Engineer within our Data Engineering team, you will design, implement, and maintain our cloud infrastructure for data processing and analysis.

What you will be doing:
- Developing ETL (Extract, Transform, Load) processes to clean, enrich, and aggregate data.
- Designing and implementing data pipelines to ingest data from various sources (both cloud and on-premises) into Azure.
- Transforming raw data into usable formats using tools such as Azure Data Factory, Databricks, or Azure Synapse Analytics.
- Performing data transformations using Azure Synapse Analytics serverless SQL pools, Apache Spark pools, or Azure Databricks.
- Developing scalable and efficient data processing solutions using Synapse Spark, Scala, Python, or SQL.
- Implementing data cleansing, deduplication, and enrichment processes.
- Creating and maintaining data models (relational, non-relational, or hybrid) using Azure Synapse Analytics dedicated SQL pools or Azure Cosmos DB.
- Optimizing data storage and partitioning strategies for efficient querying and analysis.
- Ensuring data security and governance by implementing row-level security, dynamic data masking, and auditing.
- Integrating Azure Synapse Analytics with Power BI for data visualization and reporting.
- Implementing incremental data refresh and DirectQuery models in Power BI for real-time reporting.
- Developing Azure Synapse pipelines or Power Automate flows to automate data processing.
- Ensuring compliance with data privacy and regulatory requirements under the prevailing regulations and laws of the UK or the country of deployment.
- Optimizing queries, indexes, and storage to improve efficiency.
- Implementing data security and governance policies using Azure Active Directory, Azure Key Vault, and Azure Purview, working closely with Information and Security Architects to ensure solutions meet client governance requirements.
- Collaborating with data governance teams to establish and enforce data standards and best practices.
- Working closely with consultants, architects, SMEs, data analysts, data scientists, and business stakeholders to understand and capture data requirements.
- Establishing version control practices for data workflows, pipeline code, and configurations using Git and Azure DevOps.
- Promoting collaboration and code review processes throughout the Mission Optimization team.
- Maintaining comprehensive documentation for data pipelines, configurations, and deployment procedures.
- Sharing best practices, knowledge, and guidelines with team members to enhance their understanding of Data Engineering principles and practices.

Formal qualifications or training: Experience with Azure DevOps (CI/CD) or infrastructure as code (Bicep) is a strong advantage.

Make a Difference with DBD: At DBD, we know you're looking for more than just a job - you aspire to make a real impact in the nuclear industry. We offer unique opportunities for growth, empowering you to take on influential roles within client organisations. Our dedicated team collaborates with clients to positively influence projects across Defence, Decommissioning, and New Builds. Having doubled our headcount year-on-year over the past two years, we continue to welcome new, talented individuals to our team. We're committed to your success: DBD invests in your development and supports your career trajectory in any direction you want to take it. Join us to play a key role in shaping the nuclear sector's future. Make a difference today with DBD.

Benefits: Up to 20% bonus, 25 days' holiday, enhanced pension, private health insurance, private dental cover, and more.