Develop and maintain robust, efficient, secure, and reliable data pipelines using Azure technologies, including Fabric, Azure Data Factory, Synapse Analytics, Databricks and Lakehouses. Build data pipelines that clean, transform, and present granular and aggregate data from disparate sources. Design, build, test, automate, and maintain scalable architectures and processing workflows for analytics and business intelligence (BI) systems. Plan, develop and evaluate methods and processes for gathering, extracting, transforming, and cleaning data and information. Undertake development of the data warehouse, including its overall design, technical development and documentation, together with the supporting infrastructure and ETL solutions covering multiple data sources, working alongside NHSCFA specialists and external data suppliers. Please see the full Job Description and Person Specification.