Job Purpose:
To develop, enhance and maintain the data pipelines and data infrastructure across NWEH’s software products. To provide expertise and guidance for all data-related work across the business.

Key accountabilities:
· The development, performance, management, and troubleshooting of the ETL processes, data pipelines and data infrastructure supporting NWEH’s software and service products
· The effective and reliable operation of NWEH’s data pipelines and ETL processes
· Appropriate adoption of new tools, technologies and practices to ensure the team stays up to date and follows industry best practice
· Supporting data work across the team and the wider business

Key responsibilities:
· Designing, developing, maintaining, and optimising data pipelines, ETL processes, and databases
· Driving continuous improvement by refining processes and products, and identifying new tools, standards, and practices
· Working with teams across the business to define solutions, requirements, and testing approaches
· Assisting with process definition, ensuring compliance with organisational processes and regulatory standards at all times
· Ensuring compliance with regulatory requirements and standards, and audit readiness
· Automating and monitoring data and data processes, ensuring data quality and integrity
· Sharing knowledge and providing guidance on databases and data
· Maintaining up-to-date, accurate and concise documentation of database configuration and processes
· Working across the team to deliver best-practice infrastructure and infrastructure deployment and management processes

Essential skills/experience:
· A good degree in a relevant subject, or equivalent professional experience in a data role
· At least 3 years’ professional experience developing data pipelines and ETLs using Microsoft products, including at least 1 year working with cloud-native technologies such as Azure Data Factory
· Demonstrable experience of delivering technical work within time and budget constraints
· Good understanding of data security best practice
· Experience of supporting ETLs or data pipelines crucial to a production system
· Experience of working in a cross-functional team to deliver technical solutions

Desirable skills:
· Experience with SQL Server, SSIS, Azure Data Factory and Azure SQL
· Experience with cloud and Infrastructure as Code, particularly in an Azure setting using Bicep
· Understanding of DevOps practices and the associated benefits
· Skill in database testing: unit, performance, stress and security
· Experience of working in an agile team
· Experience of working in a highly regulated industry and with highly sensitive data
· Exposure to large data solutions such as Snowflake, Trino, Synapse, Azure Data Lake and Databricks
· Experience of data science using R, Stata or Python
· Familiarity with Atlassian tools: JIRA, Confluence and Bitbucket
· Understanding of clinical trials, GCP and GxP

Personal attributes:
· Collaboration – teamwork, listening and communication
· Professionalism and commitment to delivery and improvement
· Curiosity and continuous learning
· Attention to detail, but solution-oriented
· Adaptability