Lumanity is dedicated to improving patient health by accelerating and optimizing access to medical advances. We partner with life sciences companies around the world to generate evidence that demonstrates the value of their products, translate science and data into compelling product narratives, and enable commercial decisions that position those products for success in the market. We do this through strategic and complementary areas of focus: Asset Optimization and Commercialization; Value, Access, and Outcomes; Medical Strategy and Communications; and Real-World Evidence.

The Senior Data Engineer will be responsible for developing and maintaining data integrations across multiple SaaS systems, ensuring seamless data flow and accuracy. This role supports the enterprise reporting team by providing reliable data solutions and maintaining data quality. You will collaborate closely with cross-functional teams, including the Enterprise Applications Group, IT, business analysts, and system administrators, to understand business requirements, establish data standards, design data models, manage change processes, and deploy robust solutions. These solutions must promote efficient data management, uphold stringent data quality and security standards, and ensure compliance with regulatory requirements. Your contributions will be pivotal in facilitating data-driven decision-making, streamlining operations, enhancing data integration, and driving efficiencies in our service delivery to life sciences customers.

Essential Duties/Responsibilities

- Data Pipeline Development & Optimization: Design, build, and maintain scalable and robust data pipelines to integrate data from ERP and other enterprise systems.
- ERP System Integration: Collaborate with ERP specialists to ensure seamless integration of data between ERP systems (Oracle NetSuite and Kantata) and other enterprise systems.
- Enable BI Reporting: Work alongside our reporting team to ensure business goals and needs are met through appropriate data provision.
- ETL Processes: Develop and maintain ETL processes to ensure efficient data extraction, transformation, and loading into data environments.
- Data Modelling & Architecture: Design and implement data models that meet business requirements, focusing on efficiency, reliability, and scalability.
- Data Governance & Quality: Establish and enforce data quality standards, monitoring the accuracy and integrity of data across systems. Represent data engineering on the change advisory board. Ensure compliance with regulatory requirements specific to the life sciences industry.
- Automation & Optimization: Automate routine data processing tasks to improve efficiency and accuracy across systems, identifying areas for optimization and performance improvement.
- Collaboration: Work closely with business analysts and key stakeholders to understand data requirements and deliver solutions that support analytics, reporting, and business operations.
- Documentation & Best Practices: Create detailed documentation of processes, data flows, and system integrations. Promote best practices in data engineering and system integration across the organization.
- Support & Troubleshooting: Provide technical support for data-related issues within dataflows and enterprise systems, ensuring minimal downtime and continuity of operations.
- Security & Compliance: Ensure data security best practices are followed and that all processes comply with applicable regulations, such as GDPR, HIPAA, and other life sciences-specific requirements.

Qualifications

- Education: Bachelor's degree in computer science, data science, engineering, or a related field; an equivalent college qualification; or 5 years of equivalent work experience.
- In-role Experience: 5 years of relevant experience in data engineering and data warehousing.
Professional Expertise

- Designing and implementing data models for enterprise data initiatives.
- Leading projects involving data warehousing, data modelling, and data analysis.
- Proficiency in programming languages such as Java, Python, and C/C++, and in tools such as SQL, R, SAS, or Excel.
- Proficiency in the design and implementation of modern data architectures and concepts, leveraging Azure cloud services, real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks).
- Proficiency with relational database technologies and SQL programming, including writing complex views, stored procedures, and database triggers.
- Understanding of entity-relationship modelling, metadata systems, and data quality tools and techniques.
- Experience with business intelligence tools and technologies such as Azure Data Factory, Power BI, and Tableau.
- Learning and adopting new technology, especially in the ML/AI realm.
- Collaborating and excelling in complex, cross-functional teams involving data scientists, business analysts, and other stakeholders.

Benefits

- Competitive salary
- Bonus
- Pension
- Health Insurance
- Dental Insurance
- Life Insurance
- Competitive Holiday / Annual Leave
- Electric Car Scheme