Data Engineer at MUFG Pension & Market Services
A global, digitally enabled business that empowers a brighter future by connecting millions of people with their assets – safely, securely and responsibly. Through our two businesses, MUFG Retirement Solutions and MUFG Corporate Markets, we partner with a diversified portfolio of global clients to provide robust, efficient and scalable services, purpose-built solutions and modern technology platforms that deliver world-class outcomes and experiences. As a member of MUFG, a global financial group, we help manage regulatory complexity, improve data management and connect people with their assets, through an exceptional user experience that leverages the expertise of our people combined with scalable technology, digital connectivity and data insights.
Applications will be accepted from candidates based in London.
Overview
Roles within our RS EMEA Data (Satellite) team orchestrate clients' data onto their RS EMEA data environment and respective tenancy, using the support, data and technical patterns and artefacts from the RS Data Platform (Hub).
The Data Engineer will play a critical role in designing, developing, and implementing enterprise-wide data solutions. The position focuses on building scalable and reliable data pipelines, data integrations, and models to meet both internal and external customer needs. A strong mix of technical expertise, problem-solving skills, and business acumen is essential.
Key Accountabilities and main responsibilities
Strategic Focus
* Contribute to the development of advanced data engineering capabilities.
* Apply software engineering principles and database design to create scalable data pipelines, integrations, and models.
* Transform raw data into accurate, reusable datasets with a combination of technical expertise and business insight.
* Actively participate in driving a data-driven decision-making culture across the organisation.
Operational Management
* Design, develop, and optimise data solutions leveraging Snowflake, Fivetran, and DBT Cloud.
* Build robust data pipelines to ingest, transform, and load data into Snowflake using Fivetran and other ETL/ELT tools.
* Develop modular and reusable transformation models using DBT Cloud to maintain a clean, structured data warehouse.
* Implement advanced SQL solutions for data aggregation, transformation, and analysis in Snowflake.
* Ensure high data quality by implementing data validation techniques and monitoring pipelines for failures or inefficiencies.
* Utilise best practices for data warehousing, including dimensional and relational modelling.
* Execute against the Data and Analytics roadmap, focusing on the adoption of new capabilities and optimisation of existing systems.
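As an illustration of the kind of modular DBT Cloud transformation model the responsibilities above describe, here is a minimal sketch of a staging model; all source, table and column names are hypothetical and purely illustrative:

```sql
-- models/staging/stg_member_contributions.sql
-- Hypothetical dbt model: names below are examples, not the team's actual schema.
{{ config(materialized='view') }}

select
    contribution_id,
    member_id,
    cast(contribution_date as date) as contribution_date,
    amount_gbp
from {{ source('pensions_raw', 'contributions') }}
-- Basic data-quality guard: drop rows the pipeline cannot use downstream.
where amount_gbp is not null
```

Models like this are compiled by DBT into plain SQL and run against Snowflake, which is what keeps the warehouse layer clean, structured and reusable.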
Collaboration & Stakeholder Engagement
* Collaborate with cross-functional teams, including analysts, data scientists, and business stakeholders, to define actionable data solutions.
* Partner with other engineers to improve foundational data models and accelerate insights delivery.
Governance & Risk
* Adhere to data design and governance best practices.
* Maintain compliance with organisational standards, policies, and procedures.
* Develop and maintain comprehensive documentation for data definitions, processes, and pipelines.
Required Capabilities
Technical Expertise
* Strong hands-on experience with Snowflake, including:
  * Data modelling, schema design, and performance optimisation.
  * Writing and optimising SQL for large-scale datasets.
  * Working with Snowflake-specific features such as time travel and zero-copy cloning.
* Proficiency in Fivetran for data ingestion and management:
  * Connecting various source systems to Snowflake.
  * Monitoring and troubleshooting data pipeline issues.
* Expertise in DBT Cloud for data transformation:
  * Building modular DBT models with Jinja templating.
  * Using hooks and testing frameworks.
  * Managing projects with version control and CI/CD pipelines.
* Strong knowledge of ETL/ELT processes and best practices.
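The DBT testing frameworks mentioned above are typically exercised through a schema file alongside the models; a minimal sketch follows, where the model and column names are hypothetical:

```yaml
# models/staging/schema.yml
# Hypothetical dbt schema file: model/column names are illustrative only.
version: 2
models:
  - name: stg_member_contributions
    columns:
      - name: contribution_id
        tests:
          - not_null
          - unique
      - name: member_id
        tests:
          - not_null
```

Running `dbt test` evaluates these built-in `not_null` and `unique` checks against the warehouse, which is one common way the data-validation and pipeline-monitoring expectations in this role are met.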
Other Tools and Skills
* Experience with version control systems like Git.
* Understanding of cloud-native technologies, particularly in Azure or AWS environments.
Experience & Personal Attributes
Experience
* 7-10 years of experience in data engineering or related fields.
* Minimum 3-5 years of hands-on experience working with Snowflake, Fivetran, and DBT Cloud.
* Proven experience designing and implementing data warehouses using modern cloud architectures.
* Expertise in data pipeline monitoring and troubleshooting.
* Familiarity with data governance frameworks and data cataloguing tools.
Attributes
* Excellent problem-solving skills with a proactive approach to identifying and resolving issues.
* Effective communication skills to collaborate with global teams and non-technical stakeholders.
* Quality-oriented mindset with a strong attention to detail.
* Commitment to continuous learning and staying at the forefront of technology trends.
* Strong business product knowledge in UK Pensions and/or UK Financial Services Markets.
* Strong business acumen and a passion for current and emerging technologies, with the ability to roll them out across the business to improve customer experience.
* Ability to identify and mitigate security risks and to ensure secure IT design and delivery.
* Exceptional skills in influencing and driving change across the organisation.
* Excellent verbal and written communication skills.
* Expresses ideas effectively in individual and group situations; uses and shares information resources effectively.
* Ability to adapt communication techniques to suit the needs of different individuals.
* Ability to prioritise, organise and plan work to meet demanding deadlines.
Seniority level
* Mid-Senior level
Employment type
* Full-time
Job function
* Analyst, Business Development, and Engineering
* Pension Funds