Methods Business and Digital Technology Limited
Methods is a £100M+ IT services consultancy that has partnered with a range of central government departments and agencies to transform the way the public sector operates in the UK. Established over 30 years ago and UK-based, we apply our skills in transformation, delivery, and collaboration from across the Methods Group to create end-to-end business and technical solutions that are people-centred, safe, and designed for the future.
Our human touch sets us apart from other consultancies, system integrators and software houses. With people, technology, and data at the heart of who we are, we believe in creating value and sustainability through everything we do for our clients, staff, communities, and the planet.
We support our clients in the success of their projects while working collaboratively to share skill sets and solve problems. At Methods we have fun while working hard; we are not afraid of making mistakes and learning from them.
Predominantly focused on the public sector, Methods is now building a significant private-sector client portfolio.
Methods was acquired by the Alten Group in early 2022.
Requirements
The Data Engineer will:
1. Work with other members of the delivery team across a mix of large and small projects, and be responsible for translating data into valuable insights that inform decisions for transformation projects and programmes.
2. Be responsible for identifying and using the most appropriate analytical techniques; developing fit-for-purpose, resilient, scalable and future-proof data services that meet user needs; and designing, writing and iterating code from prototype to production-ready.
3. Communicate effectively across organisational, technical and political boundaries to understand the context and how to make complex and technical information and language simple and accessible for non-technical audiences.
4. Work with others to support the growth and development of the team.
Ideal Candidates will demonstrate:
5. Experience with relational databases and data warehousing, such as SQL Server and/or Azure SQL
6. SQL development experience including debugging and troubleshooting
7. Solid relational database design skills with an eye for performance optimisation
8. Experience of programming in languages such as Python, Scala or Java
9. Experience processing large volumes of structured/semi-structured data (CSV/JSON)
10. Experience producing data models and an understanding of where to use different types of data model
11. Knowledge of data cleaning, wrangling, visualisation and reporting, with an understanding of the best, most efficient use of associated tools and applications to complete these tasks
12. Cloud-based experience, preferably with AWS and/or Azure
13. Experience with ETL/ELT
14. Tooling such as Azure Data Factory, Azure Synapse, and SQL Server Integration Services (SSIS)
15. Experience with analytical/reporting tools
16. Tools such as SQL, Analysis Services, and Power BI
17. DevOps experience, such as using Git, CI/CD and Unit Testing
18. Attention to detail and the ability to quality-assure their own and other team members’ work
19. Understanding of how to expose data from systems (e.g. through APIs), link data from multiple systems, and deliver streaming services.
20. Know how to ensure that risks associated with deployment are adequately understood and documented.
21. An ability to translate business requirements into technical specifications
22. Experience of iterative/agile development methodologies such as Scrum
23. A good understanding of Data Governance principles and the safe handling and processing of Personal Identifiable Data
Desirable Skills & Experience:
24. Streaming and real-time data
25. Knowledge of the statistical principles necessary to interpret data and apply models. For example, knowledge of errors and confidence intervals to understand whether a relationship seen in the data is real.
26. Exposure to high-performing, low-latency or large-volume data systems (1 billion+ records, terabyte-size databases)
27. Understanding of distributed computing, columnar databases, partitioning, and MapReduce
28. Big Data Frameworks experience, such as Hadoop and Apache Spark
29. Experience with NoSQL environments, Data Lakes, and Lakehouses (e.g. Cassandra, MongoDB, or Neptune)
Benefits
By joining us you can expect:
30. Autonomy to develop and grow your skills and experience
31. Exciting project work that is making a difference in society
32. Strong, inspiring and thought-provoking leadership
33. A supportive and collaborative environment
34. Development – access to LinkedIn Learning, a management development programme, and training
35. Wellness – 24/7 confidential employee assistance programme
36. Flexible Working – including home working and part time
37. Social – office parties, breakfast Tuesdays, monthly pizza Thursdays, Thirsty Thursdays, and commitment to charitable causes
38. Time Off – 25 days of annual leave plus bank holidays, with the option to buy 5 extra days each year
39. Volunteering – 2 paid days per year to volunteer in our local communities or within a charity organisation
40. Pension – Salary Exchange Scheme with 4% employer contribution and 5% employee contribution
41. Discretionary Company Bonus – based on company and individual performance
42. Life Assurance – of 4 times base salary
43. Private Medical Insurance – which is non-contributory (spouse and dependants included)
44. Worldwide Travel Insurance – which is non-contributory (spouse and dependants included)
45. Enhanced Maternity and Paternity Pay
46. Travel – season ticket loan, cycle to work scheme
47. For a full list of benefits, please visit our website.