Data Platform Architect
London | Salary up to £120k (DOE)

Job Overview

What if each digital employee experience (DEX) was better than the last? Our client's platform helps IT teams improve end-user experience, tighten security, reduce costs, and evolve operations from a cost centre into a strategic enabler. Over one-third of the Fortune 100 rely on their single-agent solution, which delivers real-time automation and remediation for greater visibility, control, compliance, and observability.

At this organisation, you'll be part of a dynamic, collaborative team committed to reimagining how technology serves people, enhancing digital workplace experiences, and shaping the future of work.

As a Data Platform Architect, you will play a pivotal role in shaping and executing the data strategy as the company transitions to a SaaS-only model by the end of 2025. You will collaborate closely with departments such as Engineering, Product Development, and Operations to design and implement cutting-edge data solutions that optimise cost, performance, and scalability. Ultimately, you will help unify data access, improve governance, and support AI initiatives that drive customer satisfaction and significant cost reductions per device.

How you'll achieve success in this position:
- Lead the hands-on development and execution of a comprehensive data strategy focused on AI, scalability, availability, and cost efficiency.
- Migrate all data workloads from SQL Server to a data lake, using Databricks for data ingestion, processing, and serving.
- Champion and implement best practices for Databricks across the organisation to ensure consistent, well-aligned adoption.
- Develop innovative solutions to create a fully elastic data layer that reduces platform costs and accelerates feature delivery.
- Restructure data entities to support AI, reporting, and platform insights, ensuring 100% of data is accessible through Databricks.
- Design and enforce a robust data governance framework for regulatory compliance and high data quality, ensuring the platform is FedRAMP-ready.

What you'll need to succeed:
- Extensive experience designing and implementing large-scale enterprise data solutions.
- Expertise in Databricks, including the medallion architecture, Delta Sharing, Unity Catalog, and SQL Warehouse.
- A proven track record of developing data strategies for scalable platforms.
- Strong experience with cloud technologies such as data lakes and Kafka, including ingestion technologies like Structured Streaming.
- Proficiency in Python and SQL for building and optimising ETL processes and data pipelines.
- An in-depth understanding of data privacy, regulatory compliance, and governance frameworks.
- Excellent collaboration and mentorship skills to guide teams on Databricks implementation best practices.

Required Qualifications & Certification:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field.