As a Senior Data Engineer, you will work as part of a multidisciplinary team to build solutions that make data accessible, enabling an evidence-based approach to optimisation. Engaging with our clients, you will design and implement data solutions, ensuring they integrate with internal systems and business processes.
* Your role in a nutshell: act as a subject-matter expert (SME) within the Hippo squad, leading the design and implementation of data solutions that meet business requirements.
* Help architects realise data system designs such as Data Meshes, Data Warehouses and event-based systems
* Implement data flows that connect operational systems with analytics and business intelligence (BI) systems
* Re-engineer, develop and optimise code so that processes run efficiently
* Build great relationships with your team and stakeholders
* Work with the Hippo community to share best practice and maintain high standards
* Get involved in the recruitment and onboarding of other engineers, and support other consultants in their professional development
* Promote Hippo’s Engineering Herd internally and externally (for example through blog posts, workshops, seminars or conference talks)
About the Candidate:
* Proven track record in Snowflake, Python, dbt, AWS and Terraform
* Proficiency in R, Bash, Java/.NET and/or PowerShell desirable
* Broad knowledge across cloud architectures, networking and distributed computing systems
* Experience of data system designs such as Data Lakes, Data Meshes and Data Warehouses
* Experience with a wide range of data sources: SQL, NoSQL and graph
* A proven track record of infrastructure delivery on any data platform (Snowflake, Elastic, Redshift, Databricks, Splunk, etc.)
* Strong, demonstrable experience writing regular expressions, parsing JSON and similar data-wrangling tasks
* Strong experience in log processing (e.g. Cribl, Splunk, Elastic, Apache NiFi)
* Expertise in producing dashboards and delivering insights
* Demonstrable security awareness (an understanding of basic security best practices such as OAuth, MFA and TLS)
* Experience processing large datasets
* A firm understanding of data modelling and normalisation concepts
* Good estimation skills for times, latencies and costs
* Experience working within multidisciplinary teams desirable
* Strong teamwork and presentation skills, and experience of mentor-led working practices