Hi, we are Dynamon, and we are utilising innovative simulation analytics to revolutionise decision making and decarbonisation within the transport industry. Our tools help protect the environment by helping the world's largest road transport operations eliminate their CO2 emissions whilst reducing their costs. Governments worldwide have legislated that, in the coming decade, all vehicles will be zero-emission, and this presents a huge challenge to the commercial transport industry. Dynamon's flagship product, ZERO, is a ground-breaking EV fleet electrification tool that solves this problem. Dynamon is growing ZERO into a global SaaS service, and it is becoming the go-to resource for EV procurement and operation. As one of the fastest-growing start-ups in the UK, we are uniquely positioned to help transition the tens of millions of highly polluting combustion-engine vehicles into clean EVs, and to ensure the UK meets its 2035 zero-emission targets. To boost distribution of our tools and bring new features to market, we are expanding our product teams. We are looking for a Data Engineer who is driven to get the best out of our data ingestion and processing tools, using the latest technology and helping revolutionise an outdated industry.

Why You'll Love This Role

Working closely with the product, design and analytics teams, you will help make our analytics products work seamlessly with customer data. Your day-to-day role will include:

- Connecting to commercial vehicle fleet telemetry via API tools (telematics) and sourcing the wealth of granular vehicle operational data that drives our models.
- Performing translation and load operations to ingest data into our modern lakehouse data stack.
- Creating autonomous solutions to everyday problems that enhance the SaaS data-handling capability of our software.
- Using cloud IaaS and distributed compute to enhance the performance of ETL and data modelling tasks.
- Shipping new code regularly to match our development cycles within an Agile/Scrum project management framework, adding value to our customers' product experience every week.
- Improving data quality by understanding the typical behaviours of commercial vehicles and cleansing anomalies or discrepancies within the raw telematics data, to provide a superior product experience.
- Researching and prototyping new methods, technologies and software products to take our solution to the next level.

Just let us know what additional skills you bring or are developing, and we can put them to use.

Why We'll Love You

Essential qualities/experience for a successful candidate:

- Creative and methodical problem-solving skills.
- A background in STEM or computer science, or significant experience in a relevant field.
- A strong understanding of how to manage, store and manipulate data.

Also desirable:

- Experience working with geographic/geospatial and time-series datasets.
- Experience with Python or equivalent scripting languages (Pandas, PySpark or similar), as well as containerisation (Docker) and source control (Git).
- A good understanding of common data exchange formats, e.g. XML, JSON, CSV, YAML, Parquet.
- Experience processing big data and working with databases (PostgreSQL or similar) or data lakes (Databricks or similar).
- An understanding of common API architectural styles such as REST and SOAP.
- Confidence working as part of a small team or independently.
- Experience communicating technical results to varied audiences.
Dynamon provides an environment that enables you to do your best work and showcase your unique skills and interests. Dynamon will support you in your career development journey through training and ownership of development projects. We have a supportive working environment and provide personalised benefits. There's an amazing camaraderie among the team, and we like to get together and let our hair down with regular company away days.