At Airtime we are all about innovation, because this is how we stay on top. Every one of us has a hunger to succeed and will settle for nothing less than excellence. Crucially, our ethos is underpinned by a culture of teamwork and shared humility, because all that we achieve, we achieve together.
The Opportunity
We are looking to recruit a Data Engineer to strengthen our existing team and contribute to the growth and development of our data services.
This is an exciting, highly skilled technical role that offers great opportunities for learning and personal development.
The ideal candidate will be passionate about data, with a technical mindset and excellent problem-solving skills. The role will suit someone who is curious, detail-oriented, and looking to rapidly develop their skills in a dynamic environment.
In this role, you will:
* Design, implement, and maintain robust, scalable data pipelines to ingest data from internal platforms into our data warehouse.
* Optimise data pipelines to enhance performance, reduce costs, and ensure data quality.
* Understand, gather, and document detailed business requirements.
* Take ownership of data projects from planning to delivery, collaborating with other departments as needed.
* Innovate and automate current processes, driving continuous improvement.
* Demonstrate a commitment to operational excellence, secure coding standards, and best practices.
Requirements
* Bachelor’s degree in Computer Science, a relevant technical field, or equivalent experience.
* Experience designing cloud data warehouse solutions, data modelling, and building ETL/ELT processes, preferably on GCP or equivalent platforms (AWS, Azure, Snowflake).
* Strong knowledge of SQL, Python, Docker, and Terraform (or similar IaC tools).
* Experience with Dataform or dbt.
* Strong knowledge of security best practices, data privacy, and GDPR compliance.
* Proven track record of optimising data pipelines and query performance.
* Experience in building CI/CD pipelines for automated deployments.
* Familiarity with GCP products, especially: BigQuery, Composer, Cloud Functions.
* Knowledge of data science fundamentals (supervised/unsupervised learning, hyperparameter optimisation, model performance evaluation).
* Experience in building AI/ML models and ML pipelines.
* Hands-on experience with data visualisation and BI tools.
* Prior experience in the Fintech sector working with open banking data.
Seniority level
* Associate
Employment type
* Full-time
Job function
* Other
Industries
* Telecommunications