Description
Curve was founded with a rebellious spirit and a lofty vision: to truly simplify your finances so you can focus on what matters most in life. That's why Curve puts your finances at your fingertips, so you can make smart choices about how to spend, send, see, and save your money. We help you take control of your financial life, so you can go out and live the life you want to live.
With Curve you can spend from all your accounts, track your spending behaviour, gain insights, and rely on security that protects you from fraud, giving you, for the first time, clear insight into and control of all your money in one beautiful place.
We're developing a ground-breaking product with our customers at the core. Our user base is growing rapidly and we have exceptional metrics. We have funding from the leading names in tech investment, and a visionary leadership team who wants everyone who joins this remarkable adventure to have the autonomy to masterfully develop their expertise.
Welcome to Curve. On a mission to help you live inspired.
We're looking for a capable Senior Analytics Engineer to join our central data team. Our mission is to build a robust, scalable platform that transforms raw data into clean, modelled datasets using dbt, empowering stakeholders across the company with reliable, accessible data. Your focus will be on developing and optimising data models that enable efficient and impactful analytics and reporting. You will work closely with both data engineers and analysts, bridging the gap between raw data collection and actionable insights.
This role is ideal for individuals passionate about data transformation, data modelling, and building scalable, maintainable workflows using dbt in a cloud-based environment. If you thrive on solving complex data challenges, care deeply about data quality, and are excited about implementing best practices in analytics engineering, this is the perfect role for you.
Key Accountabilities:
1. Write production-quality ELT code with an emphasis on performance, maintainability, and scalability using dbt.
2. Transform, maintain, and model clean datasets within the data warehouse for broader consumption by business teams.
3. Apply software engineering best practices such as version control (e.g., Git) and CI/CD pipelines for analytics code deployment.
4. Design, develop, and maintain dashboards and reports that communicate key performance indicators (KPIs), particularly for Curve Credit, enabling data-driven decision-making.
5. Build and maintain strong, collaborative relationships with cross-functional teams, including product, marketing, finance, and operations.
6. Partner with data engineering to develop tools, infrastructure, and data pipelines that improve data accessibility and enable data self-service across the organisation.
7. Implement automated data quality checks and monitoring to ensure the integrity and reliability of transformed datasets.
Skills & Experience:
1. 3+ years of experience in data/analytics engineering, focusing on data transformation and modelling, particularly using dbt.
2. Expertise in ELT processes, transforming raw data into well-structured, high-quality data models in a data warehouse (e.g., BigQuery).
3. Strong proficiency with SQL for data modelling and performance optimisation.
4. Experience applying software engineering best practices such as version control (Git), CI/CD, and modular code design to analytics workflows.
5. Experience building dashboards and reports to track business-critical metrics, with a focus on driving business insights.
6. Ability to collaborate effectively with cross-functional teams and communicate technical solutions to non-technical stakeholders.
7. Familiarity with orchestration tools like Airflow or Composer and cloud infrastructure, particularly GCP (BigQuery).
8. Strong attention to detail with a focus on data accuracy, quality, and process improvement.
9. Strong understanding of data quality principles.
10. A track record of learning new technologies and tools.
11. Experience with database technologies, including best practices, performance optimisation, and fault finding.
Nice to haves:
1. Experience with real-time and streaming data pipelines.
2. Familiarity with infrastructure as code (Terraform) or container orchestration (Kubernetes).
3. Experience mentoring or supporting junior team members in analytics engineering.
4. Fintech, Finance, Payments, or Retail Banking industry experience.
Benefits:
1. 25 days of annual leave plus bank holidays.
2. Bonus days off for Learning & Development, Mental Wellbeing, Birthday, Moving House & Christmas.
3. Working abroad policy (up to 60 calendar days per year).
4. Bupa Health Insurance (YuLife).
5. Life insurance powered by AIG (5x Annual Salary).
6. Pension Scheme powered by "People's Pension" (4% Matched).
7. EAP (Mental health & wellbeing support, Life coach, Career coach).
8. 24/7 GP access (Smart Health via YuLife).
9. Annual subscriptions to Meditopia & FIIT for your mind and body (via YuLife).
10. Discounted shopping vouchers (via YuLife).
11. Enhanced parental leave.
12. Ride to work scheme & Season ticket loan.
13. Electric car scheme.
14. Six nights of Night Nanny for new parents.
15. Free Curve Metal subscription for you and your +1.