Position: Data Engineer
Location: Cambridge / Luton, UK (Hybrid, 2-3 days onsite per week)
Duration: Long-Term B2B Contract
Job Description:
The ideal candidate will have at least 5 years of experience working with Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines.
1. Proficiency in Snowflake data warehouse architecture. Design, build, and optimize ETL/ELT pipelines using DBT (Data Build Tool) and Snowflake.
2. Experience with DBT for data transformation and modeling. Implement data transformation workflows using DBT Core or DBT Cloud.
3. Strong Python programming skills for automation and data processing. Write Python scripts to automate workflows and optimize data processing tasks.
4. Proficiency in SQL performance tuning and query optimization in Snowflake.
5. Troubleshoot and optimize DBT models and Snowflake performance.
6. Knowledge of CI/CD practices and version control with Git. Experience with orchestration tools such as Airflow.
7. Strong analytical and problem-solving skills with the ability to work independently in an agile development environment.
8. Ensure data quality, reliability, and consistency across different environments.
9. Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions.
10. Certification in AWS, Snowflake, or DBT is a plus.