Job Description:
The ideal candidate will have at least 5 years of hands-on experience building ETL/ELT pipelines using Snowflake, DBT, Python, and AWS, along with proven experience integrating data from a variety of sources. The candidate must also hold a valid work visa and the right to work in the UK.
* Proficiency in Snowflake data warehouse architecture; design, build, and optimize ETL/ELT pipelines in Snowflake - Mandatory skill.
* Experience with DBT (Core/Cloud) for data transformation and modelling; implement data transformation workflows using DBT - Mandatory skill.
* Strong Python programming skills; write automation scripts and optimize data processing tasks.
* Proficiency in SQL performance tuning and query optimization techniques in Snowflake.
* Troubleshoot and optimize DBT models and Snowflake performance.
* Knowledge of CI/CD practices and version control (Git); experience with orchestration tools such as Airflow.
* Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment.
* Ensure data quality, reliability, and consistency across different environments.
* Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions.
* Certification in AWS, Snowflake, or DBT is a plus.
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Information Technology
Industries
IT Services and IT Consulting