Job location: Farringdon, UK
Job type: Full-time
Job details
Data Engineer – Python, Linux, Apache Airflow, AWS or GCP
I’m working with a small but outstanding Data Analytics consultancy that is looking to recruit a Data Engineer with at least two years’ experience to work on a long-term client project. They work with a very impressive client list, delivering bespoke data projects that drive decision-making.
Key responsibilities:
1. Develop data pipelines using Python, SQL and cloud platforms (GCP ideally)
2. Integrate data from databases, data lakes, and other sources
3. Implement efficient ETL/ELT processes for high-quality, reliable data
4. Optimize pipeline performance and scalability
5. Collaborate with data teams to deliver impactful data solutions
Required skills:
1. 2 years’ experience in a Data Engineering role
2. 2 years’ work experience with Python
3. 2 years’ work experience with Linux systems, including their administration
4. 2 years’ work experience with various databases (BigQuery, PostgreSQL, MSSQL, etc.)
5. Experience with Cloud Platforms, ideally GCP
6. Understanding of data modeling, ETL, and data quality best practices
7. Strong problem-solving and analytical skills
8. Excellent communication and collaboration abilities
This is a great role in a small data consultancy that punches above its weight, working with many blue-chip companies and offering end-to-end data services.
Salary: Circa £50k
Duration: Permanent
Location: Hybrid – 4 days from home, 1 day (Wednesday) in the Central London office
APPLY NOW