Senior Data Engineer
12 months
£320 p/d INSIDE IR35
Hybrid - on site every Tuesday in Basildon
My client, a well-known vehicle manufacturer, is looking for a Senior Data Engineer to join their fast-paced team on an initial 12-month contract.
Responsibilities:
1. Collaborate with GDI&A product lines and business partners to understand data requirements and opportunities.
2. Build and maintain data products in accordance with GDI&A Data Factory standards, ensuring adherence to data quality, governance, and control guidelines.
3. Develop and automate scalable cloud solutions using GCP-native tools (e.g., Dataprep, Dataproc, Data Fusion, Dataflow, Dataform, BigQuery), dbt, and Apache Airflow (see the illustrative sketch after this list).
4. Operationalize and automate data best practices: quality, auditability, timeliness, and completeness.
5. Monitor and enhance the performance and scalability of data processing systems to meet organizational needs.
6. Participate in design reviews to accelerate the business and ensure scalability.
7. Advise and direct team members and business partners on company standards and processes.
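For illustration only (this is not part of the client's specification): a minimal sketch of the kind of Airflow-orchestrated BigQuery job the role describes. The project, dataset, table, and task names are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_summary",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Run a scheduled BigQuery transformation as a single job.
    build_summary = BigQueryInsertJobOperator(
        task_id="build_daily_summary",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE "
                    "`example-project.analytics.daily_summary` AS "
                    "SELECT DATE(event_ts) AS day, COUNT(*) AS events "
                    "FROM `example-project.analytics.raw_events` "
                    "GROUP BY day"
                ),
                "useLegacySql": False,
            }
        },
        location="EU",  # hypothetical dataset location
    )
```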
Skills Required:
1. Development of custom cloud solutions and pipelines with GCP-native tools (e.g., Dataprep, Dataproc, Data Fusion, Dataflow, Dataform, BigQuery) and dbt; proficiency in SQL, Python, and PySpark.
2. Expertise in GCP and in open-source tooling such as Terraform.
3. Experience with CI/CD practices and tools such as Tekton.
4. Knowledge of workflow management platforms such as Apache Airflow and Astronomer.
5. Proficiency in using GitHub for version control and collaboration.
6. Ability to design, optimize, and maintain efficient data pipelines and data-processing architectures.
7. Familiarity with data security, governance, and compliance best practices in the cloud.
8. An understanding of current architecture standards and digital platform services strategy.
9. Strong problem-solving, communication, and collaboration skills, with the ability to work autonomously and as part of a team.
10. A meticulous approach to data accuracy and quality.
Experience Required:
1. Programming and scripting experience with SQL, Python, and PySpark.
2. Ability to work effectively across organizations, product teams, and business partners.
3. Knowledge of Agile Methodology, experience in writing user stories.
4. Demonstrated ability to lead data engineering projects, design sessions, and deliverables to successful completion.
5. GCP experience with solutions designed and implemented at production scale.
6. Knowledge of Data Warehouse concepts.
7. Experience with Data Warehouse/ETL processes.
8. Strong process discipline and thorough understanding of IT processes (ISP, Data Security).
9. Critical thinking skills to propose data solutions, test them, and make them a reality.
Experience Preferred:
1. Excellent communication, collaboration, and influence skills; ability to energize a team.
2. Hands-on experience in Python with libraries such as NumPy and Pandas (see the illustrative sketch after this list).
3. Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Data Fusion, Pub/Sub (or Kafka), Looker Studio, and Vertex AI.
4. Experience with recording, redeveloping, and optimizing data operations, data science, and analytical workflows and products.
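Again purely illustrative (the data and column names are hypothetical): a minimal Pandas/NumPy sketch of the kind of data-quality checking this role emphasises.

```python
import numpy as np
import pandas as pd

# Hypothetical extract; in practice this would come from BigQuery or a file.
orders = pd.DataFrame(
    {
        "order_id": [1, 2, 2, 4],
        "amount": [10.0, np.nan, 15.5, -3.0],
    }
)

# Completeness and validity checks of the kind the role calls for.
checks = {
    "amount_complete": orders["amount"].notna().all(),
    "amount_non_negative": (orders["amount"].dropna() >= 0).all(),
    "order_id_unique": orders["order_id"].is_unique,
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```

In a production pipeline, checks like these would typically run as a gating step that fails the load rather than just printing results.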