Our client, a leading technology company, is seeking Senior/Expert Data Engineers for a contract position. The role involves migrating an existing Data Lake, currently accessed via SAS, to Google Cloud. The project has a strict deadline of June 30th; specialists with a background in data engineering and data centres are encouraged to apply.
Key Responsibilities:
1. Assist with the migration of the existing Data Lake to Google Cloud
2. Utilise SAS, SQL, dbt, git, Airflow (Python), and the ODE Generic Export Framework for data processing and programming
3. Employ additional tools and software such as KNIME, VS Code, DIL-Pipelines, InnovatorX, and Iceberg/Biglake Tables
4. Collaborate with the team to meet the project deadline
5. Maintain documentation of the migration process
6. Apply relevant industry knowledge, particularly in fibre optic marketing, where applicable
Job Requirements:
1. Experience in data engineering and data centre technologies
2. Proficiency in SAS, SQL, dbt, git, and Airflow (Python)
3. Familiarity with the ODE Generic Export Framework
4. Knowledge of tools such as KNIME, VS Code, DIL-Pipelines, InnovatorX, and Iceberg/Biglake Tables
5. Understanding of the Google Cloud Platform
6. Strong problem-solving skills and attention to detail
7. Ability to work under strict deadlines and in a collaborative team environment
8. Optional but preferred: Understanding of fibre optic (Glasfaser) marketing
If you are a data engineering specialist looking for a challenging contract role, we would love to hear from you. Apply now to join our client's project and contribute to a successful migration to Google Cloud.