My client, a global IT consultancy, is currently looking for multiple Data Engineers to join their teams in London and Manchester. This is a permanent position and an excellent opportunity to advance your digital career.
AWS Data Engineer - Permanent
Salary guideline: £60,000 - £105,000 pa (DOE), plus bonus, contributory pension (up to 6%), health insurance, life assurance, etc.
Base Location: UK wide - hybrid model - onsite/remote
The Role:
Essential Skills and Experience:
* Deep, hands-on design and engineering background in AWS, across a wide range of AWS services, with demonstrable experience of working on large engagements.
* Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR).
* Java, Scala, Python, Spark, SQL.
* Experience of developing enterprise grade ETL/ELT data pipelines.
* Deep understanding of data manipulation/wrangling techniques.
* Demonstrable knowledge of applying Data Engineering best practices (coding standards, unit testing, version control, code review).
* Big Data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc, or GCP Cloud Data Fusion.
* NoSQL databases: DynamoDB, Neo4j, Elasticsearch, Google Cloud Datastore.
* Snowflake Data Warehouse/Platform.
* Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub, and Spark Streaming.
* Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible, etc.
* Experience building and deploying solutions to Cloud (AWS, Google Cloud) including Cloud provisioning tools.
* Hands-on experience with Infrastructure-as-Code technologies: Terraform, Ansible.
* Capable of working in either an Agile or Waterfall development environment, both as part of a team and individually.
* End-to-end (E2E) solution design skills: prototyping, usability testing.
* Experience with SQL and NoSQL modern data stores.
* Strong interpersonal skills with the ability to work with clients to establish requirements in non-technical language.
* Ability to translate business requirements into plausible technical solutions for articulation to other development staff.
* Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes.
* Influencing and supporting project delivery through involvement in project/sprint planning and QA.
Desirable:
* Knowledge of other cloud platforms.
* Google data products knowledge (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.).
* Relevant certifications.
* Databricks.
To apply please click the "Apply" button and follow the instructions.
For a further discussion, please contact Aaron Perdesi.