AWS Data Engineer - Insight & Data Services - Permanent

£550 - £650 Outside IR35

Base Location: Closest office to your home location / Hybrid working / Part Remote / UK-wide

The Client:
Our client's Insights and Data practice is the leading Data Science and AI Engineering provider in the United Kingdom, with over 450 consultants serving the UK market. They are the true market leader.

The Role:
As an AWS Data Engineer within the Insights & Data Emerging Tech Team, you will have a unique chance to make a real difference in your career and to make a difference that affects people's lives and transforms the way companies and governments operate. Do you want to amaze people, to take them on a journey and show them something truly fantastic? Do you want to be at the forefront of the AI revolution?

The Focus of the Role:
We are looking for strong AWS Data Engineers who are passionate about Cloud technology. Your work will be to:
- Design and build data engineering solutions, and support the planning and implementation of data platform services including sizing, configuration and needs assessment
- Build relationships with client stakeholders to establish a high level of rapport and confidence
- Work with clients, local teams and offshore resources to deliver modern data products
- Work effectively on client sites, in Capgemini offices and from home
- Use the AWS Data-focused Reference Architecture
- Design and build data service APIs
- Analyse current business practices, processes and procedures, and identify future opportunities for leveraging AWS services
- Implement effective metrics and monitoring processes

Essential Skills & Experience needed:
- A deep, hands-on design and engineering background in AWS across a wide range of AWS services, with the ability to demonstrate working on large engagements
- Experience with AWS tools (e.g. Athena, Redshift, Glue, EMR)
- Java, Scala, Python, Spark, SQL
- Experience developing enterprise-grade ETL/ELT data pipelines
- Deep understanding of data manipulation/wrangling techniques
- Demonstrable knowledge of applying Data Engineering best practices (coding practices for DS, unit testing, version control, code review)
- Big Data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc or GCP Cloud Data Fusion
- NoSQL databases: DynamoDB, Neo4j, Elastic, Google Cloud Datastore
- Snowflake Data Warehouse/Platform
- Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming
- Experience working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc.
- Experience building and deploying solutions to the Cloud (AWS, Google Cloud), including Cloud provisioning tools
- Hands-on experience with Infrastructure-as-Code technologies: Terraform, Ansible
- Capable of working in either an Agile or Waterfall development environment, both as part of a team and individually
- End-to-end solution design skills: prototyping, usability testing
- Experience with modern SQL and NoSQL data stores
- Strong interpersonal skills, with the ability to work with clients to establish requirements in non-technical language
- Ability to translate business requirements into plausible technical solutions for articulation to other development staff
- Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes
- Influencing and supporting project delivery through involvement in project/sprint planning and QA

Nice to have - 'Desirable':
- Knowledge of other cloud platforms
- Knowledge of Google Data Products tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.)
- Relevant certifications
- Python
- Snowflake
- Databricks

83DATA is a boutique consultancy specialising in Data Engineering and Architecture | Data Science (ML, AI, DL) | Data Visualisation | RPA within the UK. We provide high-quality interim and permanent senior IT professionals.