Job Description
Role: Senior Cloud & Data Architect
Skills: DWBI, Big Data, GCP/AWS/Azure platforms, ETL, and Data Modelling
Location: London UK
Type: Permanent
We at Coforge are hiring a Senior Cloud & Data Architect with expertise in DWBI, Big Data, GCP/AWS/Azure platforms, ETL, and Data Modelling.
* 16-18+ years of total experience in DWBI, Big Data, and Cloud technologies.
* Hands-on implementation experience in at least two of the following cloud technologies: Azure, AWS, GCP, Snowflake, Databricks.
* Must have hands-on experience with at least two hyperscalers (GCP/AWS/Azure), specifically their Big Data processing services (Apache Spark, Apache Beam, or equivalent).
* Excellent consulting experience, with the ability to design and build solutions and actively contribute to RFP responses.
* Ability to act as the single point of contact (SPOC) for all technical discussions across industry groups.
* Excellent design experience, with an entrepreneurial mindset to own and lead solutions for clients.
* Excellent ETL and data modelling skills.
* Strong knowledge of database systems and data mining.
* Knowledge of systems development, including the system development life cycle, project management approaches, and requirements, design, and testing techniques.
* Proficiency in data modelling and design, including SQL development and database administration
* Ability to implement common data management and reporting technologies, as well as columnar and NoSQL databases, data visualization, unstructured data, and predictive analytics.
* Excellent organizational and analytical abilities.
* Outstanding problem solver.
* Good written and verbal communication skills.
* A minimum of 5 years’ experience in a similar role.
* Ability to lead and mentor architects.
* Design and implement effective database solutions and models to store and retrieve data.
* Examine and identify database structural requirements by evaluating client operations, applications, and programming.
* Ability to articulate and write points of view (POVs) on new and existing technologies.
* Ability to recommend solutions to improve new and existing database systems.
* Assess data implementation procedures to ensure they comply with internal and external regulations.
* Install and organize information systems to ensure functionality.
* Prepare accurate database design and architecture reports for management and executive teams.
* Oversee the migration of data from legacy systems to new solutions.
* Educate staff members through training and individual support.
* Offer support by responding to system problems in a timely manner.
* In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc.
* Ability to define monitoring, alerting, and deployment strategies for various services.
* Experience providing solutions for resiliency, failover, monitoring, etc.
* Good to have: working knowledge of Jenkins, Terraform, Stackdriver, or other DevOps tools.
Mandatory Skills
* At least two hyperscalers (GCP/AWS/Azure)
* GCP, AWS, Azure; Big Data; Apache Spark/Beam on BigQuery/Redshift/Synapse; Pub/Sub/Kinesis/MQ/Event Hubs; Kafka; Dataflow/Airflow/ADF
Desirable Skills
* Designing Databricks-based solutions for Azure/AWS; Jenkins, Terraform, Stackdriver, or other DevOps tools