Salary: 45,000 - 70,000 GBP per year
Requirements:
* Expertise in building, scheduling, and maintaining data pipelines.
* Proficient in PySpark, Spark SQL, Hive, Python, and Kafka.
* Strong experience in Data Collection and Integration, Scheduling, Data Storage and Management, and ETL (Extract, Transform, Load) processes.
* Knowledge of relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB).
* Good written and verbal communication skills.
* Experience engaging business stakeholders to clarify requirements.
Responsibilities:
* Work closely with the development team to assess the existing Big Data infrastructure.
* Design and code Hadoop applications to analyze large data sets.
* Create data processing frameworks.
* Extract and isolate data clusters.
* Test and debug scripts, analyzing results to troubleshoot issues.
* Create data tracking programs and documentation.
* Maintain security and data privacy.
Technologies:
* Big Data
* ETL
* Hadoop
* Hive
* Kafka
* MongoDB
* MySQL
* PostgreSQL
* Python
* PySpark
* SQL
* Security
* Spark
More:
We are a leading global consultancy actively seeking a contractor for a long-term engagement within the energy sector. This role is for a Hadoop Big Data Developer based in Windsor, working on a hybrid basis. The position offers a competitive rate of up to £400 per day (inside IR35), with an initial duration of 6 months and the possibility of extension. If you are interested and have the relevant experience, we encourage you to apply promptly, and we will reach out to discuss further.