DATA ENGINEER (Spark, Kafka)
Start: ASAP
Duration: 6 months initially
Location: Hybrid, one day per week in Windsor
Rate: inside IR35, paying up to £500 per day.
Responsibilities:
- Design, implement, and manage Kafka-based data pipelines and messaging solutions
- Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability
- Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow
- Develop and maintain Kafka connectors (e.g. JDBC, MongoDB, and S3), along with topics and schemas, to streamline data ingestion from source databases
Skills required:
- Data Integration, Data Security and Compliance
- Monitoring and managing the performance of data systems and troubleshooting issues
- Strong knowledge of data engineering tools and technologies (e.g. SQL, ETL, data warehousing)
- Experience with tools such as Azure ADF, Apache Kafka, and Apache Spark SQL
- Proficiency in programming languages such as Python and PySpark
...