DATA ENGINEER (Spark, Kafka)
Start: ASAP
Duration: initial 6-month term
Location: Hybrid, one day per week on site in Windsor
Rate: up to £510 per day (inside IR35)
Responsibilities:
1. Design, implement and manage Kafka-based data pipelines, supporting real-time data processing.
2. Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability.
3. Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow.
4. Develop and maintain Kafka Connect connectors (e.g. JDBC, MongoDB, S3), along with topics and schemas, to streamline data ingestion from databases.
5. Implement security measures to protect Kafka clusters and data streams.
Skills required:
1. Proven ability to design, build, and maintain reliable, scalable data pipelines, including data integration and data security.
2. Strong knowledge of data engineering tools and technologies (SQL, ETL, data warehousing).
3. Experience with tools such as Azure Data Factory (ADF), Apache Kafka, and Apache Spark SQL.
4. Programming proficiency in Python, including PySpark.