Senior Data Engineer
- Bristol - 90% onsite
- 6 month contract
- £78.70 per hour, outside IR35
- Sole UK nationality and DV Clearance required
This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based, on-site role with the option of compressed hours.
The role will include:
- Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
- Implement data ingestion, transformation, and integration processes, ensuring data quality and security.
- Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards.
- Manage and monitor large-scale data flows in real-time, ensuring system performance, reliability, and data integrity.
- Develop robust data models to support analytics and reporting within secure environments.
- Perform troubleshooting, debugging, and performance tuning of data pipelines and the Elastic Stack.
- Build dashboards and visualizations in Kibana to enable data-driven decision-making.
- Ensure high availability and disaster recovery for data systems, implementing appropriate backup and replication strategies.
- Document data architecture, workflows, and security protocols to ensure smooth operational handover and audit readiness.
TECHNICAL SKILLS
Must Have
• UK DV Clearance or the ability to obtain it.
• 3+ years of experience working as a Data Engineer in secure or regulated environments.
• Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization.
• Strong experience with Apache NiFi for building and managing complex data flows and integration processes.
• Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control.
• Familiarity with data governance, data quality management, and compliance standards in secure environments.
• Experience in managing large-scale, real-time data pipelines and ensuring their performance and reliability.
• Strong scripting and programming skills in Python, Bash, or other relevant languages.
• Working knowledge of cloud platforms (AWS, Azure, GCP) with a focus on data security and infrastructure as code.
• Excellent communication skills with the ability to collaborate effectively with cross-functional teams.
• Detail-oriented with a focus on ensuring data accuracy, quality, and security.
• Proactive problem-solving mindset and the ability to troubleshoot complex data pipeline issues.
Nice To Have
• Experience working in government, defence, or highly regulated industries with knowledge of relevant standards.
• Experience with additional data processing and ETL tools like Apache Kafka, Spark, or Hadoop.
• Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
• Experience with monitoring and alerting tools such as Prometheus, Grafana, or ELK for data infrastructure.
• Understanding of ML algorithms, their development, and implementation.
• Confidence developing end-to-end solutions.
• Experience with infrastructure as code tools (e.g., Terraform, Ansible).