Our client is seeking a Systems/Senior Data Engineer who will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
This role presents an excellent opportunity for an ambitious individual to develop and grow their skills and knowledge within a thriving organisation.
What you will do:
Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
Implement data ingestion, transformation, and integration processes, ensuring data quality and security.
Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards.
Manage and monitor large-scale data flows in real-time, ensuring system performance, reliability, and data integrity.
Develop robust data models to support analytics and reporting within secure environments.
Perform troubleshooting, debugging, and performance tuning of data pipelines and the Elastic Stack.
Build dashboards and visualizations in Kibana to enable data-driven decision-making.
Ensure high availability and disaster recovery for data systems, implementing appropriate backup and replication strategies.
Document data architecture, workflows, and security protocols to ensure smooth operational handover and audit readiness.
Must haves:
UK DV Clearance or the ability to obtain it.
3+ years of experience working as a Data Engineer in secure or regulated environments.
Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization.
Strong experience with Apache NiFi for building and managing complex data flows and integration processes.
Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control.
Familiarity with data governance, data quality management, and compliance standards in secure environments.
Experience in managing large-scale, real-time data pipelines and ensuring their performance and reliability.
Strong scripting and programming skills in Python, Bash, or other relevant languages.
Working knowledge of cloud platforms (AWS, Azure, GCP) with a focus on data security and infrastructure as code.
Excellent communication skills with the ability to collaborate effectively with cross-functional teams.
Detail-oriented with a focus on ensuring data accuracy, quality, and security.
Proactive problem-solving mindset and the ability to troubleshoot complex data pipeline issues.
Nice to have:
Experience working in government, defence, or highly regulated industries with knowledge of relevant standards.
Experience with additional data processing and ETL tools like Apache Kafka, Spark, or Hadoop.
Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
Experience with monitoring and alerting tools such as Prometheus, Grafana, or ELK for data infrastructure.
Understanding of ML algorithms and their development and implementation.
Confidence developing end-to-end solutions.
Experience with infrastructure as code, e.g. Terraform, Ansible.
You must be capable of achieving full DV security clearance and will require access to caveated information.
For more information and guidance, please visit (url removed).
Meridian Business Support is a recruitment specialist acting on behalf of our client as an Employment Agency for this vacancy.