Description:
We are actively seeking a Data Engineer to design, build, and maintain the infrastructure that supports data storage, processing, and retrieval. You will work with large data sets and develop data pipelines that move data from source systems to data warehouses, data lakes, and other data storage and processing systems. You will also work with stakeholders to resolve data-related technical issues and support their data infrastructure needs during the development, maintenance, and sustainment of the KR data architecture and data-driven solutions.
Due to federal security clearance requirements, applicants must be United States citizens or Permanent Residents with the ability to obtain an active Secret clearance.
This is a contract-to-hire opportunity. Applicants must be willing and able to work on a W2 basis and convert to FTE at the end of the contract. For our W2 consultants, we offer a great benefits package that includes Medical, Dental, and Vision benefits, a 401k with company matching, and life insurance.
Rate: $80 - $86/hr. (W2)
Responsibilities:
1. Develop, optimize, and maintain data ingest flows using Apache Kafka, Apache NiFi, and MySQL/PostgreSQL
2. Develop components on the AWS cloud platform using services such as Redshift, SageMaker, API Gateway, QuickSight, and Athena
3. Communicate with data owners to establish and validate configuration parameters
4. Document SOPs related to streaming configuration, batch configuration, or API management, depending on role requirements
5. Document details of each data ingest activity to ensure they can be understood by the rest of the team
6. Develop and maintain best practices in data engineering and data analytics while following an Agile DevSecOps methodology
Experience Requirements:
1. Strong analytical skills, including statistical analysis, data visualization, and machine learning techniques
2. Strong understanding of programming languages such as Python, R, and Java
3. Expertise in building modern data pipelines and ETL (extract, transform, load) processes using tools such as Apache Kafka and Apache NiFi
4. Proficiency in programming languages such as Java, Scala, or Python
5. Experience or expertise in using, managing, and/or testing API Gateway tools and REST APIs
6. Experience in traditional database and data warehouse products such as Oracle, MySQL, etc.
7. Experience with modern data management technologies such as data lakes, data fabric, and data mesh
8. Experience creating DevSecOps pipelines using CI/CD/CT tools and GitLab
9. Excellent written and oral communication skills, including strong technical documentation skills
10. Strong interpersonal skills and the ability to work collaboratively in a dynamic team environment
11. Proven track record in a demanding, customer-service-oriented environment
12. Ability to communicate clearly with all levels within an organization
13. Excellent analytical, organizational, and problem-solving skills
14. Experience implementing data observability solutions using tools such as Grafana, Splunk, AWS CloudWatch, Kibana, etc.
15. Experience in container technologies such as Docker, Kubernetes, and Amazon EKS
Education Requirements:
1. Bachelor's degree in Computer Science, Engineering, or another technical discipline required, OR a minimum of 8 years of equivalent work experience
2. 8+ years of IT data/system administration experience
3. AWS Cloud certifications are a plus
Skills, experience, and other compensable factors will be considered when determining pay rate. The pay range provided in this posting reflects a W2 hourly rate; other employment options may be available that may result in pay outside of the provided range.