Required Core Skills:
• Strong proficiency in Java and Spring Boot.
• Experience with Apache Kafka and stream processing.
• Familiarity with Big Data technologies (Hadoop, Spark, etc.).
• Knowledge of NoSQL databases (e.g., Druid, Cassandra, MongoDB).
• Understanding of distributed systems and scalability.
Responsibilities:
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Design, develop, and implement Kafka-based microservices using Spring Boot.
• Build data pipelines for ingesting, processing, and analyzing large-scale data sets.
• Optimize Kafka configurations for performance and reliability.
• Work with Big Data technologies such as Hadoop, Spark, and NoSQL databases.
• Ensure data security, integrity, and compliance with industry standards.
• Troubleshoot and resolve issues related to Kafka topics, consumers, and producers.
• Monitor system performance and proactively address bottlenecks.
• Participate in code reviews and mentor junior developers.
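To give candidates a flavor of the Kafka work above, the sketch below illustrates keyed partitioning, the property that makes per-key ordering possible in Kafka topics: records with the same key always land on the same partition. This is a simplified, self-contained illustration only; Kafka's actual default partitioner uses a murmur2 hash, and the byte-wise hash here is an assumption chosen so the example runs without a broker.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Simplified sketch of Kafka-style keyed partitioning.
// Real Kafka uses murmur2 on the serialized key; this version
// uses Arrays.hashCode purely for illustration.
public class KeyedPartitioner {
    static int partitionFor(String key, int numPartitions) {
        byte[] bytes = key.getBytes(StandardCharsets.UTF_8);
        // Mask off the sign bit so the result is non-negative,
        // then map the hash into the partition range.
        return (Arrays.hashCode(bytes) & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 6;
        // Deterministic: the same key maps to the same partition
        // on every call, preserving per-key ordering.
        System.out.println(partitionFor("order-42", partitions)
                == partitionFor("order-42", partitions)); // true
    }
}
```

Because assignment depends only on the key's hash modulo the partition count, changing the number of partitions remaps keys, which is why Kafka topics are usually sized up front.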
Nice-to-Have Skills:
• Certification in Kafka or related technologies.
• Experience with cloud platforms (AWS, Azure, GCP).
• Knowledge of containerization (Docker, Kubernetes).
Minimum experience: 12 years