Role: Senior Java Kafka Developer
Location: Alpharetta, GA
Positions: 2
NEED LOCALS FOR IN-PERSON INTERVIEW
Job Overview:
We are seeking a skilled Java Developer with hands-on experience in Apache Kafka to design, develop, and maintain robust, scalable event-driven applications. The ideal candidate will have a strong foundation in Java, experience building streaming and messaging solutions with Kafka, and a passion for delivering high-quality software in a collaborative Agile environment.
Key Responsibilities:
- Design, develop, test, and maintain Java applications with Kafka-based messaging and streaming capabilities.
- Implement producer and consumer clients, stream processing, and event-driven architectures using Kafka.
- Collaborate with data engineers, backend developers, and QA to deliver reliable, scalable solutions.
- Design and build fault-tolerant and scalable data pipelines, ensuring data integrity and low latency.
- Monitor, profile, and optimize Java applications for performance and resource usage.
- Implement unit, integration, and end-to-end tests; participate in code reviews and CI/CD pipelines.
- Troubleshoot production issues, diagnose root causes, and implement durable fixes.
- Contribute to architecture and design discussions; stay current with Kafka best practices and evolving technologies.
- Develop and maintain comprehensive technical documentation.
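As a concrete flavor of the producer-side work above: reliable delivery in Kafka is largely a matter of client configuration. The sketch below (illustrative only; broker address and serializer choices are assumptions, and actually sending records requires a running cluster and the kafka-clients dependency) builds producer settings tuned for durability and idempotent retries using only standard configuration keys.

```java
import java.util.Properties;

public class ProducerConfigSketch {
    // Producer settings aimed at durability: acks=all waits for all
    // in-sync replicas, and enable.idempotence=true prevents duplicate
    // records when the client retries a send.
    static Properties producerProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers); // assumed address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all");                // wait for full ISR ack
        props.put("enable.idempotence", "true"); // dedupe broker-side on retry
        props.put("retries", Integer.toString(Integer.MAX_VALUE));
        return props;
    }

    public static void main(String[] args) {
        Properties props = producerProps("localhost:9092");
        System.out.println(props.getProperty("acks"));
    }
}
```

These same properties would be passed to a `KafkaProducer` constructor; exactly-once pipelines additionally set a `transactional.id` and use the transactional send API.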
Required Qualifications:
- Strong experience with Java (ideally Java 8+), including modern language features and best practices.
- Experience with cloud platforms (AWS, GCP, or Azure) and managed Kafka services (MSK, Confluent Cloud).
- Hands-on experience with Apache Kafka (producers, consumers, topics, partitions, offsets, consumer groups, exactly-once semantics).
- Familiarity with Kafka ecosystem components (Kafka Streams, KSQL/ksqlDB, Schema Registry, Kafka Connect, or similar).
- Understanding of distributed systems, messaging patterns, and data serialization formats (JSON, Avro, Protobuf).
- Experience with RESTful APIs and integrating with external services.
- Comfortable with SQL and relational data modeling; exposure to NoSQL is a plus.
- Experience with version control (Git) and collaborative workflows (PR reviews, Agile/Scrum).
- Strong problem-solving and debugging skills, with clear written and verbal communication.
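To illustrate the topics/partitions/ordering knowledge called out above: Kafka preserves ordering only within a partition, and a record's key determines its partition. The sketch below models that mapping; note that Kafka's real default partitioner hashes the serialized key bytes with murmur2, so the `hashCode()` used here is a simplification for illustration, not the actual algorithm.

```java
public class PartitionSketch {
    // Map a key to a partition index. Kafka's default partitioner uses
    // murmur2 over the key bytes; String.hashCode() is used here purely
    // as an illustrative stand-in.
    static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the modulo result is non-negative.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // The same key always maps to the same partition, which is what
        // gives Kafka its per-key ordering guarantee.
        int first = partitionFor("order-42", 6);
        int second = partitionFor("order-42", 6);
        System.out.println(first == second);
    }
}
```

This is also why changing a topic's partition count reshuffles key-to-partition assignments and can break ordering assumptions for existing keys.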