Kafka is a popular open-source distributed streaming platform which prides itself on key features such as fault tolerance and replay options. My colleague Jan van Zoggel wrote a nice "getting started" blog post about Kafka, which can be found here.
In this blog post I will show you, in a few easy steps, how to start producing and consuming Kafka messages with Apache Camel.
First of all you need to install Kafka itself (https://kafka.apache.org/quickstart). After you have started the Kafka service, you need to create a topic:
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic theBigKafkaTestTopic
When the topic is created you can start producing messages on it. Add the camel-kafka dependency to your pom.xml:
<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-kafka</artifactId>
</dependency>
The following code snippet shows how to send a message to a Kafka topic. Note that a header (KafkaConstants.KEY) is set: this is the key of the Kafka message, which Kafka uses to determine the partition the message is written to.
from("timer:trigger")
    .transform().simple("ref:myBean")
    .setHeader(KafkaConstants.KEY, simple("bean:generateUUID?method=getGuid"))
    .log("${header.kafka.KEY}")
    .to("kafka:localhost:9092?topic=theBigKafkaTestTopic");
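The route above looks up a bean named generateUUID in the Camel registry to produce the message key. A minimal sketch of such a bean could look like this (the class name is an assumption; only the registry name generateUUID and the method name getGuid are taken from the endpoint URI):

```java
import java.util.UUID;

// Sketch of the bean referenced as "bean:generateUUID?method=getGuid".
// It must be registered in the Camel registry under the name "generateUUID".
public class GenerateUUID {

    // Returns a fresh random UUID string to use as the Kafka message key.
    public String getGuid() {
        return UUID.randomUUID().toString();
    }
}
```

Because every exchange gets its own key, messages are spread across the partitions of the topic instead of all landing on one partition.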
Consuming can be done by passing a similar endpoint URI to the from component:
from("kafka:localhost:9092?topic=theBigKafkaTestTopic&groupId=testing&autoOffsetReset=earliest&consumersCount=1")
    .log("${body}")
    .end();
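If you want to check outside of Camel that the messages actually arrived on the topic, the Kafka distribution ships a console consumer you can point at the same topic (assuming the broker runs on localhost:9092, as in the routes above):

```shell
# Read all messages on the topic from the beginning and print them to stdout
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic theBigKafkaTestTopic --from-beginning
```

You should see the payloads produced by the timer route scroll by; stop the consumer with Ctrl-C.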