Here is a step-by-step guide to creating a Java Maven project that uses Apache Kafka:
Note: Make sure you have Java and Maven installed on your machine.
1. Create a Maven project: Open your terminal or command prompt and navigate to the directory where you want to create the project. Run the command:

mvn archetype:generate -DgroupId=com.example -DartifactId=kafka-project -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false

This will create a Maven project named kafka-project in a directory of the same name, with com.example as its group ID and base package.
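For orientation, the generated project should look roughly like this (the exact files depend on the archetype version; App.java and AppTest.java are placeholder classes you can keep or delete):

kafka-project/
    pom.xml
    src/main/java/com/example/App.java
    src/test/java/com/example/AppTest.java

Change into the project directory with cd kafka-project before continuing.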
2. Add the Kafka dependency to the pom.xml file: Open the pom.xml file in the project directory and add the following dependency:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.8.0</version>
</dependency>
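The dependency goes inside the pom's <dependencies> section. As a rough sketch (the generated pom will already contain entries from the archetype, such as a junit test dependency, which can stay as they are):

<dependencies>
    <!-- Kafka client library -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>2.8.0</version>
    </dependency>
    <!-- ...dependencies generated by the archetype remain here... -->
</dependencies>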
3. Create a Kafka producer: In your project, create a new Java class for the producer. For example, you can name it KafkaProducerExample.java. Add the following code to the class:
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaProducerExample {
    public static void main(String[] args) {
        // Producer configuration
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<>(props);
        // Send 100 messages, using the loop index as both key and value
        for (int i = 0; i < 100; i++) {
            producer.send(new ProducerRecord<>("topic-name", Integer.toString(i), Integer.toString(i)));
        }
        producer.close();
    }
}
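Before running the producer, the topic it writes to has to exist (unless your broker is configured to auto-create topics). One way to create it, assuming a local Kafka installation with its bin directory on your PATH:

kafka-topics.sh --create --topic topic-name --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

On Windows, use the equivalent kafka-topics.bat script from the bin\windows directory.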
4. Create a Kafka consumer: In your project, create a new Java class for the consumer. For example, you can name it KafkaConsumerExample.java. Add the following code to the class:
import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaConsumerExample {
    public static void main(String[] args) {
        // Consumer configuration
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "test");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList("topic-name"));
        // Poll for new records and print them as they arrive
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println("offset = " + record.offset() + ", key = " + record.key() + ", value = " + record.value());
            }
        }
    }
}
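One thing to note about the consumer above: the while (true) loop never exits, so consumer.close() is never called. A common pattern, shown here as an optional refactoring of the loop rather than part of the original example, is to call consumer.wakeup() from a shutdown hook and close the consumer in a finally block:

// Optional: graceful shutdown sketch for the polling loop above
final Thread mainThread = Thread.currentThread();
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    consumer.wakeup();            // makes poll() throw WakeupException
    try {
        mainThread.join();        // wait for the loop below to clean up
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}));

try {
    while (true) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
        for (ConsumerRecord<String, String> record : records) {
            System.out.println("offset = " + record.offset() + ", key = " + record.key() + ", value = " + record.value());
        }
    }
} catch (org.apache.kafka.common.errors.WakeupException e) {
    // expected when wakeup() is called during shutdown
} finally {
    consumer.close();
}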
Note: In the examples above, you will need to replace localhost:9092 with the actual address of your Kafka broker.
You can now run the producer and consumer classes to see the messages being produced and consumed in your Kafka cluster.
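One way to run them from the command line uses the exec-maven-plugin, which Maven can normally resolve by its exec: prefix. The commands below assume the classes are in the default package as written above; if you added a package declaration such as com.example, use the fully qualified class names instead:

mvn compile exec:java -Dexec.mainClass=KafkaConsumerExample
mvn compile exec:java -Dexec.mainClass=KafkaProducerExample

Start the consumer in one terminal first, then run the producer in a second terminal; the consumer should print the offsets, keys, and values of the 100 messages sent by the producer.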