Apache Kafka Consumer


Kafka consumers are used to read data from a topic; a topic, again, is identified by its name. The consumers are smart enough to know which broker to read from and which partitions to read from, and in case of broker failures the consumers know how to recover, which is again a good property of Apache Kafka. Data is read by a consumer in order within each partition. Now please refer to the below image. If we look at a consumer consuming from Topic-A/Partition-0, it will first read message 0, then 1, then 2, then 3, all the way up to message 11. If another consumer is reading from two partitions, for example Partition-1 and Partition-2, it is going to read each of those partitions in order. It may read from both of them at the same time, but within a partition the data is always read in order, while across partitions there is no way of saying which message will be read first or second. This is why there is no ordering guarantee across partitions in Apache Kafka.

Apache Kafka Consumer
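
If you want to see this ordering behaviour in code, the sketch below uses the plain Kafka Java client (it is not part of the Spring Boot example later in this article). The broker address localhost:9092, the group id ordering-demo, and the use of StringDeserializer (explained in the next section) are assumed example values; the topic name Topic-A matches the image above. Within any single partition, the offsets printed by this loop always increase, while no order is guaranteed across partitions.

Java

// Illustrative sketch only: reading a topic and printing partition/offset
// to show that data is ordered within each partition.
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderingDemoConsumer {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed local broker and example group id
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "ordering-demo");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("Topic-A"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Offsets within one partition always arrive in increasing order;
                    // there is no ordering guarantee across partitions.
                    System.out.printf("partition=%d offset=%d value=%s%n",
                                      record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}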

 

Our Kafka consumers read messages from Kafka as bytes, so a Deserializer is needed to tell the consumer how to transform those bytes back into objects or data, and deserializers are used on both the key and the value of the message. Say we have a key and a value that are both binary fields (bytes): we can use a key deserializer of type IntegerDeserializer to transform the key bytes into an Integer and get back the number 123, and then use a StringDeserializer to transform the value bytes into a String and read the value back as the string "hello world". Please refer to the below image.

 

So, as we can see here, choosing the right Deserializer is very important, because if you do not choose the one that matches the producer's Serializer, you may not get the right data back in the end. Some common Deserializers are given below:

  • String (including JSON, if your data is in JSON format)
  • Integer and Float for numbers
  • Avro and Protobuf for advanced kinds of data
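
For instance, to read back the record from the illustration above (an integer key 123 and a string value "hello world"), the consumer's deserializers would be configured roughly like this. This is only a sketch; the broker address and group id are assumed example values.

Java

// Sketch: picking deserializers that match how the producer serialized the record
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.IntegerDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class DeserializerConfigExample {

    public static Properties consumerProperties() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed local broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");              // example group id
        // Key bytes -> Integer (e.g. 123)
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class.getName());
        // Value bytes -> String (e.g. "hello world")
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        return props;
    }
}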

Apache Kafka Consumer Example

In this example, we will discuss how to consume messages from a Kafka topic with Spring Boot. Talking briefly about Spring Boot, it is one of the most popular and most used frameworks of the Java programming language. It is a microservice-based framework, and building a production-ready application with Spring Boot takes very little time. Spring Boot makes it easy to create stand-alone, production-grade Spring-based applications that you can "just run". So let's start with the implementation.

Prerequisite: Make sure you have installed Apache Kafka on your local machine. Refer to this article: How to Install and Run Apache Kafka on Windows?

Step 1: Go to this link and create a Spring Boot project. Add the “Spring for Apache Kafka” dependency to your Spring Boot project. 
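
The "Spring for Apache Kafka" entry in the initializer pulls in the spring-kafka library (Maven coordinates org.springframework.kafka:spring-kafka); Spring Boot's dependency management picks a compatible version for you, so you normally do not need to specify one.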

Step 2: Create a Configuration file named KafkaConfig. Below is the code for the KafkaConfig.java file.

Java




// Java Program to Illustrate Kafka Configuration
 
package com.amiya.kafka.apachekafkaconsumer.config;
 
// Importing required classes
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
 
// Annotations
@EnableKafka
@Configuration
 
// Class
public class KafkaConfig {
 
    @Bean
    public ConsumerFactory<String, String> consumerFactory()
    {
 
        // Creating a Map of string-object pairs
        Map<String, Object> config = new HashMap<>();
 
        // Adding the Configuration
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,
                   "127.0.0.1:9092");
        config.put(ConsumerConfig.GROUP_ID_CONFIG,
                   "group_id");
        config.put(
            ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
            StringDeserializer.class);
        config.put(
            ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
            StringDeserializer.class);
 
        return new DefaultKafkaConsumerFactory<>(config);
    }
 
    // Creating a KafkaListener container factory. It must be exposed as a
    // bean whose name matches the containerFactory attribute used in the
    // @KafkaListener annotation of the consumer class.
    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String>
    concurrentKafkaListenerContainerFactory()
    {
        ConcurrentKafkaListenerContainerFactory<
            String, String> factory
            = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}


Step 3: Create a Consumer file named KafkaConsumer. Below is the code for the KafkaConsumer.java file.

Java




// Java Program to Illustrate Kafka Consumer
 
package com.amiya.kafka.apachekafkaconsumer.consumer;
 
// Importing required classes
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;
 
@Component
 
// Class
public class KafkaConsumer {
 
    @KafkaListener(topics = "NewTopic",
                   groupId = "group_id",
                   containerFactory = "concurrentKafkaListenerContainerFactory")
 
    // Method
    public void
    consume(String message)
    {
        // Print statement
        System.out.println("message = " + message);
    }
}
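
If you also need the key, partition, or offset of each record and not just its value, Spring for Apache Kafka allows the listener method to receive the whole ConsumerRecord instead of a String. A minimal variant of the same listener is sketched below; you would use it instead of the String-based listener above (or give it its own groupId), since two listeners in the same consumer group would share the partitions between them.

Java

// Sketch: a listener variant that receives the full ConsumerRecord
package com.amiya.kafka.apachekafkaconsumer.consumer;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class KafkaRecordConsumer {

    @KafkaListener(topics = "NewTopic",
                   groupId = "group_id",
                   containerFactory = "concurrentKafkaListenerContainerFactory")
    public void consume(ConsumerRecord<String, String> record)
    {
        // Print the partition, offset, and value of each consumed record
        System.out.println("partition = " + record.partition()
                           + ", offset = " + record.offset()
                           + ", value = " + record.value());
    }
}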


Step 4: Now, in order to consume messages from the Kafka topic with Spring Boot, we have to do the following things:

  • Run the Apache ZooKeeper server
  • Run the Apache Kafka server
  • Send messages to the Kafka topic

Run your Apache Zookeeper server by using this command

C:\kafka>.\bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties

Similarly, run your Apache Kafka server by using this command

C:\kafka>.\bin\windows\kafka-server-start.bat .\config\server.properties
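
If the topic NewTopic does not exist yet, you can create it explicitly before producing. The exact flags depend on your Kafka version; with versions that support the --bootstrap-server option, the command looks like this:

C:\kafka>.\bin\windows\kafka-topics.bat --create --topic NewTopic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1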

Run the following command to start a console producer and send messages to the topic NewTopic

C:\kafka>.\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic NewTopic

Step 5: Now run your Spring Boot application. Make sure you have changed the port number in the application.properties file so that it does not clash with anything else already running on the default port 8080.

server.port=8081

Let's run the Spring Boot application from the ApacheKafkaConsumerApplication file.

Output: In the output, you can see that when you send a message to the Kafka topic, it is displayed on the console in real time.

