
Kafka Producer CLI Tutorial

Last Updated : 09 Feb, 2023

Apache Kafka is a distributed streaming platform used to build real-time data pipelines and streaming applications. A client application that writes data to a Kafka cluster is called a "producer". A producer sends data to Kafka as messages organized into topics: the producer assigns each message to a topic and appends it to the Kafka cluster, while consumers subscribe to the topics they are interested in. The cluster retains the messages for a configurable amount of time, during which consumers can read and process them. Kafka producers can be written in any language that has a Kafka client library; the library's APIs let the producer connect to a Kafka cluster, transmit messages to it, and receive responses from the cluster.

How to Produce a Message into a Kafka Topic using the CLI?

To produce messages to a Kafka topic using the Kafka console producer, you will need to provide the following mandatory parameters:

  • The Kafka hostname and port: use the --bootstrap-server option to specify the hostname and port of your Kafka cluster (e.g., localhost:9092). If you are using an older version of Kafka that does not support the --bootstrap-server option, use the --broker-list option instead.
  • The topic name: use the --topic option to specify the name of the topic that you want to produce messages to.

For example, the following command will start the Kafka console producer and send messages to the my-topic topic on a Kafka cluster running on localhost:9092:

kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic

Once the producer is running, you should see a > prompt, which indicates that the producer is ready to receive input. Type a message and press Enter to send it: each line of text is treated as one message and sent to the Kafka cluster. The messages are appended to the my-topic topic and become available for consumption by Kafka consumers.

Example:

>This is my first message.

Output:

[Screenshot: Kafka Producer CLI]

In the above example, messages were produced to the "my-topic" topic using the kafka-console-producer.sh command.

Gotchas

  • The kafka-console-producer.sh command is a command-line tool included with Apache Kafka that allows you to produce messages to a Kafka topic. By default, messages are sent with a null key.

There is no simple key flag; to attach a key to each message, you enable key parsing with the --property option, like this:

kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic --property parse.key=true --property key.separator=:

(Producing keyed messages is covered in more detail later in this article.)

  • In addition to keys, you can specify other options when producing messages, such as the compression codec to use (with the --compression-codec option) and, in older Kafka versions, the serializers for the key and value (with the now-deprecated --key-serializer and --value-serializer options).
  • Another thing to keep in mind when using the kafka-console-producer.sh command is that, if the topic you specify does not exist, Kafka will create it automatically (as long as the broker setting auto.create.topics.enable is true, which is the default). The new topic gets the broker's default number of partitions and replication factor, controlled by the num.partitions and default.replication.factor settings, both 1 out of the box.

To choose the number of partitions and the replication factor yourself, create the topic explicitly with the kafka-topics.sh command, like this:

kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 3 --partitions 6 --topic my-topic

Extra important options you can set (advanced)

These are some additional options that you can use with the Kafka console producer to customize its behavior. Here is a brief explanation of each option:

  1. --compression-codec: This option enables compression for the messages that you produce. By default the console producer sends messages uncompressed; if you pass --compression-codec without a value it defaults to gzip, and you can also specify 'none', 'gzip', 'snappy', 'lz4', or 'zstd'.
  2. --producer-property: This option allows you to pass in any producer property as a key=value pair. For example, you can use it to set the acks property, which controls how many replicas must acknowledge receipt of a message before the producer considers it successfully sent.
  3. --request-required-acks: This option is an alternative to --producer-property for setting the acks property. It lets you specify how many replicas must acknowledge receipt of a message before the producer considers it successfully sent.

For example, the following command will start the Kafka console producer with the acks property set to all, which means that all in-sync replicas must acknowledge receipt of a message before it is considered successfully sent:

kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic --producer-property acks=all

How to produce messages from a file with the Kafka Console Producer CLI?

Example:

The input file should contain one message per line.

example.txt
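The contents of example.txt were shown as a screenshot in the original article; a hypothetical file of this shape would work (any plain text does, with one message per line):

```text
This is my first message.
This is my second message.
This is my third message.
```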

 

Produce the messages to the topic by redirecting the file into the command's standard input:

kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic < example.txt

[Screenshot: producing messages from a file]

Output:

[Screenshot: a Kafka consumer receiving the messages produced from the file]

The above example demonstrates the production of messages from a text file. To begin, create a text file with each message on its own line. The kafka-console-producer.sh command then reads the file from standard input and sends every line as a message to the "my-topic" topic.
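As a rough sketch of this line-per-message behavior, the following Python function (a simulation only: it collects the messages in a list instead of contacting a broker) mirrors what the console producer does with redirected input:

```python
from typing import List

def produce_from_file(path: str) -> List[str]:
    """Treat every line of a file as one Kafka message, mimicking
    `kafka-console-producer.sh ... < file`. Returns the messages
    instead of sending them, so no broker is needed."""
    messages = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            # One line of input becomes one message; the trailing
            # newline is not part of the message payload.
            messages.append(line.rstrip("\n"))
    return messages
```

In the real tool, each returned string would instead be handed to the Kafka producer as a separate record.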

How to produce messages with the key in the Kafka Console Producer CLI

By default, messages sent to a Kafka topic will have null keys. To send messages with keys, you can use the --property option of the kafka-console-producer.sh command.

The parse.key and key.separator properties specify how to extract the key for each message: set parse.key to true, and set key.separator to the character that separates the key from the value in each input line.

Example:

If you want to send messages to the topic "my-topic", with keys and values separated by a colon, you can use the following command:

kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic --property "parse.key=true" --property "key.separator=:"

Output:

[Screenshot: producing messages with a key and a value]

It is worth noting that the producer parses input line by line: the first occurrence of key.separator in a line is used as the boundary between key and value, and the separator itself is removed from the input.
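This split-at-the-first-separator rule can be sketched in Python (an illustration of the parsing behavior, not Kafka's actual implementation; the real tool's error handling differs):

```python
from typing import Tuple

def parse_keyed_line(line: str, separator: str = ":") -> Tuple[str, str]:
    """Split one console-producer input line into (key, value) at the
    FIRST occurrence of the separator; the separator itself is dropped."""
    key, sep, value = line.partition(separator)
    if not sep:
        # With parse.key=true, the real tool rejects lines that
        # contain no separator; we mimic that with an exception.
        raise ValueError(f"no key separator {separator!r} in line")
    return key, value
```

For example, parse_keyed_line("user1:hello:world") returns ("user1", "hello:world"): everything after the first colon belongs to the value.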

Also be aware that changing the key separator does not affect messages that were already produced and stored in Kafka.

Conclusion

In conclusion, producers are client applications that write data to a Kafka cluster by sending messages to specific topics; consumers subscribe to those topics and read the messages after the cluster has stored them. To produce messages with the Kafka console producer, you provide the hostname and port of your Kafka cluster and the name of the target topic using the --bootstrap-server and --topic options, respectively. You can also configure message keys, the compression codec, and other producer properties.


