Kafka – Part 2 – Server Setup and CLI tutorial

December 10, 2024 Code Vault Curators

In this blog article, we will set up Kafka on Windows and start producers and consumers using the CLI. We will also see how the messages get consumed.

It is recommended to go through the previous blog article on Kafka fundamentals before proceeding here:

Kafka – Part 1 – Fundamentals
Let’s straightaway start downloading the binaries.

Step 1: Make sure Java JDK 8 is installed on your local machine.

If not, please download and install Java JDK 8 on your local machine before proceeding.

Step 2: Download the Kafka binaries.

https://kafka.apache.org/downloads

The latest version as of today is 2.5.0.

Click on the Scala binary downloads.

Then click on the mirror site for download.

Step 3: Extract the downloaded binaries in your local drive.

Step 4: Verification.

Let’s do some verification that everything is set up correctly.

Open your command prompt and change the directory to the Kafka bin folder:

cd C:\kafka_2.13-2.5.0\bin

Type java -version to verify that the right Java version is installed.

Open the windows folder inside the bin folder. You will see a list of .bat files to execute.

Just try executing any .bat command to see if it works.

I just tried kafka-configs.bat. It works.

Tip: Since all the .bat files are in the windows folder, which we need to navigate to often to execute commands, add its location to the PATH environment variable.
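For example, you can append it to your user PATH with setx (a sketch assuming the extraction path used above; note that setx only affects newly opened command prompts):

setx PATH "%PATH%;C:\kafka_2.13-2.5.0\bin\windows"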

How to start Zookeeper & Kafka?

Step 1: Set up data folders for zookeeper and Kafka.

It is recommended to create dedicated data folders on your machine for Zookeeper and Kafka.

I created a new folder called data and created two new sub-folders – Kafka and Zookeeper inside the data folder.
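From the command prompt, that would be something like this (assuming the data folder sits at the drive root, as in my setup):

mkdir C:\data\zookeeper
mkdir C:\data\kafka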

Step 2: Edit the Zookeeper configuration file.

Switch to the config directory and open the zookeeper.properties file in Notepad.

Specify the newly created data directory.
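Assuming the data folder created above, the line to change in zookeeper.properties would look like this (forward slashes avoid escaping issues in properties files):

dataDir=C:/data/zookeeper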

Step 3: Start Zookeeper.

Run the .bat file, providing the zookeeper.properties file as the argument:

zookeeper-server-start.bat C:\kafka_2.13-2.5.0\config\zookeeper.properties

Make sure Zookeeper has started and bound to its port.

I got an error saying that port 2181 is already in use!!!

The reason is that my Pega personal edition was already running on my local machine, and it occupies that default port.
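If you want to confirm which process is holding a port on Windows, a quick check is (the process ID appears in the last column):

netstat -ano | findstr 2181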

For now, I am not going to use Pega, so I shut down the server and execute the same command again. This time I am able to successfully bind to port 2181 🙂

Your zookeeper is up and running now.

Step 4: Edit the server.properties file.

Switch to the config directory and open the server.properties file in Notepad.

Specify the newly created data directory.
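Assuming the data folder created earlier, the property to change in server.properties is log.dirs, for example:

log.dirs=C:/data/kafka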

Save the file.

Step 5: Start Kafka.

Open a new command prompt.

Run the .bat file, providing the server.properties file as the argument:

kafka-server-start.bat C:\kafka_2.13-2.5.0\config\server.properties

You should see the success message that Kafka has started.

We see a new cluster ID, and broker.id is 0.
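The startup log should end with a line roughly like the one below (the exact wording varies by Kafka version):

INFO [KafkaServer id=0] started (kafka.server.KafkaServer)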

You can switch to the log directory and see some file content.

Kafka Command Line Interface – CLI

We will see how we can create topics and start producers and consumers from the command line tool.

Kafka topics CLI –

Just type kafka-topics to see a list of options you can use with kafka-topics.

You can create, delete, and describe topics.

Note: Going forward, I purposely make some mistakes when executing commands so that we learn the mandatory arguments/options for each command.

Let’s start with creating a new topic.

Type kafka-topics --create

You see arguments are missing – bootstrap server or zookeeper must be specified.

Let’s specify it.

kafka-topics --create --zookeeper 127.0.0.1:2181

Now another argument is missing – topic. Yes, you should specify a topic name, right?

Try adding the mandatory options one by one on your own! Fail fast and learn fast!!

Finally, a topic is successfully created using the below command.

kafka-topics --create --zookeeper 127.0.0.1:2181 --topic myknowpega_first --partitions 5 --replication-factor 1
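If all goes well, the command confirms the creation with a line like:

Created topic myknowpega_first.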


Let’s check the list of created topics:

kafka-topics --list --zookeeper 127.0.0.1:2181

In the data directory, you can also see the 5 partitions for the newly created topic.

We will see the contents of this partition folder in the later posts.

One more option, --describe, can be used to look at the topic in a more detailed way.

kafka-topics --zookeeper 127.0.0.1:2181 --topic myknowpega_first --describe
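The output shows one row per partition with its leader broker, replicas, and in-sync replicas (ISR) – roughly like this for our single-broker setup:

Topic: myknowpega_first  PartitionCount: 5  ReplicationFactor: 1  Configs:
Topic: myknowpega_first  Partition: 0  Leader: 0  Replicas: 0  Isr: 0
Topic: myknowpega_first  Partition: 1  Leader: 0  Replicas: 0  Isr: 0
(and so on for partitions 2 to 4)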

You can now start creating a second topic on your own.
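For example (myknowpega_second and its partition count are just arbitrary practice values):

kafka-topics --create --zookeeper 127.0.0.1:2181 --topic myknowpega_second --partitions 3 --replication-factor 1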

Kafka producer CLI –

Open a new command prompt.

Type the command kafka-console-producer (or run the .bat file directly, kafka-console-producer.bat).

Again, parameters are missing.

You will see there are two mandatory parameters – bootstrap-server and topic name.

kafka-console-producer --bootstrap-server 127.0.0.1:9092 --topic myknowpega_first

There you see a caret sign (>) where you can enter input messages into Kafka.

I entered 4 new messages.
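A sample session looks like this (the message texts are just examples):

>first message
>second message
>third message
>fourth message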

You can play around with stopping your broker, sending acks, etc.

Kafka consumer CLI –

Open a new command prompt.

Type the command – kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic myknowpega_first

You see the consumer is started.

Note: by default, Kafka consumers consume only new messages. That is why you don’t see the old messages.

Now I am going to send a live message from the producer console.

There you see the same message produced by the producer console gets consumed by the consumer console.

Play with the messages.

Use --from-beginning to get all the messages from the beginning. There you go.
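The full command would be:

kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic myknowpega_first --from-beginning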

Let’s start with creating a new consumer group.

Step 1: Stop all the running console consumers.

Step 2: Start a new consumer using the group name argument:

kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic myknowpega_first --group myknowpega_app

Step 3: Start another consumer using the same group name.

Step 4: Start producing the message from the producer console.

I produced 3 messages, and those messages are distributed between the two consumers.

You can start as many consumers as you want and test how the messages are distributed.

One final command before closing this CLI tutorial – describe the consumer group:

kafka-consumer-groups --bootstrap-server 127.0.0.1:9092 --describe --group myknowpega_app

You see the first 3 partitions belong to one consumer ID and the last 2 partitions to the second consumer ID.
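The describe output has roughly these columns (the values below are illustrative, and the consumer IDs are truncated here):

GROUP           TOPIC             PARTITION  CURRENT-OFFSET  LOG-END-OFFSET  LAG  CONSUMER-ID   HOST        CLIENT-ID
myknowpega_app  myknowpega_first  0          1               1               0    consumer-...  /127.0.0.1  consumer-1
myknowpega_app  myknowpega_first  1          1               1               0    consumer-...  /127.0.0.1  consumer-1
(and so on for the remaining partitions, with the last two assigned to the second consumer)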

You can add consumers and describe the group again to see the differences.

As a summary:

– We downloaded the Kafka binaries for Windows.

– Using the CLI, we started Zookeeper followed by the Kafka broker.

– We started a producer console and produced a few messages.

– Then we started the consumer console and read the same published messages.

– Finally, we also created a consumer group and saw how the messages are received by multiple consumers.

In the next part of the series, we will see how Pega can make a connection with this Kafka broker and consume the messages.
