Are you just trying to get started on your first project with Kafka? I was there about two hours ago. While this information is still in my mind, I would like to get you started with Kafka as well.
This is meant to be an absolute beginner article, partly because I don't know much more about this topic yet myself!
Let’s get started with why I wanted to explore Kafka in the first place. I have a bigger system that generates events, and I want to create notifications for those events (say, for example: the turkey is ready in the oven). I want to create a Slack bot and a Discord bot to send these notifications to me and to others using the system. I want to use Kafka because it seems like the perfect fit when “events” are the important entities (rather than database entries).
I did a lot of looking around to get started and then landed on a few good resources for building some very basic, barebones (almost boilerplate) code. I will keep this short and clear and give you the steps that got me to an “MVP” to work with.
Create a Kafka broker
First, I knew that I would need to run Kafka in a Docker container that would act as a server (a “broker” in Kafka lingo). I found that I must also run something called ZooKeeper, which Kafka uses for coordination. This link gave me the best way to get started with this Kafka + ZooKeeper setup using docker-compose.
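For reference, the compose file I ended up with looked roughly like this. This is a sketch using the common Confluent images and default ports; the image versions and the `PLAINTEXT://localhost:9092` listener are my assumptions, so adjust them for your setup:

```yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:7.3.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # advertised as localhost so clients on the host machine can connect
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      # single-broker setup, so replication factor must be 1
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

Run `docker-compose up -d` and you should have a broker listening on `localhost:9092`.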
The Kafka server is now set up! You don’t really need to do much on that side of things anymore.
Create a Producer
Kafka is a producer-consumer system: producers write messages to “topics”, and consumers read messages from those topics.
I created a producer in Node.js.
Remember to install the kafkajs package using yarn/npm as you like.
I did not need any usernames or passwords whatsoever because I have not set up authentication in the configs. I’m just racing to the finish line here, so excuse the security holes.
To send a message on a topic called “data”, simply create a producer object, connect the producer to the broker, and send the message. See the code below.
If you run this code, you should be able to send the message to the topic.
Now, you gotta listen to this “data” topic to receive these messages.
Create a Consumer
The exercise is pretty similar.
Consumers run persistently, listening for messages on the topics they subscribe to. Here is what the code for a consumer looks like:
Most of this is similar, as you may observe. There is a run function that lets us define an eachMessage callback, which lets us do something with each message that arrives on the “data” topic.
I ran the consumer and got the message I sent earlier. Kafka persists messages, so a consumer can receive messages that were produced before it started, or at least so it seems to me at the moment.
And that is your kickstart, I hope!
If you want further help to put things together, here is the link for the Gist.