Run the Kafka Producer Shell. The kafka-console-producer.sh script takes input from the console and publishes it to a Kafka topic; internally it loads the kafka.tools.ConsoleProducer class with the supplied command-line arguments. A note on versions: this article is based on Kafka_2.12-2.5.0, the release in which the --bootstrap-server option was introduced and --broker-list was deprecated (the deprecated option still works, and its value keeps the same meaning).

Start a producer:

$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic sampleTopic

The Kafka producer created here connects to the cluster running on localhost and listening on port 9092. The --broker-list option defines the list of brokers the messages will be sent to.

To produce keyed messages on Windows:

kafka-console-producer.bat --broker-list localhost:9092 --topic ngdev-topic --property "key.separator=:" --property "parse.key=true"

The producer points at localhost:9092 because that is where the Kafka broker is running. The tool shows a > prompt, and whatever you type is sent as a message. By default all command-line tools print their logging messages to the console.

The Kafka producer can be made idempotent, which strengthens delivery semantics from at-least-once to exactly-once; it also supports a transactional mode that lets an application send messages to multiple partitions (and topics) atomically. Records are handed to an I/O thread, which sends them as requests to the cluster. An application generally uses the Producer API to publish streams of records to multiple topics distributed across the Kafka cluster. In this tutorial we will also develop a sample Apache Kafka Java application using Maven. In the examples below, the broker is up and ready to accept data from the producer.
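To make the key.separator/parse.key behaviour concrete, here is a minimal sketch (not Kafka's actual source) of how a console producer might split an input line into a key and a value:

```java
// Sketch: splitting a console line into (key, value) the way the console
// producer does when parse.key=true and key.separator=":".
public class KeyedLineParser {
    static String[] parseLine(String line, String separator, boolean parseKey) {
        if (!parseKey) {
            return new String[] { null, line };  // no key: the whole line is the value
        }
        int idx = line.indexOf(separator);
        if (idx < 0) {
            throw new IllegalArgumentException("No key separator found in: " + line);
        }
        return new String[] { line.substring(0, idx),
                              line.substring(idx + separator.length()) };
    }

    public static void main(String[] args) {
        String[] kv = parseLine("user42:hello kafka", ":", true);
        System.out.println(kv[0] + " -> " + kv[1]);  // user42 -> hello kafka
    }
}
```

With parse.key=false (the default), the key is null and the record is distributed round-robin; with a key, ordering per key is preserved.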
If you’re interested in playing around with Apache Kafka from .NET Core, this post also contains what you need to get started; I have installed Kafka on an HDP 2.5 cluster, and later on I’m going to create a hosted service for both the producer and the consumer.

Using the Kafka console consumer: open a new command prompt for the consumer CLI. Then run the following command to start a Kafka producer writing to sampleTopic through the console interface. We shall start with a basic example that writes messages to a Kafka topic.

A kafka-console-producer is a program that comes with the Kafka packages and acts as a source of data in Kafka: it takes input from the console interface and publishes it to the subscribed topic. The concept is similar to the approach we took with Avro; however, this time our Kafka producer will perform protobuf serialisation.

Two useful options:

--metadata-expiry-ms: the period in milliseconds after which a refresh of metadata is forced, even if we haven’t seen any leadership changes (default: 300000).
--producer-property <key=value>: sets user-defined properties, as key=value pairs, on the producer.

The kafka-avro-console-producer variant uses the Avro converter with the Schema Registry in order to properly write data with an Avro schema. Note that the kafka-console-producer.sh script (kafka.tools.ConsoleProducer) uses the new producer by default; users have to specify 'old-producer' to use the old one.

Apache Kafka – Simple Producer Example: let us also create an application for publishing and consuming messages using a Java client.
More options:

--timeout <ms>: if set, and the producer is running in asynchronous mode, this gives the maximum amount of time a message will queue awaiting sufficient batch size.
--help: in addition to the examples in this article, use this option to see a list of all available options.

The kafka-console-producer.sh utility can be found in the bin directory inside your Kafka installation, for example at ~/kafka-training/kafka/bin/kafka-console-producer.sh. The Kafka distribution provides it as a command-line utility to send messages to a topic.

Topics are made of partitions, and producers write data to them. Each new line you type is a new message; type Ctrl+D or Ctrl+C to stop:

kafka-console-producer --broker-list localhost:9092 --topic test
here is a message
here is another message
^D

Without a key, records are published to the topic's partitions in a round-robin fashion; with keys, use the parse.key/key.separator properties shown earlier. The next step is to create separate producers and consumers according to your needs, with whichever client library you choose for yourself.

Continuing along our Kafka series, we will also look at how to create a producer and consumer using confluent-kafka-dotnet. Another handy tool is kafkacat, which is built on librdkafka, a C/C++ library for Kafka, and can consume and produce messages from Kafka topics.
Kafka Console Producer and Consumer Example – in this tutorial we shall learn to create a Kafka producer and a Kafka consumer using the console interface of Kafka (the examples are based on version 2.4.1).

Basically, if the producer sends data without a key, a partition is chosen on a round-robin basis. Now open the Kafka consumer process in a new terminal for the next step.

When a broker fails and goes down for some reason, the producer recovers automatically; the client library handles balancing across partitions and brokers. Data produced by a producer is sent asynchronously.

--property <key=value>: passes user-defined properties to the message reader.

To check the output of the commands above, open a console consumer in another terminal:

kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic my_first --group first_app

Kafka can also serve as a kind of external commit-log for a distributed system; more on that below. Later we will use protobuf serialisation with the new kafka-protobuf-console-producer.
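The partition choice described above can be sketched in a few lines. This is an illustration only: Kafka's real default partitioner uses murmur2 hashing for keyed records, while here we use String.hashCode() purely to show the idea.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Simplified sketch of how a producer picks a partition:
// no key -> round-robin; with a key -> a stable hash of the key.
public class PartitionChooser {
    private final int numPartitions;
    private final AtomicInteger counter = new AtomicInteger(0);

    PartitionChooser(int numPartitions) { this.numPartitions = numPartitions; }

    int partitionFor(String key) {
        if (key == null) {
            // no key: cycle through the partitions
            return Math.floorMod(counter.getAndIncrement(), numPartitions);
        }
        // keyed: the same key always maps to the same partition
        return Math.floorMod(key.hashCode(), numPartitions);
    }

    public static void main(String[] args) {
        PartitionChooser chooser = new PartitionChooser(3);
        System.out.println(chooser.partitionFor(null));      // 0
        System.out.println(chooser.partitionFor(null));      // 1
        System.out.println(chooser.partitionFor("user42"));  // stable for this key
    }
}
```

This is why keyed messages retain per-key ordering: they always land in the same partition.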
For a TLS setup, the preparation commands ultimately do one thing: they create client.truststore.p12, which is placed under the /tmp/ folder and referenced when calling the producer script.

kafka_2.11-1.1.0 bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
>Hello
>World

Producer acknowledgment settings (acks):

ack=0: fire-and-forget; we have no actual knowledge of whether the broker received the record. If the broker is already down there is a chance of data loss, so this is risky to use.
ack=1: the default confirmation from the brokers, where the producer waits only for the partition leader (which is a broker) to acknowledge.
ack=all: a combination of the leader and the replicas. If a broker fails, the same set of data is present on a replica, so there is effectively no data loss.

On Windows and Linux respectively:

\bin\windows> kafka-console-producer.bat --broker-list localhost:9092 --topic MyFirstTopic1
bin> kafka-console-producer.sh --broker-list localhost:9092 --topic MyFirstTopic1
>learning new property called acks

This tool is used to write messages to a topic in a text-based format. The console-based producer interface connects to the broker, which listens on port 9092 by default. I typed in a message and verified that it was received by the consumer.

The producer is thread-safe: each producer has a buffer-space pool that holds records not yet transmitted to the server. In this article you will learn how to use the kafka-console-producer tool to produce messages to a topic.
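The version and artifact fragments scattered through this page (com.kafka.example, kafka-beginner, 1.0, 2.4.1, slf4j-simple, 1.7.30) appear to come from the tutorial's Maven pom. A plausible minimal reconstruction is shown below; the group/artifact ids and versions are taken from those fragments, so treat the exact values as assumptions.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.kafka.example</groupId>
    <artifactId>kafka-beginner</artifactId>
    <version>1.0</version>
    <dependencies>
        <!-- Kafka client library for the Java producer example -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>2.4.1</version>
        </dependency>
        <!-- simple SLF4J binding so the client logs to the console -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>1.7.30</version>
        </dependency>
    </dependencies>
</project>
```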
A Kafka cluster contains multiple nodes, and each node contains one or more topics. Consumers connect to topics and read messages from the brokers; the producer sends messages to a topic and the consumer reads messages from it. In other words, the console producer "creates messages from command line input (STDIN)".

With the kafka-console-consumer.sh script, you can then create a Kafka consumer that processes messages from TutorialTopic and echoes them back.

Another example:

bin/kafka-console-producer.sh --topic maxwell-events --broker-list localhost:9092

The above command gives you a prompt where you can type your message and press Enter to send it to Kafka.

--topic <topic>: this option is required; it is the id of the topic to produce messages to, and it sets the topic in which the messages will be published.

If running the tool fails with "command not found", remember that the Kafka bin directory, where all the scripts such as kafka-console-producer are stored, is not included in your PATH variable, which means the OS has no way to find these scripts without you specifying their exact location. You can modify your PATH variable so that it includes the Kafka bin folder.

The Kafka producer is conceptually much simpler than the consumer, since it has no need for group coordination. The console producer keeps feeding data into the cluster whenever we enter text into the console, and it allows you to produce records to a topic directly from the command line.
After you've created the properties file as described previously, you can run the console producer in a terminal as follows:

./kafka-console-producer.sh --broker-list <broker-list> --topic <topic> --producer.config <config-file>

The key separator properties shown earlier exist basically to retain the same ordering per key. In asynchronous mode the producer batches messages, trading a little latency for higher throughput: these buffers are sent based on the batch size, which also lets it handle a large number of messages simultaneously. The tooling is already well-programmed, so we simply don't have to implement these features ourselves.

$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic testTopic
>Welcome to kafka
>This is my topic

You can also redirect input from a file:

kafka-console-producer --broker-list localhost:9092 --topic test_topic < <input-file>

Here we have a producer sending data to partition 0 of broker 1 of topic A. Open two console windows in your Kafka directory (named something like kafka_2.12-2.3.0): the consumer will receive the messages via the Kafka topic as you type into the producer.
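The batching behaviour mentioned above can be illustrated with a small self-contained sketch (this is not kafka-clients code): records accumulate in a buffer and are only handed to the I/O layer once a batch fills up or is flushed.

```java
import java.util.ArrayList;
import java.util.List;

// Illustration of the batching idea: buffer records, send one request per batch.
public class Batcher {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    private final List<List<String>> sentBatches = new ArrayList<>();

    Batcher(int batchSize) { this.batchSize = batchSize; }

    void send(String record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    void flush() {
        if (!buffer.isEmpty()) {
            sentBatches.add(new ArrayList<>(buffer)); // one request to the broker
            buffer.clear();
        }
    }

    List<List<String>> sent() { return sentBatches; }

    public static void main(String[] args) {
        Batcher b = new Batcher(2);
        b.send("m1"); b.send("m2"); b.send("m3");
        b.flush();
        System.out.println(b.sent()); // [[m1, m2], [m3]]
    }
}
```

The real producer adds a time bound on top of this (linger/timeout), which is what the --timeout option described earlier controls.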
A later section describes the configuration of Kafka SASL_PLAIN authentication. The kafka-avro-console-producer is a producer command-line tool that reads data from standard input and writes it to a Kafka topic in Avro format. There is a lot to learn about Kafka, but this starter is as simple as it can get, with Zookeeper, Kafka, and a Java-based producer/consumer.

If your previous console producer is still running, close it with a Ctrl+C and run the following command to start a new Avro console producer:

kafka-avro-console-producer --topic example-topic-avro --bootstrap-server broker:9092 \
 --property key.schema='{"type":"string"}' \
 --property value.schema="$(< /opt/app/schema/order_detail.avsc)" \
 --property parse.key=true \
 --property key.separator=":"

The producer automatically finds the broker and the partition that the data should be written to. And here is how to consume the messages of topic "blabla":

$ ./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic blabla
Kafka can serve as a kind of external commit-log for a distributed system; its support for very large stored log data makes it an excellent backend for an application built in this style.

Custom configuration is passed as properties defined in the form key=value. You can also use the Kafka console producer tool with IBM Event Streams.

Putting the Java snippets together, a complete console-style producer for this tutorial looks like this:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class MessageToProduce {
    public static void main(String[] args) {
        String bootstrapServers = "127.0.0.1:9092";

        // create producer properties
        Properties properties = new Properties();
        properties.setProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        properties.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // create the producer programmatically
        KafkaProducer<String, String> producer = new KafkaProducer<>(properties);

        try {
            BufferedReader reader = new BufferedReader(new InputStreamReader(System.in));
            System.out.print("Enter message to send to kafka broker : ");
            String msg = reader.readLine();
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("first_Program", msg);
            producer.send(record);
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            producer.close();
        }
    }
}

Using Kafka from the command line (kafka-console-consumer, kafka-console-producer):

kafka-console-producer.sh --broker-list localhost:9092 --topic Hello-Kafka

Everything you type from now on in the console will be sent to Kafka.

If you prefer containers, you can run local Kafka and Zookeeper using Docker and docker-compose. With the Zookeeper and Kafka containers running, I created an empty .NET Core console app and built an endpoint that we can pass a message to, to be produced to Kafka.

Create a topic named sampleTopic; once the topic has been created, produce some messages into it from the command-line console producer and check the consumer log:

kafka-console-producer --bootstrap-server 127.0.0.1:9092 --topic myknowpega_first
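To make the commit-log idea concrete, here is a minimal self-contained sketch (not Kafka's implementation) of an append-only log with offsets, the structure every Kafka partition is built around:

```java
import java.util.ArrayList;
import java.util.List;

// Append-only commit log: records are only ever appended, and each record
// gets a monotonically increasing offset.
public class CommitLog {
    private final List<String> records = new ArrayList<>();

    // append returns the offset assigned to the record
    long append(String record) {
        records.add(record);
        return records.size() - 1;
    }

    // read the record stored at a given offset
    String read(long offset) {
        return records.get((int) offset);
    }

    long endOffset() { return records.size(); }

    public static void main(String[] args) {
        CommitLog log = new CommitLog();
        long first = log.append("node-1 state");
        long second = log.append("node-2 state");
        // a failed node can re-sync by replaying from its last known offset
        System.out.println(first + ": " + log.read(first));   // 0: node-1 state
        System.out.println(second + ": " + log.read(second)); // 1: node-2 state
    }
}
```

Replaying from a known offset is exactly what lets Kafka act as a re-syncing mechanism for failed nodes.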
Run the following command to launch a Kafka producer using the console interface, writing to the sample topic created above. bin/kafka-console-producer.sh and bin/kafka-console-consumer.sh in the Kafka directory are the tools that help to create a Kafka producer and a Kafka consumer respectively. The producer writes data while choosing how to receive acknowledgments for it (the acks settings covered earlier).

key.serializer: the serializer class for the key; it must implement the org.apache.kafka.common.serialization.Serializer interface.
--request-timeout-ms: the ack timeout of produce requests; the value must be non-negative and non-zero (default: 1500).

First, let's produce some JSON data to the Kafka topic "json_topic": run the console producer shipped with the Kafka distribution and paste in the JSON records from person.json.

Spring Boot provides a wrapper over the Kafka producer and consumer implementations in Java, which makes it easy to configure a producer using KafkaTemplate; it provides an overloaded send method to send messages in multiple ways, with keys, partitions and routing information.

Navigate to the root of the Kafka directory and run each of the following commands in separate terminals to start Zookeeper and the Kafka cluster respectively. Now run the producer and type some messages into the console to send them to the server. Because kafkacat has no dependency on a JVM, it lets you work with Kafka data as an administrator without one.
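The acknowledgment and timeout settings above can be expressed as plain java.util.Properties, using the same configuration keys that the kafka-clients library reads. This is a sketch of the settings only, not a running producer:

```java
import java.util.Properties;

// Producer-side settings as Properties; "acks" and "request.timeout.ms"
// are the standard kafka-clients configuration keys.
public class ProducerSettings {
    static Properties forAcks(String acks) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "127.0.0.1:9092");
        props.setProperty("acks", acks);                  // "0", "1" or "all"
        props.setProperty("request.timeout.ms", "1500");  // ack timeout in ms
        return props;
    }

    public static void main(String[] args) {
        System.out.println(forAcks("all").getProperty("acks")); // all
    }
}
```

Passing such a Properties object to a KafkaProducer (or the equivalent --producer-property acks=all flag on the console producer) selects the delivery guarantee discussed above.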
Producer configurations: the parameters in the Confluent Platform reference are organized by order of importance, ranked from high to low. All messages with the same non-empty key are sent to the same partition, which is how per-key ordering is retained. In this tutorial the cluster contains only a single broker, deployed at localhost:9092.

Run both the producer and the consumer consoles together: each message you type on the producer side should appear on the consumer side, confirming it was sent successfully. You can either exit the command with Ctrl+C or leave the terminal running for more input.

The log compaction feature in Kafka helps support using Kafka as an external commit-log: it acts as a re-syncing mechanism for failed nodes to restore their data, a usage similar in design to the Apache BookKeeper project.

Summary: the kafka-console-producer is the quickest way to publish records to a topic. Start the script, type at the > prompt, and verify the messages with a console consumer; from there, build the application with Maven or Gradle and move on to the Java, .NET, Avro, or protobuf producers covered in this tutorial.