I was following the Pluralsight course "Getting Started with Apache Kafka" and reached module 4, about producing messages with Kafka producers. There, I wanted to reproduce the author's demo, which consists of creating and running an Apache Kafka producer application in Java; the idea is to do programmatically what had previously been done entirely with the shell tools. Cluster setup: three partitions, three brokers, replication factor of 3. There is no global order across partitions. Logs, scripting code, and properties are omitted here to avoid verbosity.
From the beginning, the author states the prerequisites: a Linux operating system, the Java 8 JDK, and Scala 2.11.x installed. The Kafka build obtained is 2.11-0.10.0.1, which is exactly what I'm using; the rest of the tech stack is also the same. Inside the virtual machine everything went fine, just as shown in the course: I write messages from the producer and they are displayed by the consumer.
The barrier appears in the aforementioned module. For the new demo, an Apache Kafka development environment is introduced and set up, adding its own dependencies and browsing the API. So, in addition to the prerequisites already described, Maven and access to a test Kafka cluster are now required. IntelliJ IDEA 2016.2.2 is adopted for the project, with the Java 1.8 SDK. Finally, the app runs, messages are produced, and they are printed on the consumer console just like in the previous demo.
What changes on my side? I'm using Windows 10 as the host machine and IntelliJ IDEA 2023.1.5 for coding and execution. The remaining course modules, on the other hand, run inside a guest machine defined by the following Vagrantfile, whose third line maps the host's port 9092 to the guest's port 9092 for Kafka:

Vagrant.configure("2") do |config|
  config.vm.box = "hashicorp/bionic64"
  config.vm.network :forwarded_port, guest: 9092, host: 9092, id: "kafka"  # Map host's port 9092 to guest's port 9092
  config.vm.network :forwarded_port, guest: 80, host: 8080, id: "http"     # Map host's port 8080 to guest's port 80
  config.vm.synced_folder "./practice", "/home/vagrant", type: "virtualbox"
end
Although the application outputs the same SLF4J log lines on my host side, nothing happens on my guest side, meaning the consumer keeps waiting. In the port 9092 scenario, the application terminates in less than one minute. This is the reproducible piece of code; in my Vagrant variation, bootstrap.servers was set simply to localhost:9092:
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class KafkaProducerApp {

    public static void main(String[] args) {
        // Create a properties dictionary for the required/optional Producer config settings:
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092,localhost:9093");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> myProducer = new KafkaProducer<String, String>(props);

        try {
            for (int i = 0; i < 150; i++) {
                myProducer.send(new ProducerRecord<String, String>(
                        "my-topic", Integer.toString(i), "MyMessage: " + Integer.toString(i)));
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            myProducer.close();
        }
    }
}
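For debugging, a blocking variant of the same producer could surface any send error immediately instead of it being swallowed by the fire-and-forget loop. This is only a sketch, not the course code: the class name KafkaProducerBlockingApp, the single localhost:9092 bootstrap address (as in my Vagrant variation), the blocking send(...).get(), and the printed metadata are my additions.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

import java.util.Properties;

public class KafkaProducerBlockingApp {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Only the forwarded port is used in the Vagrant variation.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
        try {
            for (int i = 0; i < 150; i++) {
                // Blocking on the returned Future reports timeouts or broker errors per record.
                RecordMetadata metadata = producer.send(new ProducerRecord<String, String>(
                        "my-topic", Integer.toString(i), "MyMessage: " + Integer.toString(i))).get();
                System.out.println("Sent to partition " + metadata.partition()
                        + " at offset " + metadata.offset());
            }
        } finally {
            producer.close();
        }
    }
}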
What am I doing wrong that leaves no messages available in Kafka? Which Vagrant configurations are missing for effective communication between the two operating systems? I have tried modifications along the lines suggested in this post, but with no success. Despite some useful explanations there, WinRM is a suitable communicator for the inverse scenario. I also hypothesized OpenSSL, but it could bring security risks related to the executable source.
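To at least confirm that the forwarded port is reachable from the host before blaming the producer, a quick TCP check could be run from Windows. A minimal sketch in Java, where the class name PortCheck and the 5-second timeout are arbitrary choices of mine:

import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {

    public static void main(String[] args) throws Exception {
        // Try to open a TCP connection to the forwarded Kafka port within 5 seconds.
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress("localhost", 9092), 5000);
            System.out.println("Port 9092 is reachable from the host.");
        }
    }
}

If this connects but the producer still cannot deliver, the port forwarding itself is probably not the missing piece.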
Thanks in advance for a response.