Does it make sense to run multiple Kafka Producer in a vert.x application

Sachin Mittal

Dec 11, 2025, 10:47:20 AM
to vert.x
Hi,
I am running a clustered Vert.x web application on a VM with 8 vCPUs and 16 GB RAM.

The application receives payloads as POST requests and puts them on the event bus; on the other end, a consumer reads those payloads and pushes them to Kafka.

Each application instance is configured as:

VertxOptions vertxOptions = context.vertxOptions();
vertxOptions.setEventLoopPoolSize(16).setWorkerPoolSize(80);

DeploymentOptions deploymentOptions = context.deploymentOptions();
deploymentOptions.setInstances(16);

My Kafka producer is configured as:
KafkaProducer<String, String> producer = KafkaProducer.createShared(vertx, "the-producer", config);

vertx.eventBus().<String>consumer("platform", message -> {
  String topic = message.headers().get("topic");
  String key = message.headers().get("key");
  KafkaProducerRecord<String, String> record =
      KafkaProducerRecord.create(topic, key, message.body());
  producer.send(record).onSuccess(...).onFailure(...);
});

As you can see, I am creating a shared producer, which means there is a single producer for the whole deployed application. Does it make sense to create multiple producers in order to increase the throughput of the application?
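For context, here is a minimal sketch of what I mean by "multiple producers". Since createShared returns the same underlying instance for the same name, using N distinct names would yield N producers, and verticles could pick one round-robin. The pool class below is hypothetical (not part of Vert.x); it is written generically so only the round-robin selection is shown, with the Kafka-specific wiring indicated in comments:

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

/**
 * Hypothetical round-robin pool. In the application, T would be
 * KafkaProducer<String, String> and the list would be built with
 * distinct shared names, e.g.:
 *
 *   KafkaProducer.createShared(vertx, "the-producer-" + i, config)
 *
 * Note: round-robin means records for the same key may go through
 * different producers, which is fine unless per-key ordering matters.
 */
public class RoundRobinPool<T> {
  private final List<T> items;
  private final AtomicInteger next = new AtomicInteger();

  public RoundRobinPool(List<T> items) {
    this.items = items;
  }

  /** Returns the next item, cycling through the list. */
  public T next() {
    // floorMod keeps the index non-negative even after int overflow
    return items.get(Math.floorMod(next.getAndIncrement(), items.size()));
  }
}
```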

Thanks
Sachin
