Max number of sinks or Kafka producers?


Peter Ruijter

Jan 27, 2020, 8:54:28 AM
to divolte-collector
Hi Divolte guys,  

We've been running a Divolte cluster for a while now that transports events via Kafka and Logstash to Elasticsearch. Two different sinks are set up that send the events to two different Kafka topics, because the events have different content. We have now added a new sink which sends its messages to a third Kafka topic. The strange thing is that the events don't seem to arrive in that Kafka topic. Does Divolte have a limit on the number of sinks or Kafka topics that can be configured, or that it can handle? I actually doubt it, because even turning off an existing sink doesn't change the situation. We have already simplified the event very much, down to a single property object inside the second parameter of the divolte.signal function.

As you'll probably read between the lines, we've already tried to exclude quite a few things but we couldn't find the bug.  

If I listen with a Kafka console consumer on one of the other topics I see the specific message coming through, but no events are sent to the newly created topic. Do you have any idea where else to look? The output from Divolte's log doesn't help us either.

I really appreciate any helpful response.

Kind regards,

Peter Ruijter

Peter Ruijter

Jan 27, 2020, 9:07:07 AM
to divolte-collector
One thing that pops up in my mind: will Divolte put a single message into all sinks, or just one of them?

Kind regards,
Peter

Friso van Vollenhoven

Jan 27, 2020, 3:39:55 PM
to divolte-...@googlegroups.com
Hi Peter,

There are no limits built into Divolte. Events move from source through mapping into sinks, so source -> mapping -> sink. All of these relations can be many to many, so you can indeed have multiple sinks behind one mapping, or multiple mappings with different sinks behind one source, etc. Without having a look at the configuration, I can't really say anything specific about your situation.
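For example, a single mapping fanning out to multiple Kafka sinks would look roughly like this in divolte-collector.conf (a sketch only; the source, mapping, sink, topic and file names are placeholders, not from your setup):

```hocon
divolte {
  sources {
    // One browser source feeding the mapping below.
    browser = { type = browser }
  }
  mappings {
    my_mapping = {
      schema_file = "/etc/divolte/MyEventRecord.avsc"
      mapping_script_file = "/etc/divolte/mapping.groovy"
      sources = [browser]
      // Many-to-many: this one mapping writes to both sinks.
      sinks = [kafka_a, kafka_b]
    }
  }
  sinks {
    // Each sink targets its own Kafka topic.
    kafka_a = { type = kafka, topic = "topic-a" }
    kafka_b = { type = kafka, topic = "topic-b" }
  }
}
```

A new sink only receives events if some mapping lists it in its sinks array, so that wiring is worth double-checking as well.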

One thing to check is that the schemas match up in your configuration. I.e. that the new sink has a schema that can be correctly populated by your mapping.


Cheers,
Friso


--
You received this message because you are subscribed to the Google Groups "divolte-collector" group.
To unsubscribe from this group and stop receiving emails from it, send an email to divolte-collec...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/divolte-collector/ed531381-7b8c-440f-8cc7-00a77130cb75%40googlegroups.com.

Peter Ruijter

Jan 28, 2020, 7:27:49 AM
to divolte-collector
Hi Friso,

Thanks for your quick reply. 

The events for the newly added sink are coming through now. Because we run Divolte as a cluster, that turned out to be the cause of the problem: for ease of editing and testing, and to avoid waiting for every deployment cycle, I thought it was easier to edit the configuration files on one instance and only send requests to that instance. Looking back on this adventure, that was not the right choice, to put it mildly.

Let other people learn from this, too: when you change a Divolte configuration (such as the kafka.producer settings), change it on all instances in the cluster, otherwise you're screwed.

Kind regards,

Peter