Integration of Tcollector with Kafka

manoj sharma

May 18, 2016, 2:01:00 AM
to OpenTSDB
Hi,

Can we send the output of Tcollector to Kafka?
If yes, could you please explain how in detail?

We are working on a case study with the following design pattern:

Tcollector ----> Kafka ----> TSDB ----> Hbase

Thanks in advance.


Jonathan Creasy

May 18, 2016, 2:29:03 AM
to manoj sharma, OpenTSDB
TCollector does not have that support; you could use Turnbeat or Logstash to accept the lines from TCollector and write them to Kafka.
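For reference, tcollector emits OpenTSDB telnet-style `put` lines, so any bridge in front of Kafka mainly has to parse those lines and hand them to a producer. Here is a minimal Python sketch of the parsing step; the Kafka call is shown only as a comment and assumes the third-party kafka-python client, and the metric, host, and topic names are made up:

```python
import json

def parse_put_line(line):
    """Parse a tcollector/OpenTSDB telnet-style line:
        put <metric> <epoch-ts> <value> <tag1=v1> [tag2=v2 ...]
    Returns a dict for a data point, or None for anything else
    (tcollector also emits non-data lines such as 'version')."""
    parts = line.strip().split()
    if len(parts) < 4 or parts[0] != "put":
        return None
    return {
        "metric": parts[1],
        "timestamp": int(parts[2]),
        "value": float(parts[3]),
        "tags": dict(p.split("=", 1) for p in parts[4:]),
    }

point = parse_put_line("put sys.cpu.user 1463554860 42.5 host=web01 cpu=0")
payload = json.dumps(point)
# A bridge would then publish the JSON payload, e.g. with kafka-python:
#   producer = KafkaProducer(bootstrap_servers="localhost:9092")
#   producer.send("tsdb-staging", payload.encode("utf-8"))
```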

What will you be using to read from Kafka and write to OpenTSDB?

What are you trying to accomplish/what problem are you trying to solve with this setup?

manoj sharma

May 18, 2016, 5:08:08 AM
to OpenTSDB, putul...@gmail.com
Thanks Jonathan,

For our requirement, we are planning to build an event staging system where data from different sources will flow into Kafka; we will filter out some of the data (tags and metrics) based on some logic and send the filtered data to TSDB.
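The filtering step described above could be sketched as a predicate applied to each parsed data point before it is forwarded to TSDB. Everything here is illustrative: the prefixes, tag names, and point structure are assumptions, not part of any existing API.

```python
# Hypothetical staging rules: which metric prefixes pass,
# and which tag values cause a point to be dropped.
ALLOWED_PREFIXES = ("sys.", "proc.")
BLOCKED_TAGS = {"host": {"test-box"}}

def keep_point(point):
    """Return True if a data point should be forwarded to TSDB."""
    if not point["metric"].startswith(ALLOWED_PREFIXES):
        return False
    for tag, blocked_values in BLOCKED_TAGS.items():
        if point["tags"].get(tag) in blocked_values:
            return False
    return True

points = [
    {"metric": "sys.cpu.user", "tags": {"host": "web01"}},
    {"metric": "app.latency", "tags": {"host": "web01"}},   # wrong prefix
    {"metric": "sys.mem.free", "tags": {"host": "test-box"}},  # blocked host
]
forwarded = [p for p in points if keep_point(p)]
```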

Your help is really appreciated.

Jonathan Creasy

May 18, 2016, 12:07:48 PM
to manoj sharma, OpenTSDB

Ok, there are also some filtering capabilities added to OpenTSDB in the 2.3 branch. Let me look that up and send you some details.

It sounds like you still have a step before this: some problem you are trying to solve that I might be able to help with, possibly with a less involved setup. What caused you to decide to build an event staging system? (If you don't mind my digging; I don't mean to pry, only to help.)

What are you using to read from Kafka?

-Jonathan

manoj sharma

May 19, 2016, 6:14:16 AM
to OpenTSDB, putul...@gmail.com
Thanks Jonathan,

In a POC we are trying to build a system workflow where:

1. We want to use Kafka as a staging system that can hold all the raw data coming from different sources.
2. We are planning to use Tcollector along with Collectd and Zabbix to capture the system data residing in that cluster for analysis and monitoring purposes.
3. We are also planning to transform some of the data. For example, if the data contains the hostname of a system, we will rename the hostname to some other meaningful name.

Then we are using a customized Java API to read the data from Kafka and write it to OpenTSDB.
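The hostname-renaming transform from step 3 could be sketched as a lookup applied between the Kafka read and the TSDB write. The alias table, tag name, and point structure below are hypothetical, not part of the user's Java API:

```python
# Hypothetical alias table mapping raw hostnames to meaningful names.
HOST_ALIASES = {
    "ip-10-0-0-12": "web01.prod",
    "ip-10-0-0-13": "db01.prod",
}

def rename_host(point):
    """Rewrite the 'host' tag in place if an alias is known;
    leave unknown hostnames untouched."""
    tags = point.get("tags", {})
    if "host" in tags:
        tags["host"] = HOST_ALIASES.get(tags["host"], tags["host"])
    return point

renamed = rename_host({"metric": "sys.cpu.user",
                       "tags": {"host": "ip-10-0-0-12"}})
unknown = rename_host({"metric": "sys.cpu.user",
                       "tags": {"host": "ip-10-9-9-9"}})
```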

Regards 
Manoj Sharma

Lynch Lee

May 30, 2016, 8:06:49 AM
to OpenTSDB, putul...@gmail.com
Hey, here is a plugin for OpenTSDB and Kafka; take a look, maybe you can get something out of it. Good luck. Project: https://github.com/easemob/opentsdb-plugins

ManOLamancha

Dec 17, 2016, 7:32:35 PM
to OpenTSDB, putul...@gmail.com
On Thursday, May 19, 2016 at 3:14:16 AM UTC-7, manoj sharma wrote:
1. We want to use Kafka as a staging system that can hold all the raw data coming from different sources.

This is how we use OpenTSDB at Yahoo!, with data flowing into Kafka, then Storm, then Kafka again, and then TSDB. With Lynch's Kafka consumer you may be on the right track.
 
2. We are planning to use Tcollector along with Collectd and Zabbix to capture the system data residing in that cluster for analysis and monitoring purposes.

+1 for a native Tcollector to Kafka writer.
 
3. We are also planning to transform some of the data. For example, if the data contains the hostname of a system, we will rename the hostname to some other meaningful name.

We'd like to open source the Yahoo Storm code that can already do this. E.g., we look at "hostname" and add a "hostgroup" tag that can be used for additional filtering. If you want to write some code to do that, it would be great to share it.
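As a rough illustration of that kind of enrichment (not the actual Yahoo Storm code, which is not open source here), a transform could derive a "hostgroup" tag from the hostname prefix; the grouping rule below is invented:

```python
def add_hostgroup(point):
    """Add a coarse 'hostgroup' tag derived from the host tag,
    e.g. 'web01.example.com' -> 'web'. The rule is illustrative only."""
    host = point["tags"].get("host", "")
    group = host.split(".")[0].rstrip("0123456789") or "unknown"
    point["tags"]["hostgroup"] = group
    return point

p = add_hostgroup({"metric": "sys.cpu.user",
                   "tags": {"host": "web01.example.com"}})
```

Queries can then filter or aggregate on the derived tag instead of listing individual hosts.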