Connector returns Struct instead of Avro


Francesco Nobilia

Nov 29, 2016, 2:54:17 PM
to Confluent Platform
Hi there,

I'm implementing a Kafka sink connector. To run it I use the standard properties file at /etc/schema-registry/connect-avro-standalone.properties.

Instead of receiving data as Avro objects, the put() function receives sink records containing Struct instances:

ConnectRecord{ ... 
    key=org.apache.kafka.connect.data.Struct@d7d6f7c0,
    value=org.apache.kafka.connect.data.Struct@aba5070e
}

Therefore, when I try to cast the key and value to the expected Avro objects, I receive this exception:
java.lang.ClassCastException: org.apache.kafka.connect.data.Struct cannot be cast to ...

The properties file contains the correct converters:

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081

How can I obtain Avro data instead of org.apache.kafka.connect.data.Struct objects?

Thank you in advance for your help 

Ewen Cheslack-Postava

Nov 29, 2016, 4:18:47 PM
to Confluent Platform
This is expected -- connectors never work directly with a specific format. Connect defines a generic data API that can work with a number of different serialization formats. By doing so and separating out configuration of serialization format from the implementation of connectors, any connector is able to work with a variety of serialization formats (which makes them support many more users without any additional work on the part of the connector developer).
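As a sketch of what this means in practice (not part of the original thread; the schema and field names here are hypothetical): a sink task reads fields through the generic Connect data API, regardless of whether the wire format was Avro, JSON, or anything else.

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

public class StructAccessExample {
    // Hypothetical record schema; in a real sink task it arrives as record.valueSchema()
    static final Schema USER_SCHEMA = SchemaBuilder.struct().name("User")
            .field("name", Schema.STRING_SCHEMA)
            .field("age", Schema.INT32_SCHEMA)
            .build();

    // Fields are read generically, independent of the wire format the converter handled
    static String describe(Struct value) {
        return value.getString("name") + " is " + value.getInt32("age");
    }

    public static void main(String[] args) {
        Struct value = new Struct(USER_SCHEMA).put("name", "alice").put("age", 30);
        System.out.println(describe(value));  // prints "alice is 30"
    }
}
```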

See the discussion of converters here: http://docs.confluent.io/3.1.1/connect/concepts.html#converters and you may also find the connector developer guide useful: http://docs.confluent.io/3.1.1/connect/devguide.html

You didn't mention what type of connector you are developing. Is there a reason you specifically need the Avro formatted data? If so, you can try to leverage the AvroConverter code (and the core of its implementation, AvroData), available here https://github.com/confluentinc/schema-registry/tree/master/avro-converter/src/main/java/io/confluent/connect/avro to help you perform that translation.
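To illustrate the AvroData route mentioned above (a sketch, not from the original thread; the schema here is hypothetical): AvroData can translate a Connect Struct and its schema back into an Avro GenericRecord.

```java
import io.confluent.connect.avro.AvroData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

public class AvroDataExample {
    // AvroData caches schema translations; 100 is an arbitrary cache size
    static final AvroData AVRO_DATA = new AvroData(100);

    // Translate a Connect Struct (plus its Connect schema) into an Avro record
    static GenericRecord toAvro(Schema schema, Struct value) {
        return (GenericRecord) AVRO_DATA.fromConnectData(schema, value);
    }

    public static void main(String[] args) {
        Schema schema = SchemaBuilder.struct().name("User")
                .field("name", Schema.STRING_SCHEMA)
                .build();
        Struct value = new Struct(schema).put("name", "alice");
        GenericRecord avro = toAvro(schema, value);
        System.out.println(avro.get("name"));
    }
}
```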

-Ewen

--
You received this message because you are subscribed to the Google Groups "Confluent Platform" group.
To unsubscribe from this group and stop receiving emails from it, send an email to confluent-platform+unsubscribe@googlegroups.com.
To post to this group, send email to confluent-platform@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/confluent-platform/813de6fb-3b4f-491c-877f-5d959c0a3dc4%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.



--
Thanks,
Ewen

Francesco Nobilia

Nov 30, 2016, 4:25:38 AM
to Confluent Platform
I'm developing a MongoDB Sink Connector that reads Avro topics.

Given the configuration above, I expected to obtain the key and value as Avro objects instead of Structs. According to your reply, I had misunderstood the meaning of the key.converter and value.converter params.


What I need is to convert my keys and values into BSON Documents in order to store them in MongoDB. So far I have this solution, which raises the ClassCastException:


public void put(Collection<SinkRecord> sinkRecords) {
    for (SinkRecord record : sinkRecords) {
        MyAvroKey myKey = (MyAvroKey) record.key();        // throws ClassCastException
        MyAvroValue myValue = (MyAvroValue) record.value();

        Document doc = convert(myKey, myValue);
        ...
    }
}
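One way to resolve this (a sketch under the thread's assumptions, not an answer given in it: field handling is simplified, the `_id` derivation is hypothetical, and nested arrays/maps would need extra cases) is to cast to Struct rather than to the Avro classes, then copy the fields into a BSON Document:

```java
import java.util.Collection;

import org.apache.kafka.connect.data.Field;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.sink.SinkRecord;
import org.bson.Document;

public class MongoSinkTaskSketch {
    // Flatten a Connect Struct into a BSON Document field by field,
    // recursing into nested structs (arrays and maps omitted for brevity)
    static Document structToDocument(Struct struct) {
        Document doc = new Document();
        for (Field field : struct.schema().fields()) {
            Object v = struct.get(field);
            doc.put(field.name(), v instanceof Struct ? structToDocument((Struct) v) : v);
        }
        return doc;
    }

    public void put(Collection<SinkRecord> sinkRecords) {
        for (SinkRecord record : sinkRecords) {
            Struct key = (Struct) record.key();     // cast to Struct, not to the Avro class
            Struct value = (Struct) record.value();

            Document doc = structToDocument(value);
            // hypothetical: embed the record key as the MongoDB _id
            doc.put("_id", structToDocument(key));
            // ... insert doc into the MongoDB collection
        }
    }
}
```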

