Hi All,
I have Schema Registry up and running and have set the configuration below in connect-standalone.properties:
key.converter=io.confluent.connect.avro.AvroConverter
value.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter.schema.registry.url=http://localhost:8081
In my SourceTask I created the Connect Schema and Struct for a simple record like below:

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

Schema connectSchema = SchemaBuilder.struct()
        .name("test")
        .field("name", Schema.OPTIONAL_STRING_SCHEMA)
        .field("loc", Schema.OPTIONAL_STRING_SCHEMA)
        .build();

Struct msgStruct = new Struct(connectSchema);
msgStruct.put("name", "sreejith");
msgStruct.put("loc", "IN");

and I am passing the source record as:

SourceRecord sourceRecord = new SourceRecord(null, null, "schema-test-new", connectSchema, msgStruct);
My connector runs without any exceptions, but when I consume the topic using kafka-avro-console-consumer the output looks like:
null {"name":{"string":"sreejith"},"loc":{"string":"IN"}}
Also, when I consume from the plain kafka-console-consumer, it prints like this,
Why is the message not converted into bytes? Why is it not in the MAGIC_BYTE + SCHEMA_ID + MSG-in-bytes format? Am I missing something?
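For reference, my expectation of the wire format is: one magic byte (0x00), then a 4-byte big-endian schema ID, then the Avro-encoded payload. Here is a small self-contained sketch I use to check raw consumed bytes for that framing (the class and method names are just mine for illustration, not from any Kafka API):

```java
import java.nio.ByteBuffer;

public class WireFormatCheck {
    // Confluent wire format: byte 0 is the magic byte (0x00),
    // bytes 1-4 are the schema ID as a big-endian int,
    // and the Avro-encoded payload follows.
    public static final byte MAGIC_BYTE = 0x0;

    // Returns the schema ID if the record starts with the magic byte,
    // or -1 if it does not look like a wire-format record.
    public static int schemaId(byte[] record) {
        if (record == null || record.length < 5 || record[0] != MAGIC_BYTE) {
            return -1;
        }
        return ByteBuffer.wrap(record, 1, 4).getInt();
    }

    public static void main(String[] args) {
        // A fabricated framed record: magic byte, schema ID 7, then payload bytes.
        byte[] framed = {0x0, 0, 0, 0, 7, 0x12, 0x34};
        System.out.println(schemaId(framed)); // prints 7

        // Plain text bytes do not start with the magic byte.
        byte[] plain = "plain bytes".getBytes();
        System.out.println(schemaId(plain)); // prints -1
    }
}
```

I consume the topic with a ByteArrayDeserializer and run each record value through this check to see whether the framing is there.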
Thank You
Sreejith