When I follow the "Test Drive Protobuf Schema" section, everything seems to work. But my use case is a bit different: I generate the topic records by manually calling message.toByteArray() and send those bytes to the Kafka topic using Apache Beam's KafkaIO.Write (in Java).
I know the produced messages are good, because I can deserialize them in another app (in Go), given that I'm using the same .proto file to generate the respective classes in both apps.
I'm using the same .proto file as the topic's schema, but every time I either run a sink connector with the ProtobufConverter configured or run the kafka-protobuf-console-consumer, I get an "Unknown magic byte!" exception.
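For context, my understanding is that the Confluent serializers don't write the raw protobuf bytes alone; they prepend the Schema Registry wire format (a magic byte of 0, a 4-byte big-endian schema ID, and for protobuf a message-index list), which is what the deserializer checks when it throws "Unknown magic byte!". Here's a small sketch of that framing as I understand it; the schema ID and payload below are placeholders, not values from my actual setup:

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;

public class WireFormatSketch {
    // Frame a raw protobuf payload the way I believe the Confluent
    // ProtobufSerializer does (per the documented wire format).
    static byte[] frame(int schemaId, byte[] protoPayload) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(0); // magic byte: always 0
        out.writeBytes(ByteBuffer.allocate(4).putInt(schemaId).array()); // 4-byte big-endian schema ID
        out.write(0); // message-index list; a single 0 means "first message in the .proto"
        out.writeBytes(protoPayload); // the bytes that toByteArray() produced
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] raw = new byte[] {0x08, 0x01}; // placeholder protobuf bytes
        byte[] framed = frame(42, raw);
        // A raw toByteArray() payload almost never starts with 0,
        // while a framed record always does.
        System.out.println("first byte raw=" + raw[0] + " framed=" + framed[0]);
    }
}
```

If that's right, then my manually serialized messages would be missing this header entirely, which would explain the exception. Please correct me if the framing above is off.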
There is only one message in the topic, so it's not a case of some messages being good and others bad.
I've noticed that the example in the guide I'm following inputs data as JSON. Does the Confluent ProtobufConverter expect the data to be in that format? My understanding is that the kafka-protobuf-console-producer converts that message to bytes before it's put on the Kafka topic. But when I look at the topic in Confluent Control Center, I see the message as JSON, whereas on my other topic, where I put my manually serialized message, I see it as bytes (i.e., there are no curly brackets in the message, just field values).
Or does anyone have tips on how to debug this?