I know Kafka Streams is still in development, but I was hoping to try
it out, and there are a lot of green checkmarks in KAFKA-2590, so I
thought it might be worth a try.
Unfortunately, when my processor’s consumer tries to deserialize a
record using the KafkaAvroDeserializer, I’m getting this:
Exception in thread "StreamThread-1"
org.apache.kafka.common.errors.SerializationException:
Error deserializing Avro message for id 1
Caused by: java.lang.NullPointerException
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:120)
I’m using kafka-avro-serializer 2.0.0, so it looks like the errant line is:
Schema schema = schemaRegistry.getByID(id);
at
https://github.com/confluentinc/schema-registry/blob/2.x/avro-serializer/src/main/java/io/confluent/kafka/serializers/AbstractKafkaAvroDeserializer.java#L120
so it looks as though `schemaRegistry` is null, which suggests the
`configure` method was never called. I did look through the
StreamThread and KafkaConsumer code, though, and as far as I can tell
`configure` *should* be called.
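To make the suspected failure mode concrete, here is a minimal, self-contained sketch (not the actual Confluent code; the class and method names are stand-ins) of a deserializer whose collaborator is only wired up in `configure` -- if `configure` is never invoked, `deserialize` hits exactly this kind of NPE:

```java
import java.util.Map;

public class ConfigureSketch {
    // Stand-in for the schema registry client that the real
    // AbstractKafkaAvroDeserializer builds from "schema.registry.url".
    static class FakeSchemaRegistryClient {
        String getById(int id) { return "schema-" + id; }
    }

    static class AvroLikeDeserializer {
        private FakeSchemaRegistryClient schemaRegistry; // null until configure()

        void configure(Map<String, ?> configs) {
            // The real deserializer constructs its client here.
            schemaRegistry = new FakeSchemaRegistryClient();
        }

        String deserialize(int schemaId) {
            // NPE here if configure() was never called -- analogous to
            // the stack trace at AbstractKafkaAvroDeserializer.java:120.
            return schemaRegistry.getById(schemaId);
        }
    }

    public static void main(String[] args) {
        AvroLikeDeserializer d = new AvroLikeDeserializer();
        try {
            d.deserialize(1);
        } catch (NullPointerException e) {
            System.out.println("NPE without configure()");
        }
        d.configure(Map.of("schema.registry.url", "http://localhost:8081"));
        System.out.println(d.deserialize(1));
    }
}
```

So whatever constructs the deserializer inside the streams runtime would need to call `configure` (with the schema registry URL in the config map) before the first `deserialize`, or the client field stays null.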
I know Kafka Streams is still in flux, and I certainly don’t expect
anyone from Confluent to support it at this point. But I’m curious,
and I thought I’d post here just in case someone feels like
enlightening me.
Thank you!
Avi