downgrading to Kafka 0.9

Imran Akbar

Sep 20, 2016, 5:19:52 PM
to Confluent Platform
Hi,

I had installed Confluent 3, but an application I'm using only supports Kafka 0.9.
I've uninstalled the Confluent 3 packages, and installed the Confluent 2 packages, but I'm still getting the following error when trying to start the schema registry:

[2016-09-20 21:16:14,688] WARN Creating the schema topic _schemas using a replication factor of 1, which is less than the desired one of 3. If this is a production environment, it's crucial to add more brokers and increase the replication factor of the topic. (io.confluent.kafka.schemaregistry.storage.KafkaStore:229)
[2016-09-20 21:16:15,040] ERROR Server died unexpectedly:  (io.confluent.kafka.schemaregistry.rest.SchemaRegistryMain:51)
org.apache.kafka.common.protocol.types.SchemaException: Error reading field 'topic_metadata': Error reading array of size 548723, only 36 bytes available
        at org.apache.kafka.common.protocol.types.Schema.read(Schema.java:73)
        at org.apache.kafka.clients.NetworkClient.parseResponse(NetworkClient.java:380)
        at org.apache.kafka.clients.NetworkClient.handleCompletedReceives(NetworkClient.java:449)
        at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:269)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.clientPoll(ConsumerNetworkClient.java:360)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:224)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:178)
        at org.apache.kafka.clients.consumer.internals.Fetcher.getTopicMetadata(Fetcher.java:220)
        at org.apache.kafka.clients.consumer.KafkaConsumer.partitionsFor(KafkaConsumer.java:1270)
        at io.confluent.kafka.schemaregistry.storage.KafkaStoreReaderThread.<init>(KafkaStoreReaderThread.java:107)
        at io.confluent.kafka.schemaregistry.storage.KafkaStore.init(KafkaStore.java:154)
        at io.confluent.kafka.schemaregistry.storage.KafkaSchemaRegistry.init(KafkaSchemaRegistry.java:187)
        at io.confluent.kafka.schemaregistry.rest.SchemaRegistryRestApplication.setupResources(SchemaRegistryRestApplication.java:55)
        at io.confluent.kafka.schemaregistry.rest.SchemaRegistryRestApplication.setupResources(SchemaRegistryRestApplication.java:37)
        at io.confluent.rest.Application.createServer(Application.java:118)
        at io.confluent.kafka.schemaregistry.rest.SchemaRegistryMain.main(SchemaRegistryMain.java:43)


How do I fix this?


thanks,

imran

Imran Akbar

Sep 20, 2016, 6:08:16 PM
to Confluent Platform
I think the problem is the version that gets installed.

I removed all the Confluent packages and then followed these steps: http://docs.confluent.io/2.0.0/installation.html#installation-apt

but it keeps installing the Confluent 3 packages, even though I ran: sudo apt-get update && sudo apt-get install confluent-platform-2.11.7

Any ideas?

thanks

Imran Akbar

Sep 20, 2016, 6:20:29 PM
to Confluent Platform
Managed to figure it out: I had to remove the Confluent 3 line from my /etc/apt/sources.list, remove all the Confluent packages, and then install them again.
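The steps above can be sketched as follows. The repo URLs and deb lines here are illustrative (the exact entries on your system may differ); step 1 edits a local copy so you can see the effect before touching the real file:

```shell
# 1) Drop the Confluent 3.x repo line. Demonstrated on a local copy of
#    sources.list; the deb lines below are illustrative, not verbatim.
printf 'deb http://packages.confluent.io/deb/3.0 stable main\ndeb http://packages.confluent.io/deb/2.0 stable main\n' > sources.list.demo
sed -i '/deb\/3\.0/d' sources.list.demo   # delete the Confluent 3.0 repo line
cat sources.list.demo                     # only the 2.0 repo line remains

# 2) On the real system, make the same edit to /etc/apt/sources.list,
#    then purge and reinstall (requires root):
# sudo apt-get remove --purge 'confluent-*'
# sudo apt-get update && sudo apt-get install confluent-platform-2.11.7
```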

Atharva Inamdar

Sep 21, 2016, 4:04:48 PM
to Confluent Platform
The issue is that the wire protocol and message format changed between Kafka 0.9 and 0.10, so downgrading brokers is difficult. A 0.9 client is still compatible with a 0.10 broker, though.
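As an alternative to downgrading, a 0.10 broker can be configured to keep writing the older message format so that 0.9 consumers can still read it; a server.properties sketch, assuming a 0.10.0 broker (these are standard Kafka broker settings, values shown are illustrative):

```properties
# Keep on-disk messages in the 0.9 format so 0.9 consumers can read them:
log.message.format.version=0.9.0
# Only needed while mixed-version brokers coexist during a rolling upgrade:
inter.broker.protocol.version=0.9.0
```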
