Schema change topic is not being created

Zach Walker

Apr 27, 2021, 8:53:20 PM
to debezium
Hi, 

I am running kafka 2.7.0 via MSK with auto topic creation disabled.
I am then running kafka connect and debezium using the confluent operator helm chart on k8s.

When starting up Debezium, the connector task gets stuck during the initial snapshot while trying to find what I believe is the schema change topic, which is named after database.server.name. The log messages look as follows:

[2021-04-28 00:40:25,421] WARN [Producer clientId=connector-producer-qa422-debezium-0] Error while fetching metadata with correlation id 42373 : {dbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient)

When I list Kafka topics, I see that the database history topic has been created, so I believe that topic creation on the Kafka Connect side is working at least to some degree.
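For reference, I'm listing the topics with the stock Kafka CLI, something like this (broker address taken from my setup):

kafka-topics.sh --bootstrap-server kafka.event-stream:9092 --list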

I'm struggling to figure out why the schema change topic is not created.  Any suggestions?

Zach Walker

Apr 27, 2021, 9:24:26 PM
to debezium
Some additional info.

This is the content of the Connect worker properties file:

internal.key.converter.schemas.enable=false
group.id=event-stream
value.converter.schemas.enable=false
status.storage.replication.factor=3
key.converter=io.confluent.connect.avro.AvroConverter
config.storage.topic=event-stream-cp-kafka-connect-config
offset.storage.replication.factor=3
plugin.path=/usr/share/java,/usr/share/confluent-hub-components
offset.storage.topic=event-stream-cp-kafka-connect-offset
key.converter.schemas.enable=false
bootstrap.servers=PLAINTEXT://kafka.event-stream:9092
value.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://event-stream-cp-schema-registry:8081
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
rest.port=8083
status.storage.topic=event-stream-cp-kafka-connect-status
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schema.registry.url=http://event-stream-cp-schema-registry:8081
internal.value.converter.schemas.enable=false
config.storage.replication.factor=3

This is the debezium connector config:

      "connector.class": "io.debezium.connector.mysql.MySqlConnector",
      "tasks.max": 1,
      "database.hostname": "<host>",
      "database.port": 3306,
      "database.user": "<user>",
      "database.password": "<password>",
      "database.serverTimezone": "UTC",
      "database.server.id": 777777,
      "database.server.name": "dbserver",
      "database.include.list": ".*dbz",
      "database.history.kafka.bootstrap.servers": "kafka.event-stream:9092",
      "database.history.kafka.topic": "dbserver-schema-change",
      "transforms": "Reroute",
      "transforms.Reroute.type": "io.debezium.transforms.ByLogicalTableRouter",
      "transforms.Reroute.topic.regex": "(.*)\\.(.*)_(.*)\\.(.*)",
      "transforms.Reroute.topic.replacement": "$2.$4",
      "transforms.Reroute.key.field.regex": "(.*)\\.(.*)_(.*)\\.(.*)",
      "transforms.Reroute.key.field.replacement": "$3"

Zach Walker

Apr 28, 2021, 1:09:22 AM
to debezium
Adding the following to the debezium connector config seems to have fixed it:

"topic.creation.enabled": "true",
"topic.creation.default.replication.factor": -1,
"topic.creation.default.partitions": 1,

I am confused by this for a few reasons:
1. I thought topic.creation.enabled was true by default
2. I tried setting these values at the worker level and that didn't seem to work, whereas setting them at the connector level does (see the sketch below)
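If I'm reading the Kafka Connect docs right, the worker-level switch is spelled topic.creation.enable (and defaults to true), while the connector-level properties are what actually activate topic creation for a given connector. Roughly:

# worker properties - enables the feature, on by default
topic.creation.enable=true

# connector config (JSON) - required for Connect to create this connector's topics
"topic.creation.default.replication.factor": -1,
"topic.creation.default.partitions": 1,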

Gunnar Morling

Apr 28, 2021, 2:48:13 AM
to debezium
Hi,

The first message you shared is a warning only, not an error. Was the connector actually in "failed" state? If so, you should have seen an exception in the logs and connector status. Re "topic.creation.enabled", this doesn't impact the history topic, as that's not created through Kafka Connect but by a separate admin/producer client from within the connector.

--Gunnar

Zach Walker

Apr 28, 2021, 1:30:50 PM
to debezium
No, the connector and task both remained in the "RUNNING" state, and there was a continuous stream of the same warning message.

My understanding is that the history topic is different from the schema change topic: the history topic is internal to Debezium, while the schema change topic is for external consumers.
I believe it was the schema change topic that was not being created, which resulted in the endless loop of metadata fetch attempts.
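Concretely, with my config I would expect topics along these lines (database/table names hypothetical):

dbserver                           <- schema change topic, named after database.server.name
dbserver.inventory_dbz.customers   <- table topic (before the Reroute transform)
dbserver-schema-change             <- history topic (database.history.kafka.topic), internal to debezium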

Does it make sense that I would need to configure topic auto creation at the debezium connector level rather than at the kafka connect worker level?

jiri.p...@gmail.com

May 5, 2021, 7:43:37 AM
to debezium
Hi,

with secured Kafka you need to configure both the database history passthrough config options at the connector level and the Kafka producer/consumer properties at the worker level (see the example below).
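Something like this, where the security settings are just placeholders for whatever your cluster uses. At the connector level:

"database.history.producer.security.protocol": "SASL_SSL",
"database.history.consumer.security.protocol": "SASL_SSL",

and at the worker level:

producer.security.protocol=SASL_SSL
consumer.security.protocol=SASL_SSL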

J.
