How can I use a MySQL case-sensitive column with Debezium?


shani.kim

Nov 16, 2020, 2:32:44 AM
to debezium

Hello,

I have a problem using MySqlConnector.

I needed a particular column to be case-sensitive.

So I changed the column with the SQL below (setting the BINARY attribute to change the collation):



ALTER TABLE `test` CHANGE `case_insensitive` `case_sensitive` VARCHAR(10)  CHARACTER SET utf8mb4  BINARY  NULL  DEFAULT NULL;


Then the connector emitted the following log and stopped:


trace: "org.apache.kafka.connect.errors.ConnectException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:230)
at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:207)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:604)
at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1100)
at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:951)
at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:594)
at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:838)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.debezium.text.ParsingException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.antlr.ParsingErrorListener.syntaxError(ParsingErrorListener.java:40)
at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(ProxyErrorListener.java:41)
at org.antlr.v4.runtime.Parser.notifyErrorListeners(Parser.java:544)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportInputMismatch(DefaultErrorStrategy.java:327)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportError(DefaultErrorStrategy.java:139)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:902)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:68)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:41)
at io.debezium.antlr.AntlrDdlParser.parse(AntlrDdlParser.java:80)
at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:326)
at io.debezium.connector.mysql.BinlogReader.handleQueryEvent(BinlogReader.java:807)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:587)
... 5 more
Caused by: org.antlr.v4.runtime.InputMismatchException
at org.antlr.v4.runtime.DefaultErrorStrategy.sync(DefaultErrorStrategy.java:270)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:887)
... 11 more"



I assume that Debezium's DDL parser failed to process the column's collation change.

My connector just died.

I restored the column to its original settings, but the connector is still stopped.

What should I do in this case?

And how can I use a case-sensitive column with Debezium?


For your information, I attach my Debezium MySqlConnector settings.



{
  "name": "test-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "tasks.max": "1",
    "database.hostname": "...",
    "database.port": "13306",
    "database.user": "...",
    "database.password": "...",
    "database.server.id": "5555",
    "database.server.name": "mysql_dev",
    "database.include.list": "my_db",
    "table.include.list": "my_db.my_table",
    "tombstones.on.delete": false,
    "database.history.kafka.bootstrap.servers": "...",
    "database.history.kafka.topic": "schema.changes.my_db",
    "skipped.operations": "d",
    "value.converter": "io.debezium.converters.CloudEventsConverter",
    "message.key.columns": "my_db.my_table:aggregate_id"
  }
}


jiri.p...@gmail.com

Nov 16, 2020, 2:44:14 AM
to debezium
Hi,

the real error is most probably `BINARY NULL`. Could you please point me to the docs about this MySQL statement so we can update the grammar?

Thanks a lot

J.

shani.kim

Nov 16, 2020, 11:01:02 PM
to debezium
Thank you! J.

I've tested it, and I think the problem is with parsing the BINARY attribute.

The docs about this MySQL statement are here:
https://dev.mysql.com/doc/refman/8.0/en/binary-varbinary.html

The BINARY and VARBINARY data types are distinct from the CHAR BINARY and VARCHAR BINARY data types. For the latter types, the BINARY attribute does not cause the column to be treated as a binary string column. Instead, it causes the binary (_bin) collation for the column character set (or the table default character set if no column character set is specified) to be used, and the column itself stores nonbinary character strings rather than binary byte strings. For example, if the default character set is utf8mb4, CHAR(5) BINARY is treated as CHAR(5) CHARACTER SET utf8mb4 COLLATE utf8mb4_bin. This differs from BINARY(5), which stores 5-byte binary strings that have the binary character set and collation. For information about the differences between the binary collation of the binary character set and the _bin collations of nonbinary character sets, see Section 10.8.5, “The binary Collation Compared to _bin Collations”.
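For illustration, the behavioral difference can be sketched in Python (this sketch is mine, not MySQL's implementation, and the helper names are made up): a case-insensitive collation compares values after case folding, while a _bin collation compares the encoded bytes directly.

```python
# Illustrative sketch of collation behavior; not MySQL code.
# ci_equal approximates a case-insensitive collation such as
# utf8mb4_general_ci (for ASCII input); bin_equal approximates
# utf8mb4_bin, which compares the raw utf8-encoded bytes.

def ci_equal(a: str, b: str) -> bool:
    # Compare after case folding, as a _ci collation would.
    return a.casefold() == b.casefold()

def bin_equal(a: str, b: str) -> bool:
    # Compare the encoded byte sequences, as a _bin collation would.
    return a.encode("utf-8") == b.encode("utf-8")

print(ci_equal("ABC", "abc"))   # True: equal under a _ci collation
print(bin_equal("ABC", "abc"))  # False: distinct under a _bin collation
```

So declaring `VARCHAR(10) CHARACTER SET utf8mb4 BINARY` is just shorthand for picking the second comparison style via `COLLATE utf8mb4_bin`.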

The connector worked well when I used this query.
ALTER TABLE `test` CHANGE `id` `id` VARCHAR(10)  CHARACTER SET utf8mb4  COLLATE utf8mb4_bin NOT NULL;

But this query had the same problem as before.
ALTER TABLE `test` CHANGE `id` `id` VARCHAR(10)  CHARACTER SET utf8mb4  BINARY  NOT NULL;


Actually, I didn't mean to use a query with the BINARY attribute.
It was generated automatically when I modified the column in the Sequel Pro UI. (https://www.sequelpro.com/)

I'd appreciate it if you could fix Debezium so that it can parse the BINARY attribute.

---

I have an additional question.
What should I do if DDL parsing fails, as in this case?

In this case, to get the connector back to a normal state,
I deleted the offset topic and recreated it.
The connector took a new snapshot, and it worked again.

Is this the best way?
Do you have any recommendations for restoring connectors?

Regards,
shani.kim
On Monday, November 16, 2020 at 4:44:14 PM UTC+9, jiri.p...@gmail.com wrote:

jiri.p...@gmail.com

Nov 18, 2020, 4:14:32 AM
to debezium
Hi,

thanks for the report; the fix is available at https://issues.redhat.com/browse/DBZ-2771

Yes, the procedure you've followed is correct. The other option is just removing the database history topic and using schme_only_snapshot.

Then it is not necessary to re-execute the data snapshot.

J.

shani.kim

Nov 20, 2020, 3:22:33 AM
to debezium
Hi, J.

Thank you so much for fixing this problem quickly.
I look forward to version 1.4!

I have a question about the recovery procedure.
When I used the first solution (remove the offset topic and restart the connector), the connector made a snapshot.
What concerns me is publishing duplicate events.

So I like the second solution you suggested:
  > other option is just removing the database history topic and using schme_only_snapshot.

But this solution doesn't work well.
Did you mean the "snapshot.mode": "schema_only" option when you said 'schme_only_snapshot'?

I did these steps:
1. remove the db history topic
2. create a new db history topic
3. change the connector config

curl -X PUT -H "Content-Type: application/json" --data '{
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "snapshot.mode": "schema_only",
    "tasks.max": "1",
    "database.hostname": "...",
    "database.port": "...",
    "database.user": "...",
    "database.password": "...",
    "database.server.id": "5555",
    "database.server.name": "mysql_dev",
    "database.include.list": "test",
    "table.include.list": "test_outbox",
    "tombstones.on.delete": false,
    "database.history.kafka.bootstrap.servers": "...",
    "database.history.kafka.topic": "...",
    "skipped.operations": "d",
    "value.converter": "io.debezium.converters.CloudEventsConverter",
    "message.key.columns": "test.test_outbox:aggregate_id"
}'

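As an aside, the PUT body above can be assembled programmatically before submitting it to the Kafka Connect REST API; this is a minimal sketch (the helper function and its checks are my own, not part of Debezium) that forces the snapshot mode and fails fast if keys the recovery procedure relies on are missing:

```python
import json

# Sketch only: build_recovery_config is a hypothetical helper, not part
# of Debezium or Kafka Connect. It produces the JSON body for
# PUT /connectors/<name>/config with the desired snapshot.mode set.
def build_recovery_config(base: dict, snapshot_mode: str = "schema_only") -> str:
    config = dict(base)
    config["snapshot.mode"] = snapshot_mode
    # Keys the history-topic recovery procedure depends on.
    required = {"connector.class",
                "database.history.kafka.topic",
                "database.history.kafka.bootstrap.servers"}
    missing = required - config.keys()
    if missing:
        raise ValueError(f"missing config keys: {sorted(missing)}")
    return json.dumps(config, indent=2)

body = build_recovery_config({
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.history.kafka.topic": "schema.changes.test",
    "database.history.kafka.bootstrap.servers": "localhost:9092",
})
print(json.loads(body)["snapshot.mode"])  # schema_only
```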

I tested it, and there was an error like this.
(attached screenshot of the error: "Screenshot 2020-11-20 4.33.13 PM.png")



And then I also tested "snapshot.mode": "schema_only_recovery".
That failed, too.

Is there any problem with my recovery test?

Regards,
shani.kim
On Wednesday, November 18, 2020 at 6:14:32 PM UTC+9, jiri.p...@gmail.com wrote:

jiri.p...@gmail.com

Nov 23, 2020, 3:26:09 AM
to debezium
Hi,

I am sorry for the confusion. I meant the schema_only_recovery snapshot mode; somehow I mentally melted the words together. What error do you see with schema_only_recovery mode? In this case you need to keep the offsets but remove the history.

J.

shani.kim

Nov 24, 2020, 3:11:14 AM
to debezium
Hi, 
Thank you for your continued care.

I've tried it twice.

#. First try:
1. remove the db history topic
2. create a new db history topic
3. change the connector config

curl -X PUT -H "Content-Type: application/json" --data '{
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "snapshot.mode": "schema_only_recovery",
    "tasks.max": "1",
    "database.hostname": "...",
    "database.port": "...",
    "database.user": "...",
    "database.password": "...",
    "database.server.id": "5555",
    "database.server.name": "mysql_dev",
    "database.include.list": "test",
    "table.include.list": "test_outbox",
    "tombstones.on.delete": false,
    "database.history.kafka.bootstrap.servers": "...",
    "database.history.kafka.topic": "...",
    "skipped.operations": "d",
    "value.converter": "io.debezium.converters.CloudEventsConverter",
    "message.key.columns": "test.test_outbox:aggregate_id"
}'


4. The problem still existed, so I restarted the connector. (But it didn't work.)

- Error log:
2020-11-24 14:41:34,208 ERROR  MySQL|mysql_dev|binlog  Failed due to error: Error processing binlog event   [io.debezium.connector.mysql.BinlogReader]
org.apache.kafka.connect.errors.ConnectException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:230)
at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:207)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:604)
at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1100)
at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:951)
at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:594)
at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:838)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.debezium.text.ParsingException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.antlr.ParsingErrorListener.syntaxError(ParsingErrorListener.java:40)
at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(ProxyErrorListener.java:41)
at org.antlr.v4.runtime.Parser.notifyErrorListeners(Parser.java:544)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportInputMismatch(DefaultErrorStrategy.java:327)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportError(DefaultErrorStrategy.java:139)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:902)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:68)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:41)
at io.debezium.antlr.AntlrDdlParser.parse(AntlrDdlParser.java:80)
at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:326)
at io.debezium.connector.mysql.BinlogReader.handleQueryEvent(BinlogReader.java:807)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:587)
... 5 more
Caused by: org.antlr.v4.runtime.InputMismatchException
at org.antlr.v4.runtime.DefaultErrorStrategy.sync(DefaultErrorStrategy.java:270)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:887)
... 11 more
2020-11-24 14:41:34,209 INFO   MySQL|mysql_dev|binlog  Error processing binlog event, and propagating to Kafka Connect so it stops this connector. Future binlog events read before connector is shutdown will be ignored.   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:41:34,297 INFO   MySQL|mysql_dev|task  Keepalive thread is running   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:41:34,397 INFO   ||  WorkerSourceTask{id=shopper-connector-0} Committing offsets   [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-11-24 14:41:34,397 INFO   ||  WorkerSourceTask{id=shopper-connector-0} flushing 0 outstanding messages for offset commit   [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-11-24 14:41:34,401 INFO   ||  WorkerSourceTask{id=shopper-connector-0} Finished commitOffsets successfully in 4 ms   [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-11-24 14:41:34,401 ERROR  ||  WorkerSourceTask{id=shopper-connector-0} Task threw an uncaught and unrecoverable exception   [org.apache.kafka.connect.runtime.WorkerTask]
org.apache.kafka.connect.errors.ConnectException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:230)
at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:207)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:604)
at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1100)
at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:951)
at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:594)
at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:838)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.debezium.text.ParsingException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.antlr.ParsingErrorListener.syntaxError(ParsingErrorListener.java:40)
at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(ProxyErrorListener.java:41)
at org.antlr.v4.runtime.Parser.notifyErrorListeners(Parser.java:544)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportInputMismatch(DefaultErrorStrategy.java:327)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportError(DefaultErrorStrategy.java:139)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:902)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:68)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:41)
at io.debezium.antlr.AntlrDdlParser.parse(AntlrDdlParser.java:80)
at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:326)
at io.debezium.connector.mysql.BinlogReader.handleQueryEvent(BinlogReader.java:807)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:587)
... 5 more
Caused by: org.antlr.v4.runtime.InputMismatchException
at org.antlr.v4.runtime.DefaultErrorStrategy.sync(DefaultErrorStrategy.java:270)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:887)
... 11 more
2020-11-24 14:41:34,402 ERROR  ||  WorkerSourceTask{id=shopper-connector-0} Task is being killed and will not recover until manually restarted   [org.apache.kafka.connect.runtime.WorkerTask]
2020-11-24 14:41:34,402 INFO   ||  Stopping down connector   [io.debezium.connector.common.BaseSourceTask]
2020-11-24 14:41:34,402 INFO   MySQL|mysql_dev|task  Stopping MySQL connector task   [io.debezium.connector.mysql.MySqlConnectorTask]
2020-11-24 14:41:34,402 INFO   MySQL|mysql_dev|task  ChainedReader: Stopping the binlog reader   [io.debezium.connector.mysql.ChainedReader]
2020-11-24 14:41:34,402 INFO   MySQL|mysql_dev|task  Discarding 1 unsent record(s) due to the connector shutting down   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:41:34,402 INFO   MySQL|mysql_dev|task  Discarding 0 unsent record(s) due to the connector shutting down   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:41:34,402 INFO   MySQL|mysql_dev|binlog  Stopped reading binlog after 0 events, no new offset was recorded   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:41:34,402 INFO   MySQL|mysql_dev|task  [Producer clientId=shopper-connector-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.   [org.apache.kafka.clients.producer.KafkaProducer]
2020-11-24 14:41:34,404 INFO   MySQL|mysql_dev|task  Connector task finished all work and is now shutdown   [io.debezium.connector.mysql.MySqlConnectorTask]
2020-11-24 14:41:34,404 INFO   ||  [Producer clientId=connector-producer-shopper-connector-0] Closing the Kafka producer with timeoutMillis = 30000 ms.   [org.apache.kafka.clients.producer.KafkaProducer]
2020-11-24 14:42:15,895 INFO   ||  Stopping connector shopper-connector   [org.apache.kafka.connect.runtime.Worker]
2020-11-24 14:42:15,896 INFO   ||  Scheduled shutdown for WorkerConnector{id=shopper-connector}   [org.apache.kafka.connect.runtime.WorkerConnector]
2020-11-24 14:42:15,897 INFO   ||  Completed shutdown for WorkerConnector{id=shopper-connector}   [org.apache.kafka.connect.runtime.WorkerConnector]
2020-11-24 14:42:15,898 INFO   ||  [Worker clientId=connect-1, groupId=gncp-shopper-connector] Starting connector shopper-connector   [org.apache.kafka.connect.runtime.distributed.DistributedHerder]
2020-11-24 14:42:15,899 INFO   ||  Creating connector shopper-connector of type io.debezium.connector.mysql.MySqlConnector   [org.apache.kafka.connect.runtime.Worker]
2020-11-24 14:42:15,900 INFO   ||  SourceConnectorConfig values: 
config.action.reload = restart
connector.class = io.debezium.connector.mysql.MySqlConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = shopper-connector
predicates = []
tasks.max = 1
topic.creation.groups = []
transforms = []
value.converter = class io.debezium.converters.CloudEventsConverter
   [org.apache.kafka.connect.runtime.SourceConnectorConfig]
2020-11-24 14:42:15,900 INFO   ||  EnrichedConnectorConfig values: 
config.action.reload = restart
connector.class = io.debezium.connector.mysql.MySqlConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = shopper-connector
predicates = []
tasks.max = 1
topic.creation.groups = []
transforms = []
value.converter = class io.debezium.converters.CloudEventsConverter
   [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig]
2020-11-24 14:42:15,900 INFO   ||  Instantiated connector shopper-connector with version 1.3.0.Final of type class io.debezium.connector.mysql.MySqlConnector   [org.apache.kafka.connect.runtime.Worker]
2020-11-24 14:42:15,900 INFO   ||  Finished creating connector shopper-connector   [org.apache.kafka.connect.runtime.Worker]
2020-11-24 14:42:15,901 INFO   ||  SourceConnectorConfig values: 
config.action.reload = restart
connector.class = io.debezium.connector.mysql.MySqlConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = shopper-connector
predicates = []
tasks.max = 1
topic.creation.groups = []
transforms = []
value.converter = class io.debezium.converters.CloudEventsConverter
   [org.apache.kafka.connect.runtime.SourceConnectorConfig]
2020-11-24 14:42:15,901 INFO   ||  EnrichedConnectorConfig values: 
config.action.reload = restart
connector.class = io.debezium.connector.mysql.MySqlConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = shopper-connector
predicates = []
tasks.max = 1
topic.creation.groups = []
transforms = []
value.converter = class io.debezium.converters.CloudEventsConverter
   [org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig]
2020-11-24 14:42:31,802 INFO   ||  WorkerSourceTask{id=shopper-connector-0} Committing offsets   [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-11-24 14:42:31,804 INFO   ||  WorkerSourceTask{id=shopper-connector-0} flushing 0 outstanding messages for offset commit   [org.apache.kafka.connect.runtime.WorkerSourceTask]



#. Second try:
1. remove the db history topic
2. create a new db history topic
3. change the connector config

curl -X PUT -H "Content-Type: application/json" --data '{
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "snapshot.mode": "schema_only_recovery",
    "tasks.max": "1",
    "database.hostname": "...",
    "database.port": "...",
    "database.user": "...",
    "database.password": "...",
    "database.server.id": "5555",
    "database.server.name": "mysql_dev",
    "database.include.list": "test",
    "table.include.list": "test_outbox",
    "tombstones.on.delete": false,
    "database.history.kafka.bootstrap.servers": "...",
    "database.history.kafka.topic": "...",
    "skipped.operations": "d",
    "value.converter": "io.debezium.converters.CloudEventsConverter",
    "message.key.columns": "test.test_outbox:aggregate_id"
}'

4. restart Kafka Connect (however, only the log below appeared and the connector still did not work)

- Error log:
2020-11-24 14:48:03,831 INFO   MySQL|mysql_dev|snapshot  Step 8: encountered only schema based snapshot, skipping data snapshot   [io.debezium.connector.mysql.SnapshotReader]
2020-11-24 14:48:03,831 INFO   MySQL|mysql_dev|snapshot  Step 9: committing transaction   [io.debezium.connector.mysql.SnapshotReader]
2020-11-24 14:48:03,834 INFO   MySQL|mysql_dev|snapshot  Completed snapshot in 00:00:02.67   [io.debezium.connector.mysql.SnapshotReader]
2020-11-24 14:48:04,137 INFO   MySQL|mysql_dev|task  Transitioning from the snapshot reader to the binlog reader   [io.debezium.connector.mysql.ChainedReader]
2020-11-24 14:48:04,172 INFO   MySQL|mysql_dev|task  GTID set purged on server:    [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,172 INFO   MySQL|mysql_dev|task  Attempting to generate a filtered GTID set   [io.debezium.connector.mysql.MySqlTaskContext]
2020-11-24 14:48:04,172 INFO   MySQL|mysql_dev|task  GTID set from previous recorded offset: 5a52c9c6-ef0e-11e9-b4a3-a2126025ed83:1-3341   [io.debezium.connector.mysql.MySqlTaskContext]
2020-11-24 14:48:04,173 INFO   MySQL|mysql_dev|task  GTID set available on server: 5a52c9c6-ef0e-11e9-b4a3-a2126025ed83:1-3358   [io.debezium.connector.mysql.MySqlTaskContext]
2020-11-24 14:48:04,173 INFO   MySQL|mysql_dev|task  Using first available positions for new GTID channels   [io.debezium.connector.mysql.MySqlTaskContext]
2020-11-24 14:48:04,173 INFO   MySQL|mysql_dev|task  Relevant GTID set available on server: 5a52c9c6-ef0e-11e9-b4a3-a2126025ed83:1-3358   [io.debezium.connector.mysql.MySqlTaskContext]
2020-11-24 14:48:04,174 INFO   MySQL|mysql_dev|task  Final merged GTID set to use when connecting to MySQL: 5a52c9c6-ef0e-11e9-b4a3-a2126025ed83:1-3341   [io.debezium.connector.mysql.MySqlTaskContext]
2020-11-24 14:48:04,174 INFO   MySQL|mysql_dev|task  Registering binlog reader with GTID set: 5a52c9c6-ef0e-11e9-b4a3-a2126025ed83:1-3341   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,175 INFO   MySQL|mysql_dev|task  Creating thread debezium-mysqlconnector-mysql_dev-binlog-client   [io.debezium.util.Threads]
2020-11-24 14:48:04,181 INFO   MySQL|mysql_dev|task  Creating thread debezium-mysqlconnector-mysql_dev-binlog-client   [io.debezium.util.Threads]
Nov 24, 2020 2:48:04 PM com.github.shyiko.mysql.binlog.BinaryLogClient connect
INFO: Connected to 10.113.253.82:13306 at 5a52c9c6-ef0e-11e9-b4a3-a2126025ed83:1-3341 (sid:5555, cid:252376)
2020-11-24 14:48:04,209 INFO   MySQL|mysql_dev|binlog  Connected to MySQL binlog at 10.113.253.82:13306, starting at GTIDs 5a52c9c6-ef0e-11e9-b4a3-a2126025ed83:1-3341 and binlog file 'mysql-bin.000020', pos=3814680, skipping 0 events plus 0 rows   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,209 INFO   MySQL|mysql_dev|task  Waiting for keepalive thread to start   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,209 INFO   MySQL|mysql_dev|binlog  Creating thread debezium-mysqlconnector-mysql_dev-binlog-client   [io.debezium.util.Threads]
2020-11-24 14:48:04,235 ERROR  MySQL|mysql_dev|binlog  Error during binlog processing. Last offset stored = null, binlog reader near position = mysql-bin.000020/3814745   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,236 ERROR  MySQL|mysql_dev|binlog  Failed due to error: Error processing binlog event   [io.debezium.connector.mysql.BinlogReader]
org.apache.kafka.connect.errors.ConnectException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:230)
at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:207)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:604)
at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1100)
at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:951)
at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:594)
at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:838)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.debezium.text.ParsingException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.antlr.ParsingErrorListener.syntaxError(ParsingErrorListener.java:40)
at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(ProxyErrorListener.java:41)
at org.antlr.v4.runtime.Parser.notifyErrorListeners(Parser.java:544)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportInputMismatch(DefaultErrorStrategy.java:327)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportError(DefaultErrorStrategy.java:139)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:902)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:68)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:41)
at io.debezium.antlr.AntlrDdlParser.parse(AntlrDdlParser.java:80)
at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:326)
at io.debezium.connector.mysql.BinlogReader.handleQueryEvent(BinlogReader.java:807)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:587)
... 5 more
Caused by: org.antlr.v4.runtime.InputMismatchException
at org.antlr.v4.runtime.DefaultErrorStrategy.sync(DefaultErrorStrategy.java:270)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:887)
... 11 more
2020-11-24 14:48:04,239 INFO   MySQL|mysql_dev|binlog  Error processing binlog event, and propagating to Kafka Connect so it stops this connector. Future binlog events read before connector is shutdown will be ignored.   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,309 INFO   MySQL|mysql_dev|task  Keepalive thread is running   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,409 INFO   ||  WorkerSourceTask{id=shopper-connector-0} Committing offsets   [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-11-24 14:48:04,409 INFO   ||  WorkerSourceTask{id=shopper-connector-0} flushing 0 outstanding messages for offset commit   [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-11-24 14:48:04,421 INFO   ||  WorkerSourceTask{id=shopper-connector-0} Finished commitOffsets successfully in 12 ms   [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-11-24 14:48:04,421 ERROR  ||  WorkerSourceTask{id=shopper-connector-0} Task threw an uncaught and unrecoverable exception   [org.apache.kafka.connect.runtime.WorkerTask]
org.apache.kafka.connect.errors.ConnectException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:230)
at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:207)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:604)
at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1100)
at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:951)
at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:594)
at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:838)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.debezium.text.ParsingException: mismatched input 'BINARY' expecting {<EOF>, '--'}
at io.debezium.antlr.ParsingErrorListener.syntaxError(ParsingErrorListener.java:40)
at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(ProxyErrorListener.java:41)
at org.antlr.v4.runtime.Parser.notifyErrorListeners(Parser.java:544)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportInputMismatch(DefaultErrorStrategy.java:327)
at org.antlr.v4.runtime.DefaultErrorStrategy.reportError(DefaultErrorStrategy.java:139)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:902)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:68)
at io.debezium.connector.mysql.antlr.MySqlAntlrDdlParser.parseTree(MySqlAntlrDdlParser.java:41)
at io.debezium.antlr.AntlrDdlParser.parse(AntlrDdlParser.java:80)
at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:326)
at io.debezium.connector.mysql.BinlogReader.handleQueryEvent(BinlogReader.java:807)
at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:587)
... 5 more
Caused by: org.antlr.v4.runtime.InputMismatchException
at org.antlr.v4.runtime.DefaultErrorStrategy.sync(DefaultErrorStrategy.java:270)
at io.debezium.ddl.parser.mysql.generated.MySqlParser.root(MySqlParser.java:887)
... 11 more
2020-11-24 14:48:04,421 ERROR  ||  WorkerSourceTask{id=shopper-connector-0} Task is being killed and will not recover until manually restarted   [org.apache.kafka.connect.runtime.WorkerTask]
2020-11-24 14:48:04,421 INFO   ||  Stopping down connector   [io.debezium.connector.common.BaseSourceTask]
2020-11-24 14:48:04,421 INFO   MySQL|mysql_dev|task  Stopping MySQL connector task   [io.debezium.connector.mysql.MySqlConnectorTask]
2020-11-24 14:48:04,421 INFO   MySQL|mysql_dev|task  ChainedReader: Stopping the binlog reader   [io.debezium.connector.mysql.ChainedReader]
2020-11-24 14:48:04,421 INFO   MySQL|mysql_dev|task  Discarding 1 unsent record(s) due to the connector shutting down   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,422 INFO   MySQL|mysql_dev|task  Discarding 0 unsent record(s) due to the connector shutting down   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,422 INFO   MySQL|mysql_dev|task  [Producer clientId=shopper-connector-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.   [org.apache.kafka.clients.producer.KafkaProducer]
2020-11-24 14:48:04,422 INFO   MySQL|mysql_dev|binlog  Stopped reading binlog after 0 events, no new offset was recorded   [io.debezium.connector.mysql.BinlogReader]
2020-11-24 14:48:04,424 INFO   MySQL|mysql_dev|task  Connector task finished all work and is now shutdown   [io.debezium.connector.mysql.MySqlConnectorTask]
2020-11-24 14:48:04,425 INFO   ||  [Producer clientId=connector-producer-shopper-connector-0] Closing the Kafka producer with timeoutMillis = 30000 ms.   [org.apache.kafka.clients.producer.KafkaProducer]
2020-11-24 14:49:00,477 INFO   ||  WorkerSourceTask{id=shopper-connector-0} Committing offsets   [org.apache.kafka.connect.runtime.WorkerSourceTask]
2020-11-24 14:49:00,477 INFO   ||  WorkerSourceTask{id=shopper-connector-0} flushing 0 outstanding messages for offset commit   [org.apache.kafka.connect.runtime.WorkerSourceTask]


After checking the error log, I changed the config to remove the snapshot mode.

curl -X PUT -H "Content-Type: application/json" --data '{
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "tasks.max": "1",
    "database.hostname": "...",
    "database.port": "...",
    "database.user": "...",
    "database.password": "...",
    "database.server.id": "5555",
    "database.server.name": "mysql_dev",
    "database.include.list": "test",
    "table.include.list": "test_outbox",
    "tombstones.on.delete": false,
    "database.history.kafka.bootstrap.servers": "...",
    "database.history.kafka.topic": "...",
    "skipped.operations": "d",
    "value.converter": "io.debezium.converters.CloudEventsConverter",
    "message.key.columns": "test.test_outbox:aggregate_id"
}'

And then the snapshot events were published (as with the offset-removal solution), and the connector began to work.
But this solution also publishes duplicate events... :(

How does it work in a schema_only_recovery situation?
Is there something I'm missing?

Regards,
shani.kim

On Monday, November 23, 2020 at 5:26:09 PM UTC+9, jiri.p...@gmail.com wrote:

jiri.p...@gmail.com

Nov 24, 2020, 3:19:50 AM
to debezium
Hi,

the bug is still there in the parser. Have you upgraded?

J.

shani.kim

Nov 24, 2020, 9:51:09 PM
to debezium
Hi,

You mean from 1.3 to 1.4? Not yet.

But I'd like to know how to safely recover the connector while the bug is unresolved.
Is there any way to recover the connector without publishing duplicate snapshot events?

Regards,
shani.kim
On Tuesday, November 24, 2020 at 5:19:50 PM UTC+9, jiri.p...@gmail.com wrote:

jiri.p...@gmail.com

Nov 27, 2020, 9:46:07 AM
to debezium
Hi,

I think this is a misunderstanding. The procedure you've described in the email above (with the schema_only_recovery snapshot) is the exact solution; it just fails due to the unresolved bug. Otherwise it works fine.

J.

shani.kim

Nov 30, 2020, 9:12:32 PM
to debezium
Hi. J.
Thank you for constantly answering my questions.

To sum up what you said:

the schema_only_recovery snapshot procedure would be the right one once the bug is fixed, right?

And if I want to recover the connector while the bug is unresolved, is the only solution to remove the offsets?

Regards,
shani.kim
On Friday, November 27, 2020 at 11:46:07 PM UTC+9, jiri.p...@gmail.com wrote:

jiri.p...@gmail.com

Dec 9, 2020, 8:52:54 AM
to debezium
Hi,

yes, exactly.

J.
