unable to start Debezium Caused by: java.lang.ClassNotFoundException: io.debezium.util.IoUtil error


ronnie10

Sep 25, 2018, 5:11:07 AM
to debezium
Dear all,

I am trying to configure Debezium to stream the MySQL binlog to Kafka. I am using Kafka 2.11-2.0.0 and the Debezium MySQL connector 0.8.3. I have configured connect-standalone.properties as follows:

bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000
plugin.path=/opt/kafka_2.11-2.0.0/connect/debezium-connector-mysql

and I have another file, connect-mysql.properties, as follows:
name=mysql-cdc
connector.class=io.debezium.connector.mysql.MySqlConnector
tasks.max=1
database.hostname=192.168.1.1
database.port=3306
database.user=username
database.password=password
database.server.id=1
database.server.name=mysql-dev-01
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=testcdc

ZooKeeper and Kafka are started, with the broker and topics up. When I start Debezium with the following command:
bin/connect-standalone.sh config/connect-standalone.properties config/connect-mysql.properties

It gave me the following error at the end of loading:
[2018-09-25 16:50:29,341] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/kafka_2.11-2.0.0/connect/debezium-connector-mysql/antlr4-runtime-4.7.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2018-09-25 16:50:29,341] INFO Loading plugin from: /opt/kafka_2.11-2.0.0/connect/debezium-connector-mysql/mysql-binlog-connector-java-0.13.0.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:218)
[2018-09-25 16:50:29,348] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/kafka_2.11-2.0.0/connect/debezium-connector-mysql/mysql-binlog-connector-java-0.13.0.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2018-09-25 16:50:29,348] INFO Loading plugin from: /opt/kafka_2.11-2.0.0/connect/debezium-connector-mysql/mysql-connector-java-5.1.40.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:218)
[2018-09-25 16:50:29,386] INFO Registered loader: PluginClassLoader{pluginLocation=file:/opt/kafka_2.11-2.0.0/connect/debezium-connector-mysql/mysql-connector-java-5.1.40.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2018-09-25 16:50:29,401] INFO Loading plugin from: /opt/kafka_2.11-2.0.0/connect/debezium-connector-mysql/debezium-connector-mysql-0.8.3.Final.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:218)
[2018-09-25 16:50:29,418] ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectStandalone:122)
java.lang.NoClassDefFoundError: io/debezium/util/IoUtil
    at io.debezium.connector.mysql.Module.<clinit>(Module.java:19)
    at io.debezium.connector.mysql.MySqlConnector.version(MySqlConnector.java:46)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.versionFor(DelegatingClassLoader.java:346)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.versionFor(DelegatingClassLoader.java:351)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.getPluginDesc(DelegatingClassLoader.java:328)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanPluginPath(DelegatingClassLoader.java:309)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanUrlsAndAddPlugins(DelegatingClassLoader.java:240)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.registerPlugin(DelegatingClassLoader.java:232)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initPluginLoader(DelegatingClassLoader.java:201)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initLoaders(DelegatingClassLoader.java:178)
    at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:61)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:77)
Caused by: java.lang.ClassNotFoundException: io.debezium.util.IoUtil
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:104)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 12 more

Hope someone can shed some light on this for me, as the debezium-connector-mysql-0.8.3.Final.jar is inside the folder.

Thanks!

Gunnar Morling

Sep 25, 2018, 5:31:16 AM
to debezium
You also need the debezium-core JAR; in fact, you need all the JARs you find in the distribution archive of your Debezium connector.

--Gunnar

ronnie10

Sep 25, 2018, 5:47:12 AM
to debezium
Hi Gunnar,

I do have all the JAR files.
antlr4-runtime-4.7.jar 
debezium-connector-mysql-0.8.3.Final.jar
debezium-ddl-parser-0.8.3.Final.jar
mysql-binlog-connector-java-0.13.0.jar
debezium-core-0.8.3.Final.jar
mysql-connector-java-5.1.40.jar

Those files come in the tar file which I downloaded from the Debezium website.

-Ronnie10

Gunnar Morling

Sep 25, 2018, 5:54:14 AM
to debezium
I think this is the problem:

    plugin.path=/opt/kafka_2.11-2.0.0/connect/debezium-connector-mysql

Please try this instead:

    plugin.path=/opt/kafka_2.11-2.0.0/connect

I.e. plugin.path should point to the parent directory of individual Connect plug-ins (directories or JARs). In your case it points to a specific plug-in's directory, so Connect tries to load plug-ins from each of the JARs contained within that directory instead of treating the entire directory as a single plug-in, as it should.
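
To make the distinction concrete, this is the layout Connect expects (paths taken from this thread; the tree is a sketch of the usual one-subdirectory-per-connector convention):

```
plugin.path=/opt/kafka_2.11-2.0.0/connect

/opt/kafka_2.11-2.0.0/connect/                  <- plugin.path points here
`-- debezium-connector-mysql/                   <- one plug-in = one subdirectory
    |-- debezium-connector-mysql-0.8.3.Final.jar
    |-- debezium-core-0.8.3.Final.jar
    `-- ...                                     <- the remaining JARs from the archive
```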

Jiri Pechanec

Sep 25, 2018, 5:54:37 AM
to debezium
Hi,

could you please post the whole log file?

J.

ronnie10

Sep 25, 2018, 6:01:55 AM
to debezium
Oh, thanks Gunnar! That solved the issue, but another issue arose. Do I need to set 'query_cache_size'?

[2018-09-25 17:57:09,719] INFO jetty-9.4.11.v20180605; built: 2018-06-05T18:24:03.829Z; git: d5fc0523cfa96bfebfbda19606cad384d772f04c; jvm 1.8.0_181-b13 (org.eclipse.jetty.server.Server:374)
[2018-09-25 17:57:09,737] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:365)
[2018-09-25 17:57:09,737] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:370)
[2018-09-25 17:57:09,738] INFO node0 Scavenging every 660000ms (org.eclipse.jetty.server.session:149)
Sep 25, 2018 5:57:09 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Sep 25, 2018 5:57:09 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Sep 25, 2018 5:57:09 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Sep 25, 2018 5:57:10 PM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.

[2018-09-25 17:57:10,010] INFO Started o.e.j.s.ServletContextHandler@4a6c18ad{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:851)
[2018-09-25 17:57:10,014] INFO Started http_8083@35ff8fc9{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:289)
[2018-09-25 17:57:10,014] INFO Started @1560ms (org.eclipse.jetty.server.Server:411)
[2018-09-25 17:57:10,014] INFO Advertised URI: http://127.0.0.1:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:267)
[2018-09-25 17:57:10,015] INFO REST server listening at http://127.0.0.1:8083/, advertising URL http://127.0.0.1:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:217)
[2018-09-25 17:57:10,015] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:55)
[2018-09-25 17:57:10,054] WARN The connection password is empty (io.debezium.connector.mysql.MySqlConnector:88)
[2018-09-25 17:57:10,217] INFO Failed testing connection for jdbc:mysql://192.168.1.1:3306/?useInformationSchema=true&nullCatalogMeansCurrent=false&useSSL=false&useUnicode=true&characterEncoding=UTF-8&characterSetResults=UTF-8&zeroDateTimeBehavior=convertToNull with user 'username' (io.debezium.connector.mysql.MySqlConnector:103)
[2018-09-25 17:57:10,219] ERROR Failed to create job for config/connect-mysql.properties (org.apache.kafka.connect.cli.ConnectStandalone:102)
[2018-09-25 17:57:10,220] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:113)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 1 error(s):
Unable to connect: Unknown system variable 'query_cache_size'
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
    at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
    at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 1 error(s):
Unable to connect: Unknown system variable 'query_cache_size'
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
    at org.apache.kafka.connect.runtime.AbstractHerder.maybeAddConfigErrors(AbstractHerder.java:415)
    at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:189)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:107)
[2018-09-25 17:57:10,221] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
[2018-09-25 17:57:10,221] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:223)
[2018-09-25 17:57:10,223] INFO Stopped http_8083@35ff8fc9{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:332)
[2018-09-25 17:57:10,223] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:167)
[2018-09-25 17:57:10,227] INFO Stopped o.e.j.s.ServletContextHandler@4a6c18ad{/,null,UNAVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:1020)
[2018-09-25 17:57:10,228] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:241)
[2018-09-25 17:57:10,228] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:95)
[2018-09-25 17:57:10,228] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:184)
[2018-09-25 17:57:10,228] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:67)
[2018-09-25 17:57:10,228] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:205)
[2018-09-25 17:57:10,229] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:112)
[2018-09-25 17:57:10,229] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:70)



Ronnie10

ronnie10

Sep 25, 2018, 6:02:28 AM
to debezium
Hi Jiri, I managed to get the error resolved, but another issue arose.

Ronnie10

Jiri Pechanec

Sep 25, 2018, 6:18:51 AM
to debezium
Could you please upgrade the MySQL JDBC driver to the latest version?

J.

ronnie10

Sep 25, 2018, 11:32:35 PM
to debezium
Hi Jiri,

I managed to solve all the permission issues; now another error has arisen, caused by CharsetMapping:

[2018-09-26 11:27:36,808] INFO Step 7: committing transaction (io.debezium.connector.mysql.SnapshotReader:639)
[2018-09-26 11:27:36,809] INFO Step 8: releasing global read lock to enable MySQL writes (io.debezium.connector.mysql.SnapshotReader:654)
[2018-09-26 11:27:36,811] INFO Writes to MySQL tables prevented for a total of 00:00:00.753 (io.debezium.connector.mysql.SnapshotReader:664)
[2018-09-26 11:27:36,811] ERROR Failed due to error: Aborting snapshot due to error when last running 'UNLOCK TABLES': com/mysql/jdbc/CharsetMapping (io.debezium.connector.mysql.SnapshotReader:179)
org.apache.kafka.connect.errors.ConnectException: com/mysql/jdbc/CharsetMapping

    at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:200)
    at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:178)
    at io.debezium.connector.mysql.SnapshotReader.execute(SnapshotReader.java:709)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: com/mysql/jdbc/CharsetMapping
    at io.debezium.connector.mysql.MySqlValueConverters.charsetFor(MySqlValueConverters.java:304)
    at io.debezium.connector.mysql.MySqlValueConverters.converter(MySqlValueConverters.java:272)
    at io.debezium.relational.TableSchemaBuilder.createValueConverterFor(TableSchemaBuilder.java:331)
    at io.debezium.relational.TableSchemaBuilder.convertersForColumns(TableSchemaBuilder.java:254)
    at io.debezium.relational.TableSchemaBuilder.createValueGenerator(TableSchemaBuilder.java:184)
    at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:123)
    at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:112)
    at io.debezium.connector.mysql.MySqlSchema.lambda$applyDdl$3(MySqlSchema.java:361)
    at java.lang.Iterable.forEach(Iterable.java:75)
    at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:355)
    at io.debezium.connector.mysql.SnapshotReader.lambda$execute$12(SnapshotReader.java:441)
    at io.debezium.jdbc.JdbcConnection.query(JdbcConnection.java:412)
    at io.debezium.jdbc.JdbcConnection.query(JdbcConnection.java:353)
    at io.debezium.connector.mysql.SnapshotReader.execute(SnapshotReader.java:439)
    ... 3 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.CharsetMapping

    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:104)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 17 more
[2018-09-26 11:27:36,930] INFO WorkerSourceTask{id=mysql-cdc-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:397)
[2018-09-26 11:27:36,930] INFO WorkerSourceTask{id=mysql-cdc-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:414)
[2018-09-26 11:27:36,941] INFO WorkerSourceTask{id=mysql-cdc-0} Finished commitOffsets successfully in 11 ms (org.apache.kafka.connect.runtime.WorkerSourceTask:496)
[2018-09-26 11:27:36,941] ERROR WorkerSourceTask{id=mysql-cdc-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
org.apache.kafka.connect.errors.ConnectException: com/mysql/jdbc/CharsetMapping
    at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:200)
    at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:178)
    at io.debezium.connector.mysql.SnapshotReader.execute(SnapshotReader.java:709)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: com/mysql/jdbc/CharsetMapping
    at io.debezium.connector.mysql.MySqlValueConverters.charsetFor(MySqlValueConverters.java:304)
    at io.debezium.connector.mysql.MySqlValueConverters.converter(MySqlValueConverters.java:272)
    at io.debezium.relational.TableSchemaBuilder.createValueConverterFor(TableSchemaBuilder.java:331)
    at io.debezium.relational.TableSchemaBuilder.convertersForColumns(TableSchemaBuilder.java:254)
    at io.debezium.relational.TableSchemaBuilder.createValueGenerator(TableSchemaBuilder.java:184)
    at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:123)
    at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:112)
    at io.debezium.connector.mysql.MySqlSchema.lambda$applyDdl$3(MySqlSchema.java:361)
    at java.lang.Iterable.forEach(Iterable.java:75)
    at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:355)
    at io.debezium.connector.mysql.SnapshotReader.lambda$execute$12(SnapshotReader.java:441)
    at io.debezium.jdbc.JdbcConnection.query(JdbcConnection.java:412)
    at io.debezium.jdbc.JdbcConnection.query(JdbcConnection.java:353)
    at io.debezium.connector.mysql.SnapshotReader.execute(SnapshotReader.java:439)
    ... 3 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.CharsetMapping

    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:104)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 17 more
[2018-09-26 11:27:36,943] ERROR WorkerSourceTask{id=mysql-cdc-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)

Ronnie10

ronnie10

Sep 26, 2018, 3:19:45 AM
to debezium
Hi Jiri,

Everything seems to start properly except for this small issue now. Where can I set the SCHEMA_ONLY_RECOVERY?

[2018-09-26 15:09:41,128] INFO WorkerSourceTask{id=mysql-cdc-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:414)
[2018-09-26 15:09:41,128] ERROR WorkerSourceTask{id=mysql-cdc-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
org.apache.kafka.connect.errors.ConnectException: The db history topic is missing. You may attempt to recover it by reconfiguring the connector to SCHEMA_ONLY_RECOVERY
    at io.debezium.connector.mysql.MySqlConnectorTask.start(MySqlConnectorTask.java:90)
    at io.debezium.connector.common.BaseSourceTask.start(BaseSourceTask.java:45)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:198)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)

    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
[2018-09-26 15:09:41,129] ERROR WorkerSourceTask{id=mysql-cdc-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)



Jiri Pechanec

Sep 26, 2018, 3:22:29 AM
to debezium
Hi,

It should be added to connect-standalone.properties; see the snapshot.mode config property in https://debezium.io/docs/connectors/mysql/

J.

ronnie10

Sep 26, 2018, 3:50:39 AM
to debezium
Hi Jiri,

How do I put in the parameter?

include.schema.events=true and
schema_only_recovery = true or
schema.only.recovery = true

I tried the above; all gave me the same result as before.

Ronnie10

Jiri Pechanec

Sep 26, 2018, 3:58:40 AM
to debezium
Hi,

try

snapshot.mode=schema_only_recovery

J.

ronnie10

Sep 26, 2018, 4:07:41 AM
to debezium
Thanks Jiri,

Now I am back with this error. I checked in the JAR file, and it seems that the CharsetMapping class file is inside the JAR, though.

[2018-09-26 16:04:36,253] ERROR WorkerSourceTask{id=mysql-cdc-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
org.apache.kafka.connect.errors.ConnectException: java.lang.NoClassDefFoundError: com/mysql/jdbc/CharsetMapping
    at io.debezium.connector.mysql.MySqlConnectorTask.start(MySqlConnectorTask.java:219)

    at io.debezium.connector.common.BaseSourceTask.start(BaseSourceTask.java:45)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:198)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: com/mysql/jdbc/CharsetMapping
    at io.debezium.connector.mysql.MySqlValueConverters.charsetFor(MySqlValueConverters.java:304)
    at io.debezium.connector.mysql.MySqlValueConverters.converter(MySqlValueConverters.java:272)
    at io.debezium.relational.TableSchemaBuilder.createValueConverterFor(TableSchemaBuilder.java:331)
    at io.debezium.relational.TableSchemaBuilder.convertersForColumns(TableSchemaBuilder.java:254)
    at io.debezium.relational.TableSchemaBuilder.createValueGenerator(TableSchemaBuilder.java:184)
    at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:123)
    at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:112)
    at io.debezium.connector.mysql.MySqlSchema.lambda$refreshSchemas$1(MySqlSchema.java:271)
    at java.util.concurrent.ConcurrentHashMap$KeySetView.forEach(ConcurrentHashMap.java:4649)
    at java.util.Collections$UnmodifiableCollection.forEach(Collections.java:1080)
    at io.debezium.connector.mysql.MySqlSchema.refreshSchemas(MySqlSchema.java:269)
    at io.debezium.connector.mysql.MySqlSchema.loadHistory(MySqlSchema.java:246)
    at io.debezium.connector.mysql.MySqlTaskContext.loadHistory(MySqlTaskContext.java:163)
    at io.debezium.connector.mysql.MySqlConnectorTask.start(MySqlConnectorTask.java:96)
    ... 9 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.CharsetMapping

    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:104)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 23 more
[2018-09-26 16:04:36,254] ERROR WorkerSourceTask{id=mysql-cdc-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)

Ronnie10

Jiri Pechanec

Sep 26, 2018, 4:19:02 AM
to debezium
Hmm, I understand it now. I am sorry for not spotting it earlier. MySQL 8 is supported only in the current master/0.9 stream. If you can't use MySQL 5.7, you need to use a nightly build if you are still interested in testing it.

J.

ronnie10

Sep 26, 2018, 4:26:28 AM
to debezium
Hi Jiri,

Thanks for informing me; I am going to try out the latest nightly build.

Somehow something came through with my current setting. I received something like this in the Kafka consumer instead of the row entry of the table; is there anything that I missed?

{
  "source" : {
    "server" : "mysql-dev-01"
  },
  "position" : {
    "file" : "mysql-bin.000227",
    "pos" : 89312693,
    "gtids" : "ba4acaf1-dbeb-11e7-b241-000c29abbdba:1-29005839",
    "snapshot" : true
  },
  "databaseName" : "da_testing_db",
  "ddl" : "USE `da_testing_db`"
}

My latest configuration:
name=mysql-cdc
connector.class=io.debezium.connector.mysql.MySqlConnector
database.hostname=192.168.1.1
database.port=3306
database.user=user

database.password=password
database.server.id=1
database.server.name=mysql-dev-01
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=testcdc
database.zeroDateTimeBehavior=convert_To_Null
database.whitelist=da_testing_db
table.whitelist=da.testing_db.testcdc

Ronnie10

Jiri Pechanec

Sep 26, 2018, 4:38:37 AM
to debezium
You are reading the wrong topic; this one is used by Kafka Connect to store offsets. When you get the streaming running, you will have topic names in the format <servername>.<databasename>.<tablename>.
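
As an illustration, here is the change-event topic name assembled from the connector settings used earlier in this thread (database.server.name=mysql-dev-01, database da_testing_db, table testcdc); a minimal sketch:

```python
# Debezium MySQL publishes row changes to <server.name>.<database>.<table>.
server_name = "mysql-dev-01"   # database.server.name from the connector config
database = "da_testing_db"
table = "testcdc"

topic = f"{server_name}.{database}.{table}"
print(topic)  # mysql-dev-01.da_testing_db.testcdc
```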

J.

ronnie10

Sep 26, 2018, 5:36:02 AM
to debezium
I have only one topic created on my Kafka, which is testcdc. I am not getting it streamed over; I need to opt in with --from-beginning, and only then can I see it in my consumer:

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic schema-changes.da_testing_db --from-beginning

Do I need to create other topics to keep the streamed data?


I have configured 0.9 on my test setup; now it starts without issue.

I have updated my MySQL properties file as follows, but it is still the same.

name=mysql-cdc
connector.class=io.debezium.connector.mysql.MySqlConnector

database.hostname=192.168.1.1
database.port=3306
database.user=username
database.password=password
database.server.id=1
database.server.name=mysql-dev-01
database.whitelist=da_testing_db
table.whitelist=da.testing_db.testcdc
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=testcdc
database.zeroDateTimeBehavior=convert_To_Null
database.history.store.only.monitored.tables.ddl=true
include.schema.changes=true

Jiri Pechanec

Sep 26, 2018, 6:23:55 AM
to debezium
Could you please try `kafka-topics.sh --list`?
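
(On Kafka 2.0 the tool also needs a connection flag; the ZooKeeper form, matching the broker setup in this thread, would be:)

```
bin/kafka-topics.sh --list --zookeeper localhost:2181
```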

Also, I recommend starting without whitelist/blacklist settings at the beginning, as they introduce unnecessary moving parts in the beginner phase.

J.

ronnie10

Sep 26, 2018, 9:33:05 PM
to debezium
I have changed my connect-mysql.properties to the following:

name=mysql-cdc
tasks.max=1

connector.class=io.debezium.connector.mysql.MySqlConnector
database.hostname=192.168.1.1
database.port=3306
database.user=user
database.password=password
database.server.id=1
database.server.name=mysql-dev-01
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=mysql-dev.da_testing_db.testcdc

When I list the topics, I see the following:
bin/kafka-topics.sh --list --zookeeper localhost:2181
mysql-dev-01
mysql-dev-01.da_testing_db.testcdc
__consumer_offsets
schema-changes.da_testing_db
testcdc

When I run bin/connect-standalone.sh config/connect-standalone.properties config/connect-mysql.properties, I see millions of lines like the following:

[2018-09-27 09:22:58,078] WARN Skipping invalid database history record '{
  "schema" : {
    "type" : "struct",
    "fields" : [ {
      "type" : "struct",
      "fields" : [ {
        "type" : "int64",
        "optional" : false,
        "field" : "id"
      }, {
        "type" : "int64",
        "optional" : true,
        "field" : "parent_id"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "operator_id"
      }, {
        "type" : "int32",
        "optional" : true,
        "field" : "player_id"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "game_id"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "platform"
      }, {
        "type" : "string",
        "optional" : true,
        "field" : "currency"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "2",
          "connect.decimal.precision" : "18"
        },
        "field" : "amount"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "2",
          "connect.decimal.precision" : "18"
        },
        "field" : "pay_amount"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "6",
          "connect.decimal.precision" : "13"
        },
        "field" : "sales_contribution_amount"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "2",
          "connect.decimal.precision" : "18"
        },
        "field" : "sales_rtp_contribution_amount"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "2",
          "connect.decimal.precision" : "18"
        },
        "field" : "promotion_amount"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "bet_status"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "bet_type"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "transaction_type"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "game_stage"
      }, {
        "type" : "int64",
        "optional" : true,
        "name" : "io.debezium.time.MicroTimestamp",
        "version" : 1,
        "field" : "create_time"
      }, {
        "type" : "int64",
        "optional" : true,
        "name" : "io.debezium.time.MicroTimestamp",
        "version" : 1,
        "field" : "update_time"
      }, {
        "type" : "boolean",
        "optional" : true,
        "field" : "is_deduct"
      } ],
      "optional" : true,
      "name" : "mysql_dev_01.da_testing_db.testcdc.Value",
      "field" : "before"
    }, {
      "type" : "struct",
      "fields" : [ {
        "type" : "int64",
        "optional" : false,
        "field" : "id"
      }, {
        "type" : "int64",
        "optional" : true,
        "field" : "parent_sales_id"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "operator_id"
      }, {
        "type" : "int32",
        "optional" : true,
        "field" : "customer_id"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "product_id"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "platform"
      }, {
        "type" : "string",
        "optional" : true,
        "field" : "currency"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "2",
          "connect.decimal.precision" : "18"
        },
        "field" : "sales_amount"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "2",
          "connect.decimal.precision" : "18"
        },
        "field" : "profit_amount"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "6",
          "connect.decimal.precision" : "13"
        },
        "field" : "sales_contribution_amount"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "2",
          "connect.decimal.precision" : "18"
        },
        "field" : "sales_rtp_contribution_amount"
      }, {
        "type" : "bytes",
        "optional" : true,
        "name" : "org.apache.kafka.connect.data.Decimal",
        "version" : 1,
        "parameters" : {
          "scale" : "2",
          "connect.decimal.precision" : "18"
        },
        "field" : "sales_profit_amount"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "sales_status"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "product_type"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "transaction_type"
      }, {
        "type" : "int16",
        "optional" : true,
        "field" : "sales_stage"
      }, {
        "type" : "int64",
        "optional" : true,
        "name" : "io.debezium.time.MicroTimestamp",
        "version" : 1,
        "field" : "create_time"
      }, {
        "type" : "int64",
        "optional" : true,
        "name" : "io.debezium.time.MicroTimestamp",
        "version" : 1,
        "field" : "update_time"
      }, {
        "type" : "boolean",
        "optional" : true,
        "field" : "is_deduct"
      } ],
      "optional" : true,
      "name" : "mysql_dev_01.da_testing_db.testcdc.Value",
      "field" : "after"
    }, {
      "type" : "struct",
      "fields" : [ {
        "type" : "string",
        "optional" : true,
        "field" : "version"
      }, {
        "type" : "string",
        "optional" : false,
        "field" : "name"
      }, {
        "type" : "int64",
        "optional" : false,
        "field" : "server_id"
      }, {
        "type" : "int64",
        "optional" : false,
        "field" : "ts_sec"
      }, {
        "type" : "string",
        "optional" : true,
        "field" : "gtid"
      }, {
        "type" : "string",
        "optional" : false,
        "field" : "file"
      }, {
        "type" : "int64",
        "optional" : false,
        "field" : "pos"
      }, {
        "type" : "int32",
        "optional" : false,
        "field" : "row"
      }, {
        "type" : "boolean",
        "optional" : true,
        "default" : false,
        "field" : "snapshot"
      }, {
        "type" : "int64",
        "optional" : true,
        "field" : "thread"
      }, {
        "type" : "string",
        "optional" : true,
        "field" : "db"
      }, {
        "type" : "string",
        "optional" : true,
        "field" : "table"
      }, {
        "type" : "string",
        "optional" : true,
        "field" : "query"
      } ],
      "optional" : false,
      "name" : "io.debezium.connector.mysql.Source",
      "field" : "source"
    }, {
      "type" : "string",
      "optional" : false,
      "field" : "op"
    }, {
      "type" : "int64",
      "optional" : true,
      "field" : "ts_ms"
    } ],
    "optional" : false,
    "name" : "mysql_dev_01.da_testing_db.testcdc.Envelope"
  },
  "payload" : {
    "before" : null,
    "after" : {
      "id" : 0,
      "parent_id" : 2,
      "operator_id" : 1,
      "player_id" : 40010,
      "game_id" : 17,
      "platform" : 1,
      "currency" : "EUR",
      "sales_amount" : "ANbY",
      "profit_amount" : "dTA=",
      "sales_contribution_amount" : "AA==",
      "sales_rtp_contribution_amount" : "AA==",
      "sales_profit_amount" : "AA==",
      "sales_status" : 3,
      "product_type" : 1,
      "transaction_type" : 1,
      "sales_stage" : 0,
      "create_time" : null,
      "update_time" : null,
      "is_deduct" : true
    },
    "source" : {
      "version" : "0.9.0-SNAPSHOT",
      "name" : "MYSQL-DEV-01",
      "server_id" : 0,
      "ts_sec" : 0,
      "gtid" : null,
      "file" : "mysql-bin.000228",
      "pos" : 669870889,
      "row" : 0,
      "snapshot" : true,
      "thread" : null,
      "db" : "da_testing_db",
      "table" : "testcdc",
      "query" : null
    },
    "op" : "c",
    "ts_ms" : 1537954045005
  }
}'. This is often not an issue, but if it happens repeatedly please check the 'mysql-dev-01.da_testing_db.testcdc' topic. (io.debezium.relational.history.KafkaDatabaseHistory:234)
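A note on the Decimal fields in the payload above: with `JsonConverter`, `org.apache.kafka.connect.data.Decimal` values are serialized as the base64-encoded big-endian two's-complement bytes of the unscaled integer, with the scale carried in the schema's `"scale"` parameter. A minimal decoding sketch (the helper name is mine; the input values are taken from the "after" payload above):

```python
import base64
from decimal import Decimal

def decode_connect_decimal(b64_value: str, scale: int) -> Decimal:
    """Decode a Kafka Connect Decimal: the base64 payload holds a
    big-endian two's-complement unscaled integer; the scale comes
    from the schema's "scale" parameter."""
    raw = base64.b64decode(b64_value)
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

# Values taken from the "after" payload above (both fields have scale 2):
print(decode_connect_decimal("ANbY", 2))  # sales_amount  -> 550.00
print(decode_connect_decimal("dTA=", 2))  # profit_amount -> 300.00
```

Similarly, the `io.debezium.time.MicroTimestamp` fields are microseconds since the epoch (both happen to be null in this snapshot row).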


I thought I should see this information on my consumer, but I do not, and when I push data into my MySQL server, I do not see any real-time streaming coming through either.


Ronnie10

ronnie10

Sep 27, 2018, 2:35:27 AM
to debezium
Hi Jiri,

After changing connect-mysql.properties to the following and restarting my Kafka connector:

name=mysql-cdc
tasks.max=1
connector.class=io.debezium.connector.mysql.MySqlConnector
database.hostname=10.13.1.18

database.port=3306
database.user=user
database.password=password
database.server.id=1
database.server.name=mysql-dev-01
database.whitelist=da_testing_db
#table.whitelist=da_testing_db.testcdc
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=mysql-dev-01.da_testing_db.testcdc
snapshot.mode=SCHEMA_ONLY_RECOVERY
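
As an aside, if a setup like this is later moved from standalone to distributed mode, the same properties translate into the JSON body that Kafka Connect's REST API expects for `POST /connectors`. A rough sketch of that conversion (the function name and the abbreviated properties text are illustrative, not part of the original post):

```python
import json

def properties_to_connector_json(properties_text: str) -> dict:
    """Turn a connect-*.properties connector file into the
    {"name": ..., "config": {...}} body that the Kafka Connect
    REST API expects when creating a connector."""
    config = {}
    for line in properties_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and commented-out entries
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return {"name": config.pop("name"), "config": config}

# Abbreviated example input:
props = """\
name=mysql-cdc
connector.class=io.debezium.connector.mysql.MySqlConnector
tasks.max=1
"""
print(json.dumps(properties_to_connector_json(props), indent=2))
```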


I saw the following error. Is there anything else that I missed?
[2018-09-27 14:30:18,783] INFO WorkerSourceTask{id=mysql-cdc-0} Finished commitOffsets successfully in 7 ms (org.apache.kafka.connect.runtime.WorkerSourceTask:496)
[2018-09-27 14:30:18,785] ERROR WorkerSourceTask{id=mysql-cdc-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
org.apache.kafka.connect.errors.ConnectException: The replication sender thread cannot start in AUTO_POSITION mode: this server has GTID_MODE = OFF instead of ON. Error code: 1236; SQLSTATE: HY000.
    at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:200)
    at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:167)
    at io.debezium.connector.mysql.BinlogReader$ReaderThreadLifecycleListener.onCommunicationFailure(BinlogReader.java:957)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:921)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:559)
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:793)
    at java.lang.Thread.run(Thread.java:748)
Caused by: com.github.shyiko.mysql.binlog.network.ServerException: The replication sender thread cannot start in AUTO_POSITION mode: this server has GTID_MODE = OFF instead of ON.
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:882)
    ... 3 more
[2018-09-27 14:30:18,789] ERROR WorkerSourceTask{id=mysql-cdc-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)


Ronnie10