
HTTP/1.1 500 Internal Server Error
Date: Wed, 13 Feb 2019 06:56:33 GMT
Content-Type: application/json
Content-Length: 2345
Server: Jetty(9.4.12.v20180830)
{"error_code":500,"message":"Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are:
PluginDesc{klass=class io.debezium.connector.mysql.MySqlConnector, name='io.debezium.connector.mysql.MySqlConnector', version='0.8.3.Final', encodedVersion=0.8.3.Final, type=source, typeName='source', location='file:/Users/ekhan/project/ek/kafka-debezium-mysql-prototype/kafka-connect/connectors/debezium-connector-mysql/'},
PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=sink, typeName='sink', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0', encodedVersion=2.1.0, type=connector, typeName='connector', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=sink, typeName='sink', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'},
PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0', encodedVersion=2.1.0, type=source, typeName='source', location='classpath'}"}
Please note that I only have one "connect.distributed.properties" worker file. Do I need another one for the sink? I didn't think so.
Also, is using an Avro schema a must for the sink?
Thanks much, Jiri. Eagerly waiting for a response! :-)
-Ehsan
--
You received this message because you are subscribed to the Google Groups "debezium" group.
To unsubscribe from this group and stop receiving emails from it, send an email to debezium+u...@googlegroups.com.
To post to this group, send email to debe...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/debezium/478ae527-9d20-4e3e-ad54-b3875845bc02%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
# Examples:
# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
plugin.path=connectors
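For reference, Kafka Connect scans each plugin.path entry one directory level deep: every connector plugin gets its own subdirectory containing all of its JARs. Note that a relative path like `connectors` is resolved against the directory the worker is started from, so an absolute path is generally safer. A sketch of the layout this setting implies (directory and JAR names here are hypothetical):

```shell
# Hypothetical layout for plugin.path=connectors: one subdirectory per
# plugin, each holding every JAR that plugin needs (for the JDBC sink
# that includes the MySQL driver JAR).
mkdir -p connectors/debezium-connector-mysql
mkdir -p connectors/kafka-connect-jdbc
# cp kafka-connect-jdbc-*.jar mysql-connector-java-*.jar connectors/kafka-connect-jdbc/
# cp debezium-connector-mysql/*.jar connectors/debezium-connector-mysql/
ls connectors
```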
Where are you running Kafka Connect from (which distribution), and how are you running it (Docker etc.)?

Your sink connector is failing because:

    Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector

So you need to either install the JDBC Sink (https://www.confluent.io/connector/kafka-connect-jdbc/), or just run Kafka Connect as shipped with Confluent Platform, which includes the JDBC Sink by default. There are also Docker images etc. too.
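Once the JDBC sink is installed and the worker restarted, the quickest check is the worker's /connector-plugins REST endpoint. A sketch, filtering a saved response rather than a live one (the JSON below is made up but representative; the version number is an assumption):

```shell
# Confirm the worker actually loaded the JDBC sink plugin.
# Against a live worker this would be:  curl -s localhost:8083/connector-plugins
# Here we grep a saved, illustrative response instead.
cat > /tmp/connector-plugins.json <<'EOF'
[{"class":"io.confluent.connect.jdbc.JdbcSinkConnector","type":"sink","version":"5.1.0"},
 {"class":"io.debezium.connector.mysql.MySqlConnector","type":"source","version":"0.8.3.Final"}]
EOF
grep -q 'io.confluent.connect.jdbc.JdbcSinkConnector' /tmp/connector-plugins.json \
  && echo "JDBC sink is on the plugin path"
```

If the class is missing from the live response, the worker is not seeing the plugin directory and the 500 error above will recur.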
On Wed, 13 Feb 2019 at 08:22, Ehasnul Khan <ehsanul...@gmail.com> wrote:
Here is what my setup looks like (I removed the Postgres jar, as I am using MySQL for the sink).
<jdbc setup 1.png>
INFO Unable to connect to database on attempt 1/3. Will retry in 10000 ms. (io.confluent.connect.jdbc.util.CachedConnectionProvider:93)
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
[2019-02-17 10:56:25,015] INFO Successfully tested connection for jdbc:mysql://localhost:3306/?useInformationSchema=true&nullCatalogMeansCurrent=false&useSSL=false&useUnicode=true&characterEncoding=UTF-8&characterSetResults=UTF-8&zeroDateTimeBehavior=convertToNull with user 'erprep' (io.debezium.connector.mysql.MySqlConnector:101)
[2019-02-17 10:56:43,072] INFO JdbcSinkConfig values:
connection.url = jdbc:mysql://localhost:3036/tgt_classicmodels
I was able to get the Kafka JDBC sink connector running but unable to get it to talk to the MySQL database. I tried to explicitly set up the Kafka JDBC connector (as Jiri suggested) and ended up creating a separate folder under the "connectors" directory (plugin.path=connectors).

Here is the excerpt from the log file:

INFO Unable to connect to database on attempt 2/3. Will retry in 10000 ms. (io.confluent.connect.jdbc.util.CachedConnectionProvider:93)
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
...
[2019-02-17 10:57:03,280] ERROR WorkerSinkTask{id=mysql_target_sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. (org.apache.kafka.connect.runtime.WorkerSinkTask:585)
org.apache.kafka.connect.errors.ConnectException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
...
[2019-02-17 10:57:03,281] ERROR WorkerSinkTask{id=mysql_target_sink-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
...
[2019-02-17 10:57:03,282] ERROR WorkerSinkTask{id=mysql_target_sink-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)
[2019-02-17 10:57:03,282] INFO Stopping task (io.confluent.connect.jdbc.sink.JdbcSinkTask:104)

Here is the curl command to start the Kafka JDBC sink connector for MySQL:
curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/ -d '{
  "name": "mysql_target_sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "mysql1.classicmodels.payments",
    "connection.url": "jdbc:mysql://localhost:3036/tgt_classicmodels",
    "connection.user": "erpsink",
    "connection.password": "erpsink",
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.UnwrapFromEnvelope",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.mode": "record_value",
    "pk.fields": "id",
    "name": "mysql_target_sink"
  }
}'

Setup snapshot from IntelliJ and POM.xml:
Please note: I am running everything on my local machine (not using Docker or Confluent Platform, as you can see :-) ). Attached is the full ERROR_LOG file.

Questions:
1. Does the POM need to have an entry for the Confluent JDBC Sink?
2. Does my curl config look okay?

Also, my source and target database schemas are under the same MySQL instance (running on my local machine). I have assigned full DBA rights to the connection.user. Please let me know where I should be looking or what I have done wrong here. THANKS SO MUCH!!!

-Ehsan
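Quoting slips and repeated keys are easy to introduce in a long inline -d payload; saving the JSON to a file and validating it before POSTing catches most of them. A sketch (payload trimmed to a few keys, values taken from the thread's config):

```shell
# Validate the connector config as a standalone file before POSTing it.
# Payload is a trimmed sketch of the config from this thread.
cat > /tmp/sink.json <<'EOF'
{
  "name": "mysql_target_sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "mysql1.classicmodels.payments",
    "connection.url": "jdbc:mysql://localhost:3036/tgt_classicmodels"
  }
}
EOF
python3 -m json.tool /tmp/sink.json > /dev/null && echo "payload parses as JSON"
# then POST the file instead of an inline string:
# curl -i -X POST -H "Content-Type:application/json" \
#   localhost:8083/connectors/ -d @/tmp/sink.json
```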
[2019-02-19 21:14:11,720] ERROR WorkerSinkTask{id=mysql_tgt_productlines-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. (org.apache.kafka.connect.runtime.WorkerSinkTask:585)
org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: java.sql.BatchUpdateException: Data truncation: Data too long for column 'textDescription' at row 1
com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'textDescription' at row 1
CREATE TABLE `productlines` (
`productLine` varchar(50) NOT NULL,
`textDescription` varchar(4000) DEFAULT NULL,
`htmlDescription` mediumtext,
`image` mediumblob,
PRIMARY KEY (`productLine`)
);
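Since auto.create built the target table, the sink's default string-to-column mapping evidently produced a textDescription column narrower than the source's varchar(4000), hence the "Data too long" truncation error. One possible fix, sketched under the assumption that the target database is tgt_classicmodels, is to widen the target column to match the source definition (alternatively, pre-create the target table yourself and set auto.create to false):

```sql
-- Sketch, not verified against this setup: widen the auto-created
-- target column so it matches the source's varchar(4000) definition.
ALTER TABLE tgt_classicmodels.productlines
  MODIFY COLUMN `textDescription` VARCHAR(4000) DEFAULT NULL;
```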