Unable to get Kafka Connect working on SSL


Binoy Thomas

Jun 23, 2017, 3:11:34 PM6/23/17
to Confluent Platform
Below are the changes made so far to support SSL (on Kafka, Schema Registry, and Connect):


############################# Kafka server.properties changes for SSL support #############################

listeners=SSL://localhost:9093
advertised.listeners=SSL://localhost:9093

############################# Schema Registry schema-registry.properties changes for SSL support #############################

kafkastore.security.protocol=SSL

ssl.keystore.location=/Users/binoy.thomas/keystores/kafka.client.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
ssl.truststore.location=/Users/binoy.thomas/truststores/kafka.client.truststore.jks
ssl.truststore.password=password
ssl.client.auth=true

kafkastore.ssl.key.password=password
kafkastore.ssl.keystore.location=/Users/binoy.thomas/keystores/kafka.client.keystore.jks
kafkastore.ssl.keystore.password=password
kafkastore.ssl.truststore.location=/Users/binoy.thomas/truststores/kafka.client.truststore.jks
kafkastore.ssl.truststore.password=password

############################# Sample test to query Schema Registry #############################
curl -X GET -k -E /Users/binoy.thomas/keystores/kafka.client.keystore.pem:password https://localhost:8081/subjects

["topic1-value","topic2-value","topic3-value","topic4-value","topic5-value"]

############################# Kafka Connect connect-avro-standalone.properties changes for SSL support #############################
bootstrap.servers=localhost:9093
key.converter.schema.registry.url=https://localhost:8081
value.converter.schema.registry.url=https://localhost:8081

# Worker authentication settings
security.protocol=SSL
ssl.keystore.location=/Users/binoy.thomas/keystores/kafka.client.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
ssl.truststore.location=/Users/binoy.thomas/truststores/kafka.client.truststore.jks
ssl.truststore.password=password

# Source authentication settings
producer.security.protocol=SSL
producer.ssl.keystore.location=/Users/binoy.thomas/keystores/kafka.client.keystore.jks
producer.ssl.keystore.password=password
producer.ssl.key.password=password
producer.ssl.truststore.location=/Users/binoy.thomas/truststores/kafka.client.truststore.jks
producer.ssl.truststore.password=password

# Sink authentication settings
consumer.security.protocol=SSL
consumer.ssl.keystore.location=/Users/binoy.thomas/keystores/kafka.client.keystore.jks
consumer.ssl.keystore.password=password
consumer.ssl.key.password=password
consumer.ssl.truststore.location=/Users/binoy.thomas/truststores/kafka.client.truststore.jks
consumer.ssl.truststore.password=password

############################# Running standalone connect #############################

./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/source-mysql-tmpthr-flowsheet-results.properties

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/binoy.thomas/confluent-3.2.1/share/java/kafka-serde-tools/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/binoy.thomas/confluent-3.2.1/share/java/kafka-connect-elasticsearch/slf4j-simple-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/binoy.thomas/confluent-3.2.1/share/java/kafka-connect-hdfs/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/binoy.thomas/confluent-3.2.1/share/java/kafka-connect-s3/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/binoy.thomas/confluent-3.2.1/share/java/kafka-connect-storage-common/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/binoy.thomas/confluent-3.2.1/share/java/kafka/slf4j-log4j12-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[2017-06-23 14:08:28,374] INFO StandaloneConfig values:
access.control.allow.methods =
access.control.allow.origin =
bootstrap.servers = [localhost:9093]
internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
key.converter = class io.confluent.connect.avro.AvroConverter
offset.storage.file.filename = /tmp/connect.offsets
rest.advertised.port = null
rest.port = 8083
value.converter = class io.confluent.connect.avro.AvroConverter
 (org.apache.kafka.connect.runtime.standalone.StandaloneConfig:180)
[2017-06-23 14:08:28,481] INFO Logging initialized @752ms (org.eclipse.jetty.util.log:186)
[2017-06-23 14:08:28,734] INFO AvroConverterConfig values:
schema.registry.url = [https://localhost:8081]
max.schemas.per.subject = 1000
 (io.confluent.connect.avro.AvroConverterConfig:169)
[2017-06-23 14:08:28,850] INFO AvroDataConfig values:
schemas.cache.config = 1000
enhanced.avro.schema.support = false
connect.meta.data = true
 (io.confluent.connect.avro.AvroDataConfig:169)
[2017-06-23 14:08:28,852] INFO AvroConverterConfig values:
schema.registry.url = [https://localhost:8081]
max.schemas.per.subject = 1000
 (io.confluent.connect.avro.AvroConverterConfig:169)
[2017-06-23 14:08:28,852] INFO AvroDataConfig values:
schemas.cache.config = 1000
enhanced.avro.schema.support = false
connect.meta.data = true
 (io.confluent.connect.avro.AvroDataConfig:169)
[2017-06-23 14:08:28,866] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:50)
[2017-06-23 14:08:28,866] INFO Herder starting (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:71)
[2017-06-23 14:08:28,866] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:119)
[2017-06-23 14:08:28,867] INFO Starting FileOffsetBackingStore with file /tmp/connect.offsets (org.apache.kafka.connect.storage.FileOffsetBackingStore:60)
[2017-06-23 14:08:28,868] INFO Worker started (org.apache.kafka.connect.runtime.Worker:124)
[2017-06-23 14:08:28,869] INFO Herder started (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:73)
[2017-06-23 14:08:28,869] INFO Starting REST server (org.apache.kafka.connect.runtime.rest.RestServer:98)
[2017-06-23 14:08:28,996] INFO jetty-9.2.15.v20160210 (org.eclipse.jetty.server.Server:327)
Jun 23, 2017 2:08:29 PM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.

[2017-06-23 14:08:29,591] INFO Started o.e.j.s.ServletContextHandler@4b770e40{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:744)
[2017-06-23 14:08:29,613] INFO Started ServerConnector@271f18d3{HTTP/1.1}{0.0.0.0:8083} (org.eclipse.jetty.server.ServerConnector:266)
[2017-06-23 14:08:29,614] INFO Started @1886ms (org.eclipse.jetty.server.Server:379)
[2017-06-23 14:08:29,614] INFO REST server listening at http://10.20.20.90:8083/, advertising URL http://10.20.20.90:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:150)
[2017-06-23 14:08:29,614] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:56)
Fri Jun 23 14:08:29 CDT 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
Fri Jun 23 14:08:29 CDT 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[2017-06-23 14:08:29,939] INFO ConnectorConfig values:
connector.class = io.confluent.connect.jdbc.JdbcSourceConnector
key.converter = null
name = ds_FlowsheetResult-pieces2-thr
tasks.max = 10
transforms = null
value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:180)
[2017-06-23 14:08:29,939] INFO Creating connector ds_FlowsheetResult-pieces2-thr of type io.confluent.connect.jdbc.JdbcSourceConnector (org.apache.kafka.connect.runtime.Worker:178)
[2017-06-23 14:08:29,941] INFO Instantiated connector ds_FlowsheetResult-pieces2-thr with version 3.2.1 of type class io.confluent.connect.jdbc.JdbcSourceConnector (org.apache.kafka.connect.runtime.Worker:181)
[2017-06-23 14:08:29,942] INFO JdbcSourceConnectorConfig values:
batch.max.rows = 100
connection.password = null
connection.url = jdbc:mysql://localhost:3306/tmpthr?user=connect&password=connect123
connection.user = null
incrementing.column.name = FlowsheetResultRowID
mode = timestamp+incrementing
numeric.precision.mapping = false
query = select * from STG_PIECES_flowsheetresult
schema.pattern = null
table.blacklist = []
table.types = [TABLE]
table.whitelist = []
timestamp.column.name = UpdateDate
topic.prefix = ds_FlowsheetResult-pieces2-thr
validate.non.null = true
 (io.confluent.connect.jdbc.source.JdbcSourceConnectorConfig:180)
Fri Jun 23 14:08:29 CDT 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[2017-06-23 14:08:29,947] INFO Finished creating connector ds_FlowsheetResult-pieces2-thr (org.apache.kafka.connect.runtime.Worker:194)
[2017-06-23 14:08:29,947] INFO SourceConnectorConfig values:
connector.class = io.confluent.connect.jdbc.JdbcSourceConnector
key.converter = null
name = ds_FlowsheetResult-pieces2-thr
tasks.max = 10
transforms = null
value.converter = null
 (org.apache.kafka.connect.runtime.SourceConnectorConfig:180)
[2017-06-23 14:08:29,949] INFO Creating task ds_FlowsheetResult-pieces2-thr-0 (org.apache.kafka.connect.runtime.Worker:305)
[2017-06-23 14:08:29,949] INFO ConnectorConfig values:
connector.class = io.confluent.connect.jdbc.JdbcSourceConnector
key.converter = null
name = ds_FlowsheetResult-pieces2-thr
tasks.max = 10
transforms = null
value.converter = null
 (org.apache.kafka.connect.runtime.ConnectorConfig:180)
[2017-06-23 14:08:29,950] INFO TaskConfig values:
task.class = class io.confluent.connect.jdbc.source.JdbcSourceTask
 (org.apache.kafka.connect.runtime.TaskConfig:180)
[2017-06-23 14:08:29,951] INFO Instantiated task ds_FlowsheetResult-pieces2-thr-0 with version 3.2.1 of type io.confluent.connect.jdbc.source.JdbcSourceTask (org.apache.kafka.connect.runtime.Worker:317)
[2017-06-23 14:08:29,965] INFO ProducerConfig values:
acks = all
batch.size = 16384
block.on.buffer.full = false
bootstrap.servers = [localhost:9093]
buffer.memory = 33554432
compression.type = none
interceptor.classes = null
key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
max.block.ms = 9223372036854775807
max.in.flight.requests.per.connection = 1
max.request.size = 1048576
metric.reporters = []
metrics.num.samples = 2
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
retries = 2147483647
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = SSL
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = /Users/binoy.thomas/keystores/kafka.client.keystore.jks
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = /Users/binoy.thomas/truststores/kafka.client.truststore.jks
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
timeout.ms = 30000
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
 (org.apache.kafka.clients.producer.ProducerConfig:180)
[2017-06-23 14:08:30,062] INFO Kafka version : 0.10.2.1-cp1 (org.apache.kafka.common.utils.AppInfoParser:83)
[2017-06-23 14:08:30,062] INFO Kafka commitId : 078e7dc02a100018 (org.apache.kafka.common.utils.AppInfoParser:84)
[2017-06-23 14:08:30,066] INFO Created connector ds_FlowsheetResult-pieces2-thr (org.apache.kafka.connect.cli.ConnectStandalone:90)
[2017-06-23 14:08:30,066] INFO JdbcSourceTaskConfig values:
batch.max.rows = 100
connection.password = null
connection.url = jdbc:mysql://localhost:3306/tmpthr?user=connect&password=connect123
connection.user = null
incrementing.column.name = FlowsheetResultRowID
mode = timestamp+incrementing
numeric.precision.mapping = false
query = select * from STG_PIECES_flowsheetresult
schema.pattern = null
table.blacklist = []
table.types = [TABLE]
table.whitelist = []
tables = []
timestamp.column.name = UpdateDate
topic.prefix = ds_FlowsheetResult-pieces2-thr
validate.non.null = true
 (io.confluent.connect.jdbc.source.JdbcSourceTaskConfig:180)
[2017-06-23 14:08:30,092] INFO Source task WorkerSourceTask{id=ds_FlowsheetResult-pieces2-thr-0} finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:142)
Fri Jun 23 14:08:30 CDT 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[2017-06-23 14:08:30,311] ERROR Failed to send HTTP request to endpoint: https://localhost:8081/subjects/ds_FlowsheetResult-pieces2-thr-value/versions (io.confluent.kafka.schemaregistry.client.rest.RestService:146)
javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1949)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:302)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:296)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1509)
at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
at sun.security.ssl.Handshaker.processLoop(Handshaker.java:979)
at sun.security.ssl.Handshaker.process_record(Handshaker.java:914)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1062)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1403)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1387)
at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1283)
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1258)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:250)
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:142)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:188)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:245)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:237)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:232)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:59)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:91)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)
at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:103)
at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:73)
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:197)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:167)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:139)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:182)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:387)
at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:292)
at sun.security.validator.Validator.validate(Validator.java:260)
at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1491)
... 31 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:382)
... 37 more
[2017-06-23 14:08:30,313] ERROR Task ds_FlowsheetResult-pieces2-thr-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:141)
org.apache.kafka.connect.errors.DataException: ds_FlowsheetResult-pieces2-thr
at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:75)
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:197)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:167)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:139)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:182)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1949)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:302)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:296)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1509)
at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
at sun.security.ssl.Handshaker.processLoop(Handshaker.java:979)
at sun.security.ssl.Handshaker.process_record(Handshaker.java:914)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1062)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1403)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1387)
at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1283)
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1258)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:250)
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:142)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:188)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:245)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:237)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:232)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:59)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:91)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)
at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:103)
at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:73)
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:197)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:167)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:139)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:182)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:387)
at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:292)
at sun.security.validator.Validator.validate(Validator.java:260)
at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1491)
... 31 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:382)
... 37 more
[2017-06-23 14:08:30,315] ERROR Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:142)
[2017-06-23 14:08:30,315] INFO Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:689)
[2017-06-23 14:08:39,486] INFO Reflections took 10557 ms to scan 561 urls, producing 13097 keys and 85890 values  (org.reflections.Reflections:229)
Binoy Thomas

Jun 23, 2017, 3:24:55 PM6/23/17
to Confluent Platform
An update to the original post with the full server.properties changes:


############################# Kafka server.properties changes for SSL support #############################

listeners=SSL://localhost:9093
advertised.listeners=SSL://localhost:9093

ssl.keystore.location=/Users/binoy.thomas/keystores/kafka.server.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
ssl.truststore.location=/Users/binoy.thomas/truststores/kafka.server.truststore.jks
ssl.truststore.password=password
ssl.client.auth=required
security.inter.broker.protocol=SSL
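With ssl.client.auth=required in place, the listener can be sanity-checked from the shell. This is a hedged sketch, not from the original post: it assumes the broker is running and that the client key/cert and CA cert have been exported to PEM files (names below are placeholders).

```shell
openssl s_client -connect localhost:9093 \
  -cert client-cert.pem -key client-key.pem -CAfile ca-cert.pem </dev/null
```

A successful mutual-TLS handshake ends with "Verify return code: 0 (ok)".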

Binoy Thomas

Jun 24, 2017, 7:16:13 PM6/24/17
to Confluent Platform
As mentioned in the above comments, there does not seem to be an issue with the Schema Registry to Kafka broker handshake.
There also does not seem to be an issue when using curl to query the Schema Registry with the .pem file generated from the client keystore.

The issue seems to occur when Connect, or for example the kafka-avro-console-producer (see below), tries to connect to the Schema Registry:

./bin/kafka-avro-console-producer --broker-list localhost:9093 --topic someTopic \
  --property schema.registry.url=https://localhost:8081 \
  --property value.schema='someSchema' \
  --producer.config=/Users/binoy.thomas/tmp/producer.properties

The content of producer.properties is as follows:

security.protocol=SSL
ssl.keystore.location=/Users/binoy.thomas/keystores/kafka.client.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
ssl.truststore.location=/Users/binoy.thomas/truststores/kafka.client.truststore.jks
ssl.truststore.password=password

[2017-06-24 17:57:45,354] ERROR Failed to send HTTP request to endpoint: https://localhost:8081/subjects/ds_EncounterObservationGroupWindow-thr-value/versions (io.confluent.kafka.schemaregistry.client.rest.RestService:146)
at io.confluent.kafka.formatter.AvroMessageReader.readMessage(AvroMessageReader.java:158)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:57)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:387)
at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:292)
at sun.security.validator.Validator.validate(Validator.java:260)
at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1491)
... 23 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:382)
... 29 more

Binoy Thomas

Jun 26, 2017, 5:45:42 PM6/26/17
to Confluent Platform
Some more updates: I cooked up a Spring Boot client and enabled SSL logging. It looks like the truststore being used is the JVM's default, not the one specified in the application properties.

Is this a bug or am I missing some additional properties?

# SSL
security.protocol=SSL
ssl.keystore.location=/Users/binoy.thomas/keystores/kafka.client.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
ssl.truststore.location=/Users/binoy.thomas/truststores/kafka.client.truststore.jks
ssl.truststore.password=password
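The SSL debug trace below comes from enabling JSSE debug logging on the client JVM. A hypothetical invocation (the jar name is a placeholder, not from the original post) would be:

```shell
java -Djavax.net.debug=ssl,keymanager,trustmanager -jar my-kafka-client.jar
```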

StreamThread-1, WRITE: TLSv1.2 Application Data, length = 4
StreamThread-1, WRITE: TLSv1.2 Application Data, length = 171
StreamThread-1, WRITE: TLSv1.2 Application Data, length = 4
StreamThread-1, WRITE: TLSv1.2 Application Data, length = 187
StreamThread-1, WRITE: TLSv1.2 Application Data, length = 4
StreamThread-1, WRITE: TLSv1.2 Application Data, length = 187
keyStore is : 
keyStore type is : jks
keyStore provider is : 
init keystore
init keymanager of type SunX509
trustStore is: /Library/Java/JavaVirtualMachines/jdk1.8.0_74.jdk/Contents/Home/jre/lib/security/cacerts
trustStore type is : jks
trustStore provider is : 
init truststore
adding as trusted cert:
  Subject: CN=Equifax Secure Global eBusiness CA-1, O=Equifax Secure Inc., C=US
  Issuer:  CN=Equifax Secure Global eBusiness CA-1, O=Equifax Secure Inc., C=US
  Algorithm: RSA; Serial number: 0xc3517
  Valid from Sun Jun 20 23:00:00 CDT 1999 until Sun Jun 21 23:00:00 CDT 2020


Binoy Thomas

Jun 26, 2017, 6:55:30 PM6/26/17
to Confluent Platform
OK, so I finally got all the components to talk to each other (Connect, Schema Registry, Kafka brokers, and Kafka clients, both producers and consumers) by doing the following:

1. Importing the root CA into the JVM truststore, since the custom truststore configs didn't seem to be honored.
2. Disabling client authentication with the Schema Registry (which is the default, by the way). I had enabled it initially so that client and server would authenticate each other.




Osvald Ivarsson

Nov 28, 2017, 11:01:33 AM11/28/17
to Confluent Platform
I had the same problem where the schema-registry client in kafka-avro-console-consumer didn't respect the ssl properties being passed in for the Kafka client. But I managed to get it running for the schema-registry client as well by specifying the paths and passwords to the keystore and the truststore directly as JVM options:

SCHEMA_REGISTRY_OPTS="-Djavax.net.ssl.keyStore=/path/to/client.keystore.jks \
  -Djavax.net.ssl.trustStore=/path/to/client.truststore.jks \
  -Djavax.net.ssl.keyStorePassword=pass1234 \
  -Djavax.net.ssl.trustStorePassword=pass1234" \
./bin/kafka-avro-console-consumer --from-beginning \
  --bootstrap-server SSL://10.1.0.1:9092,SSL://10.1.0.2:9092,SSL://10.1.0.3:9092 \
  --property schema.registry.url=https://10.1.0.1:8081,https://10.1.0.2:8081,https://10.1.0.3:8081 \
  --topic our_avro_encoded_topic \
  --consumer-property security.protocol=SSL \
  --consumer-property ssl.truststore.location=/path/to/client.truststore.jks \
  --consumer-property ssl.truststore.password=pass1234 \
  --consumer-property ssl.keystore.location=/path/to/client.keystore.jks \
  --consumer-property ssl.keystore.password=pass1234 \
  --consumer-property ssl.key.password=pass1234

I hope this helps someone else as well since I spent a lot of time trying to get the Avro clients to connect to a schema-registry only listening to TLS (while also doing client authentication).

rajesh....@gmail.com

Nov 27, 2018, 6:49:28 PM11/27/18
to Confluent Platform
Binoy,

You are my Kafka Connect God. I am new to Kafka and had been scratching my head over this issue for the last week. Today it finally worked.

Thanks a lot for this post Binoy.

regards
Raj 

samprati sharma

May 2, 2020, 11:52:14 AM5/2/20
to Confluent Platform
Hi Binoy,
I am doing this setup on Kubernetes, so I need to pass the CA certs inside the Schema Registry and Connect pods.
 