SQL Server connector topic issue


Aqhil Mohammad

Oct 6, 2023, 7:39:37 AM
to debezium
Hi All,

I'm facing an issue while running the connector. The SQL Server connector itself is running fine, but the Kafka topics are not picking up any data. I'm getting the error below.

[WARN] 2023-10-06 05:08:43,671 [kafka-producer-network-thread | connector-producer-cdna-sql-party-type-source-connector-0] org.apache.kafka.clients.NetworkClient handleSuccessfulResponse - [Producer clientId=connector-producer-cdna-sql-party-type-source-connector-0] Error while fetching metadata with correlation id 523463 : {stc-con-grp-2-AAFNFTE1-dbo-party_type=UNKNOWN_TOPIC_OR_PARTITION}

I have created the raw topics and the schema change topic as well, but I'm still facing the issue.

Chris Cranford

Oct 6, 2023, 8:55:25 AM
to debe...@googlegroups.com
Hi,

This error can occur for a myriad of reasons.  Some of the most common are that the advertised listener configuration isn't set when it is needed, that you don't have automatic topic creation enabled, or that your broker requires ACL details that you aren't providing.  Finally, there could be invalid characters in your topic name, so could you tell us precisely what topic names you created?  I would take a look at the Kafka broker logs; if you don't see any warnings or errors describing the problem, start with the advertised listener configuration first.
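As a first diagnostic along those lines, one hedged sketch (the bootstrap address is a placeholder; the topic name is the one from this thread) is to ask the broker directly whether the topic exists, since UNKNOWN_TOPIC_OR_PARTITION usually means it doesn't, or the producer cannot reach the advertised listener that owns it:

```shell
# Hypothetical check; replace broker:9092 with your cluster's bootstrap address.
# If this fails, either the topic was never created (check
# auto.create.topics.enable in the broker's server.properties) or the
# client-side bootstrap/advertised listener configuration is wrong.
bin/kafka-topics.sh --bootstrap-server broker:9092 \
  --describe --topic stc-con-grp-2-AAFNFTE1-dbo-party_type
```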

Thanks,
Chris
--
You received this message because you are subscribed to the Google Groups "debezium" group.
To unsubscribe from this group and stop receiving emails from it, send an email to debezium+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/debezium/e9dd6323-9e3d-4b4f-ad79-ca9661728f79n%40googlegroups.com.

Aqhil Mohammad

Oct 9, 2023, 2:16:04 AM
to debezium
Thanks Chris,

That issue is gone and the connector started the initial snapshot after I changed the database name to upper case. Earlier it was in lower case, but in my code the DB name was in upper case. Is it case-sensitive?

Topic name: stc-con-grp-2-AAFNFTE1-dbo-party_type
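For reference, the data topic name is assembled from topic.prefix, the database, schema, and table names, joined by topic.delimiter, and Kafka topic names are case-sensitive, so a lower-case database segment produces a different topic than an upper-case one. A minimal sketch using the names from this thread (the composition order is the documented Debezium pattern; everything else here is illustrative):

```shell
# Sketch of <topic.prefix><delim><database><delim><schema><delim><table>.
# "aafnfte1" and "AAFNFTE1" would yield two distinct Kafka topics.
PREFIX="stc-con-grp-2"
DELIM="-"
DB="AAFNFTE1"      # must match the case the connector resolves for the database
SCHEMA="dbo"
TABLE="party_type"
TOPIC="${PREFIX}${DELIM}${DB}${DELIM}${SCHEMA}${DELIM}${TABLE}"
echo "$TOPIC"
```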
 
Now the issue is that, after the snapshot, we are not getting inserts/deletes/updates into the topic.

[INFO] 2023-10-06 15:25:20,665 [task-thread-cdna-sql-party-type-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator requestRejoin - [Consumer clientId=stc-connector-producer-cdna-sql-party-type-source-connector-0, groupId=stc.con.grp.2.schemahistory] Request joining group due to: consumer pro-actively leaving the group
[INFO] 2023-10-06 15:25:20,666 [task-thread-cdna-sql-party-type-source-connector-0] org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-06 15:25:20,666 [task-thread-cdna-sql-party-type-source-connector-0] org.apache.kafka.common.metrics.Metrics close - Closing reporter org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-06 15:25:20,666 [task-thread-cdna-sql-party-type-source-connector-0] org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-06 15:25:20,667 [debezium-sqlserverconnector-stc-con-grp-2-db-history-config-check] io.debezium.storage.kafka.history.KafkaSchemaHistory lambda$checkTopicSettings$0 - Attempted to validate database schema history topic but failed
java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TopicAuthorizationException: Topic authorization failed.
        at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
        at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2022)
        at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:180)
        at io.debezium.storage.kafka.history.KafkaSchemaHistory.lambda$checkTopicSettings$0(KafkaSchemaHistory.java:423)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.kafka.common.errors.TopicAuthorizationException: Topic authorization failed.
[INFO] 2023-10-06 15:25:20,669 [task-thread-cdna-sql-party-type-source-connector-0] org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info kafka.consumer for stc-connector-producer-cdna-sql-party-type-source-connector-0 unregistered

That's strange.

schemahistorytopic: "stc-sqlconnector-schemahistory-1"

Schema change topic / topic prefix: stc-con-grp-2

Raw topic: stc-con-grp-2-AAFNFTE1-dbo-party_type

Does the schema history topic need to have the same topic prefix?

Chris Cranford

Oct 9, 2023, 4:48:33 AM
to debe...@googlegroups.com
Hi,

So the topic name is not prefixed in this case; it's taken as-is from the configuration, so it's trying to access "stc-sqlconnector-schemahistory-1".  The issue is that the topic apparently exists; however, the user you've configured for the schema history is not authorized to access that topic (see "Topic authorization failed").  Please share your full connector configuration.
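If broker-side ACLs are in play, one hedged way to grant the connector's principal access to the schema history topic looks like this (the principal and bootstrap address are placeholders; with SSL authentication the principal is typically the certificate's DN):

```shell
# Hypothetical ACL grant; replace the principal with the identity of the
# keystore certificate the schema history producer/consumer presents.
bin/kafka-acls.sh --bootstrap-server broker:9092 \
  --add --allow-principal "User:CN=connect-cluster" \
  --operation Read --operation Write --operation Describe \
  --topic stc-sqlconnector-schemahistory-1
```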

Thanks,
Chris

Aqhil Mohammad

Oct 9, 2023, 10:48:31 AM
to debezium
Hello,

Here it is. Adding 'stc-' to the consumer group in acls.yaml resolved that issue, but the connector is not capturing CDC changes after the initial snapshot. I've been stuck for a week. I have added the logs and configuration below.

connector_cdna_source.yaml :

spec:
  class:  io.debezium.connector.sqlserver.SqlServerConnector
  taskMax: {{ $v.taskMax }}
  connectClusterRef:
    name: sqlserver-connect-cluster
  configs:
    database.hostname: {{ $.Values.database_hostname }}
    database.port: "{{ $.Values.database_port }}"
    database.user: {{ $.Values.database_user }}
    database.password: {{ $.Values.sql_server_password }}
    database.names: {{ $.Values.cdna_sqlserver_database }}
    topic.prefix: {{ $v.topicPrefix }}
    topic.delimiter: "-"
    database.encrypt: "false"
    table.include.list: {{ $v.tablenames }}
    schema.history.internal.kafka.bootstrap.servers: {{ $.Values.broker_url }}
    schema.history.internal.kafka.topic: {{ $v.schemahistorytopic }}
    schema.history.internal.producer.security.protocol: SSL
    schema.history.internal.producer.ssl.key.password: {{ $.Values.keyStorePassword }}
    schema.history.internal.producer.ssl.keystore.location: /mnt/sslcerts/keystore.p12
    schema.history.internal.producer.ssl.keystore.password: {{ $.Values.keyStorePassword }}
    schema.history.internal.producer.ssl.truststore.location: /mnt/sslcerts/truststore.p12
    schema.history.internal.producer.ssl.truststore.password: {{ $.Values.trustStorePassword }}
    schema.history.internal.consumer.security.protocol: SSL
    schema.history.internal.consumer.ssl.key.password: {{ $.Values.keyStorePassword }}
    schema.history.internal.consumer.ssl.keystore.location: /mnt/sslcerts/keystore.p12
    schema.history.internal.consumer.ssl.keystore.password: {{ $.Values.keyStorePassword }}
    schema.history.internal.consumer.ssl.truststore.location: /mnt/sslcerts/truststore.p12
    schema.history.internal.consumer.ssl.truststore.password: {{ $.Values.trustStorePassword }}
    key.converter: "org.apache.kafka.connect.storage.StringConverter"
    value.converter: "org.apache.kafka.connect.json.JsonConverter"
    value.converter.schemas.enable: "false"
    key.converter.schemas.enable: "false"
    errors.tolerance: "all"
    errors.log.enable: "true"
    errors.log.include.messages: "true"
    errors.deadletterqueue.topic.name: {{ $v.deadletterqueueTopic }}
    errors.deadletterqueue.topic.replication.factor: "1"
    errors.deadletterqueue.context.headers.enable: "true"

connectors.yaml :

cdna_sqlserver_source_connectors_config:
  cdna-sql-party-source:
    tablenames: "dbo.party"
    taskMax: 1
    topicPrefix: "stc-con-grp-1"
    deadletterqueueTopic: "stc.dlq.connectorpoc.cdnadevstc"
    schemahistorytopic: "stc-sqlconnector-schemahistory"

  cdna-sql-party-type-source:
    tablenames: "dbo.party_type"
    taskMax: 1
    topicPrefix: "stc-con-grp-2"
    deadletterqueueTopic: "stc.dlq.connectorpoc.cdnadevstc"
    schemahistorytopic: "stc-sqlconnector-schemahistory-1"

logs:

[DEBUG] 2023-10-09 14:29:58,663 [task-thread-cdna-sql-party-type-source-connector-0] io.debezium.connector.base.ChangeEventQueue poll - no records available or batch size not reached yet, sleeping a bit...
[TRACE] 2023-10-09 14:29:58,750 [debezium-sqlserverconnector-stc-con-grp-2-change-event-source-coordinator] io.debezium.connector.sqlserver.SqlServerConnection lambda$getMaxTransactionLsn$6 - Max transaction lsn is 00006b12:0000080f:0005
[DEBUG] 2023-10-09 14:29:58,750 [debezium-sqlserverconnector-stc-con-grp-2-change-event-source-coordinator] io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource executeIteration - No change in the database
[DEBUG] 2023-10-09 14:29:59,163 [task-thread-cdna-sql-party-type-source-connector-0] io.debezium.connector.base.ChangeEventQueue poll - checking for more records...
[DEBUG] 2023-10-09 14:29:59,163 [task-thread-cdna-sql-party-type-source-connector-0] io.debezium.connector.base.ChangeEventQueue poll - no records available or batch size not reached yet, sleeping a bit...
[DEBUG] 2023-10-09 14:29:59,164 [task-thread-cdna-sql-party-type-source-connector-0] io.debezium.connector.base.ChangeEventQueue poll - polling records...
[TRACE] 2023-10-09 14:29:59,250 [debezium-sqlserverconnector-stc-con-grp-2-change-event-source-coordinator] io.debezium.connector.sqlserver.SqlServerConnection lambda$getMaxTransactionLsn$6 - Max transaction lsn is 00006b12:0000080f:0005
[DEBUG] 2023-10-09 14:29:59,250 [debezium-sqlserverconnector-stc-con-grp-2-change-event-source-coordinator] io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource executeIteration - No change in the database
[DEBUG] 2023-10-09 14:29:59,664 [task-thread-cdna-sql-party-type-source-connector-0] io.debezium.connector.base.ChangeEventQueue poll - checking for more records...
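Since the snapshot completes but streaming only ever logs "No change in the database", one hedged server-side check (assuming sqlcmd access; host and credentials below are placeholders, the database and table names are from this thread) is to confirm that CDC is actually enabled for the table and that the CDC capture job is running on the source database:

```shell
# Hypothetical diagnostic; replace the connection placeholders.
# is_tracked_by_cdc should be 1 for the captured table, and
# sys.sp_cdc_help_jobs should list a running capture job.
sqlcmd -S "$DB_HOST" -U "$DB_USER" -P "$DB_PASS" -d AAFNFTE1 -Q "
  SELECT name, is_tracked_by_cdc FROM sys.tables WHERE name = 'party_type';
  EXEC sys.sp_cdc_help_jobs;"
```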

jiri.p...@gmail.com

Oct 10, 2023, 12:24:33 AM
to debezium
Hi,

Could you please add the complete log now?

Thanks

J.

Aqhil Mohammad

Oct 10, 2023, 8:12:10 AM
to debezium
Hi,

Please find the full logs.

bash-5.1# kubectl logs -f sqlserver-connect-cluster-0 -n sqlserver-connect-cluster
+ /mnt/config/connect/bin/run
===> User: uid=1001 gid=0(root) groups=0(root),1001
===> Loading connect operator scripts from path /mnt/config/connect/bin
===> Configure log4j config
===> Configure disk-usage agent
===> Configure jolokia agent
===> Configure jmx prometheus agent
===> Configure JVM config
===> Launching connect ...
Picked up JAVA_TOOL_OPTIONS: -Dhttps.proxyHost= -Dhttps.proxyPort= -Dhttp.proxyHost= -Dhttp.proxyPort= -Dhttp.nonProxyHosts=
[Global flags]
(several hundred lines of JVM PrintFlagsFinal output, truncated in the original post)
{product} {default}
intx G1RSetRegionEntries = 1280
{product} {default}
size_t G1RSetScanBlockSize = 64
{product} {default}
intx G1RSetSparseRegionEntries = 20
{product} {default}
intx G1RSetUpdatingPauseTimePercent = 10
{product} {default}
uint G1RefProcDrainInterval = 1000
{product} {default}
uintx G1ReservePercent = 10
{product} {default}
uintx G1SATBBufferEnqueueingThresholdPercent = 60
{product} {default}
size_t G1SATBBufferSize = 1024
{product} {default}
intx G1SummarizeRSetStatsPeriod = 0
{diagnostic} {default}
size_t G1UpdateBufferSize = 256
{product} {default}
bool G1UseAdaptiveConcRefinement = true
{product} {default}
bool G1UseAdaptiveIHOP = true
{product} {default}
bool G1VerifyHeapRegionCodeRoots = false
{diagnostic} {default}
bool G1VerifyRSetsDuringFullGC = false
{diagnostic} {default}
uintx GCDrainStackTargetSize = 64
{product} {ergonomic}
uintx GCHeapFreeLimit = 2
{product} {default}
uintx GCLockerEdenExpansionPercent = 5
{product} {default}
bool GCLockerInvokesConcurrent = false
{product} {default}
uintx GCLockerRetryAllocationCount = 2
{diagnostic} {default}
bool GCParallelVerificationEnabled = true
{diagnostic} {default}
uintx GCPauseIntervalMillis = 21
{product} {default}
uint GCTaskTimeStampEntries = 200
{product} {default}
uintx GCTimeLimit = 98
{product} {default}
uintx GCTimeRatio = 12
{product} {default}
intx GuaranteedSafepointInterval = 1000
{diagnostic} {default}
uint HandshakeTimeout = 0
{diagnostic} {default}
size_t HeapBaseMinAddress = 2147483648
{pd product} {default}
bool HeapDumpAfterFullGC = false
{manageable} {default}
bool HeapDumpBeforeFullGC = false
{manageable} {default}
bool HeapDumpOnOutOfMemoryError = false
{manageable} {default}
ccstr HeapDumpPath =
{manageable} {default}
uintx HeapFirstMaximumCompactionCount = 3
{product} {default}
uintx HeapMaximumCompactionInterval = 20
{product} {default}
uintx HeapSearchSteps = 3
{product} {default}
size_t HeapSizePerGCThread = 43620760
{product} {default}
intx HotMethodDetectionLimit = 100000
{diagnostic} {default}
bool IdealizeClearArrayNode = true
{C2 pd diagnostic} {default}
bool IgnoreEmptyClassPaths = false
{product} {default}
bool IgnoreUnrecognizedVMOptions = false
{product} {default}
bool IgnoreUnverifiableClassesDuringDump = true
{diagnostic} {default}
bool ImplicitNullChecks = true
{pd diagnostic} {default}
uintx IncreaseFirstTierCompileThresholdAt = 50
{product} {default}
bool IncrementalInline = true
{C2 product} {default}
intx InitArrayShortSize = 64
{pd diagnostic} {default}
size_t InitialBootClassLoaderMetaspaceSize = 4194304
{product} {default}
uintx InitialCodeCacheSize = 2555904
{pd product} {default}
size_t InitialHeapSize = 1073741824
{product} {command line}
uintx InitialRAMFraction = 64
{product} {default}
double InitialRAMPercentage = 1.562500
{product} {default}
uintx InitialSurvivorRatio = 8
{product} {default}
uintx InitialTenuringThreshold = 7
{product} {default}
uintx InitiatingHeapOccupancyPercent = 35
{product} {command line}
bool InjectGCWorkerCreationFailure = false
{diagnostic} {default}
bool Inline = true
{product} {default}
bool InlineArrayCopy = true
{diagnostic} {default}
bool InlineClassNatives = true
{diagnostic} {default}
ccstr InlineDataFile =
{product} {default}
intx InlineFrequencyCount = 100
{pd diagnostic} {default}
bool InlineMathNatives = true
{diagnostic} {default}
bool InlineNIOCheckIndex = true
{C1 diagnostic} {default}
bool InlineNatives = true
{diagnostic} {default}
bool InlineObjectCopy = true
{C2 diagnostic} {default}
bool InlineObjectHash = true
{diagnostic} {default}
bool InlineReflectionGetCallerClass = true
{C2 diagnostic} {default}
intx InlineSmallCode = 2000
{pd product} {default}
bool InlineSynchronizedMethods = true
{C1 product} {default}
bool InlineThreadNatives = true
{diagnostic} {default}
bool InlineUnsafeOps = true
{diagnostic} {default}
bool InsertMemBarAfterArraycopy = true
{C2 product} {default}
intx InteriorEntryAlignment = 16
{C2 pd product} {default}
intx InterpreterProfilePercentage = 33
{product} {default}
intx JVMTIInterpreterNameMode = 0
{product} {default}
bool JavaMonitorsInStackTrace = true
{product} {default}
intx JavaPriority10_To_OSPriority = -1
{product} {default}
intx JavaPriority1_To_OSPriority = -1
{product} {default}
intx JavaPriority2_To_OSPriority = -1
{product} {default}
intx JavaPriority3_To_OSPriority = -1
{product} {default}
intx JavaPriority4_To_OSPriority = -1
{product} {default}
intx JavaPriority5_To_OSPriority = -1
{product} {default}
intx JavaPriority6_To_OSPriority = -1
{product} {default}
intx JavaPriority7_To_OSPriority = -1
{product} {default}
intx JavaPriority8_To_OSPriority = -1
{product} {default}
intx JavaPriority9_To_OSPriority = -1
{product} {default}
bool LIRFillDelaySlots = false
{C1 pd product} {default}
size_t LargePageHeapSizeThreshold = 134217728
{product} {default}
size_t LargePageSizeInBytes = 0
{product} {default}
intx LiveNodeCountInliningCutoff = 40000
{C2 product} {default}
bool LoadExecStackDllInVMThread = true
{product} {default}
bool LogCompilation = false
{diagnostic} {default}
bool LogEvents = true
{diagnostic} {default}
uintx LogEventsBufferEntries = 20
{diagnostic} {default}
ccstr LogFile =
{diagnostic} {default}
bool LogTouchedMethods = false
{diagnostic} {default}
bool LogVMOutput = false
{diagnostic} {default}
intx LoopMaxUnroll = 16
{C2 product} {default}
intx LoopOptsCount = 43
{C2 product} {default}
intx LoopPercentProfileLimit = 10
{C2 pd product} {default}
uintx LoopStripMiningIter = 1000
{C2 product} {default}
uintx LoopStripMiningIterShortLoop = 100
{C2 product} {default}
intx LoopUnrollLimit = 60
{C2 pd product} {default}
intx LoopUnrollMin = 4
{C2 product} {default}
bool LoopUnswitching = true
{C2 product} {default}
uintx MallocMaxTestWords = 0
{diagnostic} {default}
bool ManagementServer = true
{product} {command line}
size_t MarkStackSize = 4194304
{product} {ergonomic}
size_t MarkStackSizeMax = 16777216
{product} {default}
uint MarkSweepAlwaysCompactCount = 4
{product} {default}
uintx MarkSweepDeadRatio = 5
{product} {default}
intx MaxBCEAEstimateLevel = 5
{product} {default}
intx MaxBCEAEstimateSize = 150
{product} {default}
uint64_t MaxDirectMemorySize = 0
{product} {default}
bool MaxFDLimit = true
{product} {default}
uintx MaxGCMinorPauseMillis = 18446744073709551615
{product} {default}
uintx MaxGCPauseMillis = 20
{product} {command line}
uintx MaxHeapFreeRatio = 70
{manageable} {default}
size_t MaxHeapSize = 1073741824
{product} {command line}
intx MaxInlineLevel = 15
{product} {default}
intx MaxInlineSize = 35
{product} {default}
intx MaxJNILocalCapacity = 65536
{product} {default}
intx MaxJavaStackTraceDepth = 1024
{product} {default}
intx MaxJumpTableSize = 65000
{C2 product} {default}
intx MaxJumpTableSparseness = 5
{C2 product} {default}
intx MaxLabelRootDepth = 1100
{C2 product} {default}
intx MaxLoopPad = 11
{C2 product} {default}
size_t MaxMetaspaceExpansion = 5451776
{product} {default}
uintx MaxMetaspaceFreeRatio = 80
{product} {command line}
size_t MaxMetaspaceSize = 18446744073709547520
{product} {default}
size_t MaxNewSize = 637534208
{product} {ergonomic}
intx MaxNodeLimit = 80000
{C2 product} {default}
uint64_t MaxRAM = 137438953472
{pd product} {default}
uintx MaxRAMFraction = 4
{product} {default}
double MaxRAMPercentage = 25.000000
{product} {default}
intx MaxRecursiveInlineLevel = 1
{product} {default}
uintx MaxTenuringThreshold = 15
{product} {default}
intx MaxTrivialSize = 6
{product} {default}
intx MaxVectorSize = 64
{C2 product} {default}
size_t MetaspaceSize = 100663296
{pd product} {command line}
bool MethodFlushing = true
{product} {default}
size_t MinHeapDeltaBytes = 16777216
{product} {ergonomic}
uintx MinHeapFreeRatio = 40
{manageable} {default}
intx MinInliningThreshold = 250
{product} {default}
intx MinJumpTableSize = 10
{C2 pd product} {default}
size_t MinMetaspaceExpansion = 339968
{product} {default}
uintx MinMetaspaceFreeRatio = 50
{product} {command line}
intx MinPassesBeforeFlush = 10
{diagnostic} {default}
uintx MinRAMFraction = 2
{product} {default}
double MinRAMPercentage = 50.000000
{product} {default}
uintx MinSurvivorRatio = 3
{product} {default}
size_t MinTLABSize = 2048
{product} {default}
intx MonitorBound = 0
{product} {default}
bool MonitorInUseLists = true
{product} {default}
intx MultiArrayExpandLimit = 6
{C2 product} {default}
uintx NUMAChunkResizeWeight = 20
{product} {default}
size_t NUMAInterleaveGranularity = 2097152
{product} {default}
uintx NUMAPageScanRate = 256
{product} {default}
size_t NUMASpaceResizeRate = 1073741824
{product} {default}
bool NUMAStats = false
{product} {default}
ccstr NativeMemoryTracking = off
{product} {default}
bool NeedsDeoptSuspend = false
{pd product} {default}
bool NeverActAsServerClassMachine = false
{pd product} {default}
bool NeverTenure = false
{product} {default}
uintx NewRatio = 2
{product} {default}
size_t NewSize = 1363144
{product} {default}
size_t NewSizeThreadIncrease = 5320
{pd product} {default}
intx NmethodSweepActivity = 10
{product} {default}
intx NodeLimitFudgeFactor = 2000
{C2 product} {default}
uintx NonNMethodCodeHeapSize = 5830732
{pd product} {ergonomic}
uintx NonProfiledCodeHeapSize = 122913754
{pd product} {ergonomic}
intx NumberOfLoopInstrToAlign = 4
{C2 product} {default}
intx ObjectAlignmentInBytes = 8
{lp64_product} {default}
size_t OldPLABSize = 1024
{product} {default}
uintx OldPLABWeight = 50
{product} {default}
size_t OldSize = 5452592
{product} {default}
bool OmitStackTraceInFastThrow = true
{product} {default}
ccstrlist OnError =
{product} {default}
ccstrlist OnOutOfMemoryError =
{product} {default}
intx OnStackReplacePercentage = 140
{pd product} {default}
bool OptimizeExpensiveOps = true
{C2 diagnostic} {default}
bool OptimizeFill = true
{C2 product} {default}
bool OptimizePtrCompare = true
{C2 product} {default}
bool OptimizeStringConcat = true
{C2 product} {default}
bool OptoBundling = false
{C2 pd product} {default}
intx OptoLoopAlignment = 16
{pd product} {default}
bool OptoRegScheduling = true
{C2 pd product} {default}
bool OptoScheduling = false
{C2 pd product} {default}
bool OverrideVMProperties = false
{product} {default}
uintx PLABWeight = 75
{product} {default}
bool PSChunkLargeArrays = true
{product} {default}
int ParGCArrayScanChunk = 50
{product} {default}
intx ParGCCardsPerStrideChunk = 256
{diagnostic} {default}
uintx ParGCDesiredObjsFromOverflowList = 20
{product} {default}
uintx ParGCStridesPerThread = 2
{diagnostic} {default}
bool ParGCTrimOverflow = true
{product} {default}
bool ParGCUseLocalOverflow = false
{product} {default}
uintx ParallelGCBufferWastePct = 10
{product} {default}
uint ParallelGCThreads = 1
{product} {command line}
size_t ParallelOldDeadWoodLimiterMean = 50
{product} {default}
size_t ParallelOldDeadWoodLimiterStdDev = 80
{product} {default}
bool ParallelRefProcBalancingEnabled = true
{product} {default}
bool ParallelRefProcEnabled = false
{product} {default}
bool PartialPeelAtUnsignedTests = true
{C2 product} {default}
bool PartialPeelLoop = true
{C2 product} {default}
intx PartialPeelNewPhiDelta = 0
{C2 product} {default}
bool PauseAtExit = false
{diagnostic} {default}
bool PauseAtStartup = false
{diagnostic} {default}
ccstr PauseAtStartupFile =
{diagnostic} {default}
uintx PausePadding = 1
{product} {default}
intx PerBytecodeRecompilationCutoff = 200
{product} {default}
intx PerBytecodeTrapLimit = 4
{product} {default}
intx PerMethodRecompilationCutoff = 400
{product} {default}
intx PerMethodTrapLimit = 100
{product} {default}
bool PerfAllowAtExitRegistration = false
{product} {default}
bool PerfBypassFileSystemCheck = false
{product} {default}
intx PerfDataMemorySize = 32768
{product} {default}
intx PerfDataSamplingInterval = 50
{product} {default}
ccstr PerfDataSaveFile =
{product} {default}
bool PerfDataSaveToFile = false
{product} {default}
bool PerfDisableSharedMem = true
{product} {default}
intx PerfMaxStringConstLength = 1024
{product} {default}
size_t PreTouchParallelChunkSize = 1073741824
{product} {default}
bool PreferContainerQuotaForCPUCount = true
{product} {default}
bool PreferInterpreterNativeStubs = false
{pd product} {default}
intx PrefetchCopyIntervalInBytes = 576
{product} {default}
intx PrefetchFieldsAhead = 1
{product} {default}
intx PrefetchScanIntervalInBytes = 576
{product} {default}
bool PreserveAllAnnotations = false
{product} {default}
bool PreserveFramePointer = false
{pd product} {default}
size_t PretenureSizeThreshold = 0
{product} {default}
bool PrintAdapterHandlers = false
{diagnostic} {default}
bool PrintAssembly = false
{diagnostic} {default}
ccstr PrintAssemblyOptions =
{diagnostic} {default}
bool PrintBiasedLockingStatistics = false
{diagnostic} {default}
bool PrintClassHistogram = false
{manageable} {default}
bool PrintCodeCache = false
{product} {default}
bool PrintCodeCacheOnCompilation = false
{product} {default}
bool PrintCommandLineFlags = false
{product} {default}
bool PrintCompilation = false
{product} {default}
bool PrintCompilation2 = false
{diagnostic} {default}
bool PrintConcurrentLocks = false
{manageable} {default}
bool PrintExtendedThreadInfo = false
{product} {default}
bool PrintFlagsFinal = true
{product} {command line}
bool PrintFlagsInitial = false
{product} {default}
bool PrintFlagsRanges = false
{product} {default}
bool PrintGC = false
{product} {default}
bool PrintGCDetails = false
{product} {default}
bool PrintHeapAtSIGBREAK = true
{product} {default}
bool PrintInlining = false
{diagnostic} {default}
bool PrintInterpreter = false
{diagnostic} {default}
bool PrintIntrinsics = false
{C2 diagnostic} {default}
bool PrintJNIResolving = false
{product} {default}
bool PrintMetaspaceStatisticsAtExit = false
{diagnostic} {default}
bool PrintMethodData = false
{diagnostic} {default}
bool PrintMethodFlushingStatistics = false
{diagnostic} {default}
bool PrintMethodHandleStubs = false
{diagnostic} {default}
bool PrintNMTStatistics = false
{diagnostic} {default}
bool PrintNMethods = false
{diagnostic} {default}
bool PrintNativeNMethods = false
{diagnostic} {default}
bool PrintOptoAssembly = false
{C2 diagnostic} {default}
bool PrintPreciseBiasedLockingStatistics = false
{C2 diagnostic} {default}
bool PrintPreciseRTMLockingStatistics = false
{C2 diagnostic} {default}
bool PrintSafepointStatistics = false
{product} {default}
intx PrintSafepointStatisticsCount = 300
{product} {default}
intx PrintSafepointStatisticsTimeout = -1
{product} {default}
bool PrintSharedArchiveAndExit = false
{product} {default}
bool PrintSharedDictionary = false
{product} {default}
bool PrintSignatureHandlers = false
{diagnostic} {default}
bool PrintStringTableStatistics = false
{product} {default}
bool PrintStubCode = false
{diagnostic} {default}
bool PrintTieredEvents = false
{product} {default}
bool PrintTouchedMethodsAtExit = false
{diagnostic} {default}
bool PrintVMOptions = false
{product} {default}
bool PrintVMQWaitTime = false
{product} {default}
bool PrintWarnings = true
{product} {default}
uintx ProcessDistributionStride = 4
{product} {default}
bool ProfileDynamicTypes = true
{C2 diagnostic} {default}
bool ProfileInterpreter = true
{pd product} {default}
bool ProfileIntervals = false
{product} {default}
intx ProfileIntervalsTicks = 100
{product} {default}
intx ProfileMaturityPercentage = 20
{product} {default}
bool ProfileVM = false
{product} {default}
uintx ProfiledCodeHeapSize = 122913754
{pd product} {ergonomic}
intx ProfilerNumberOfCompiledMethods = 25
{diagnostic} {default}
intx ProfilerNumberOfInterpretedMethods = 25
{diagnostic} {default}
intx ProfilerNumberOfRuntimeStubNodes = 25
{diagnostic} {default}
intx ProfilerNumberOfStubMethods = 25
{diagnostic} {default}
bool ProfilerPrintByteCodeStatistics = false
{product} {default}
bool ProfilerRecordPC = false
{product} {default}
uintx PromotedPadding = 3
{product} {default}
uintx QueuedAllocationWarningCount = 0
{product} {default}
int RTMRetryCount = 5
{ARCH product} {default}
bool RangeCheckElimination = true
{product} {default}
bool ReassociateInvariants = true
{C2 product} {default}
bool ReduceBulkZeroing = true
{C2 product} {default}
bool ReduceFieldZeroing = true
{C2 product} {default}
bool ReduceInitialCardMarks = true
{C2 product} {default}
bool ReduceNumberOfCompilerThreads = true
{diagnostic} {default}
bool ReduceSignalUsage = false
{product} {default}
intx RefDiscoveryPolicy = 0
{product} {default}
bool RegisterFinalizersAtInit = true
{product} {default}
bool RelaxAccessControlCheck = false
{product} {default}
ccstr ReplayDataFile =
{product} {default}
bool RequireSharedSpaces = false
{product} {default}
uintx ReservedCodeCacheSize = 251658240
{pd product} {ergonomic}
bool ResizeOldPLAB = true
{product} {default}
bool ResizePLAB = true
{product} {default}
bool ResizeTLAB = true
{pd product} {default}
bool RestoreMXCSROnJNICalls = false
{product} {default}
bool RestrictContended = true
{product} {default}
bool RestrictReservedStack = true
{product} {default}
bool RewriteBytecodes = true
{pd product} {default}
bool RewriteFrequentPairs = true
{pd product} {default}
bool SafepointALot = false
{diagnostic} {default}
bool SafepointTimeout = false
{product} {default}
intx SafepointTimeoutDelay = 10000
{product} {default}
bool ScavengeBeforeFullGC = false
{product} {default}
intx ScavengeRootsInCode = 2
{diagnostic} {default}
bool SegmentedCodeCache = true
{product} {ergonomic}
intx SelfDestructTimer = 0
{product} {default}
bool SerializeVMOutput = true
{diagnostic} {default}
ccstr SharedArchiveConfigFile =
{product} {default}
ccstr SharedArchiveFile =
{product} {default}
size_t SharedBaseAddress = 34359738368
{product} {default}
ccstr SharedClassListFile =
{product} {default}
uintx SharedSymbolTableBucketSize = 4
{product} {default}
bool ShenandoahAllocFailureALot = false
{diagnostic} {default}
bool ShenandoahCASBarrier = true
{diagnostic} {default}
bool ShenandoahCloneBarrier = true
{diagnostic} {default}
uintx ShenandoahCodeRootsStyle = 2
{diagnostic} {default}
bool ShenandoahDegeneratedGC = true
{diagnostic} {default}
bool ShenandoahElasticTLAB = true
{diagnostic} {default}
ccstr ShenandoahGCHeuristics = adaptive
{product} {default}
ccstr ShenandoahGCMode = satb
{product} {default}
bool ShenandoahHumongousMoves = true
{diagnostic} {default}
bool ShenandoahIUBarrier = false
{diagnostic} {default}
bool ShenandoahLoadRefBarrier = true
{diagnostic} {default}
bool ShenandoahLoopOptsAfterExpansion = true
{diagnostic} {default}
bool ShenandoahOOMDuringEvacALot = false
{diagnostic} {default}
bool ShenandoahOptimizeStaticFinals = true
{diagnostic} {default}
bool ShenandoahPreclean = true
{diagnostic} {default}
bool ShenandoahSATBBarrier = true
{diagnostic} {default}
bool ShenandoahSelfFixing = true
{diagnostic} {default}
size_t ShenandoahSoftMaxHeapSize = 0
{manageable} {default}
bool ShenandoahVerify = false
{diagnostic} {default}
intx ShenandoahVerifyLevel = 4
{diagnostic} {default}
bool ShowHiddenFrames = false
{diagnostic} {default}
bool ShowMessageBoxOnError = false
{product} {default}
bool ShowRegistersOnAssert = false
{diagnostic} {default}
bool ShrinkHeapInSteps = true
{product} {default}
intx SoftRefLRUPolicyMSPerMB = 1000
{product} {default}
bool SpecialArraysEquals = true
{C2 diagnostic} {default}
bool SpecialEncodeISOArray = true
{C2 diagnostic} {default}
bool SpecialStringCompareTo = true
{C2 diagnostic} {default}
bool SpecialStringEquals = true
{C2 diagnostic} {default}
bool SpecialStringIndexOf = true
{C2 diagnostic} {default}
bool SplitIfBlocks = true
{C2 product} {default}
intx StackRedPages = 1
{pd product} {default}
intx StackReservedPages = 1
{pd product} {default}
intx StackShadowPages = 20
{pd product} {default}
bool StackTraceInThrowable = true
{product} {default}
intx StackYellowPages = 2
{pd product} {default}
uintx StartAggressiveSweepingAt = 10
{product} {default}
bool StartAttachListener = false
{product} {default}
ccstr StartFlightRecording =
{product} {default}
bool StressCodeAging = false
{diagnostic} {default}
bool StressGCM = false
{C2 diagnostic} {default}
bool StressLCM = false
{C2 diagnostic} {default}
bool StressLdcRewrite = false
{product} {default}
uintx StringDeduplicationAgeThreshold = 3
{product} {default}
bool StringDeduplicationRehashALot = false
{diagnostic} {default}
bool StringDeduplicationResizeALot = false
{diagnostic} {default}
uintx StringTableSize = 65536
{product} {default}
bool SuperWordLoopUnrollAnalysis = true
{C2 pd product} {default}
bool SuperWordReductions = true
{C2 product} {default}
bool SuppressFatalErrorMessage = false
{product} {default}
uintx SurvivorPadding = 3
{product} {default}
uintx SurvivorRatio = 8
{product} {default}
intx SuspendRetryCount = 50
{product} {default}
intx SuspendRetryDelay = 5
{product} {default}
uintx TLABAllocationWeight = 35
{product} {default}
uintx TLABRefillWasteFraction = 64
{product} {default}
size_t TLABSize = 0
{product} {default}
bool TLABStats = true
{product} {default}
uintx TLABWasteIncrement = 4
{product} {default}
uintx TLABWasteTargetPercent = 1
{product} {default}
uintx TargetPLABWastePct = 10
{product} {default}
uintx TargetSurvivorRatio = 50
{product} {default}
uintx TenuredGenerationSizeIncrement = 20
{product} {default}
uintx TenuredGenerationSizeSupplement = 80
{product} {default}
uintx TenuredGenerationSizeSupplementDecay = 2
{product} {default}
bool ThreadLocalHandshakes = true
{pd product} {default}
intx ThreadPriorityPolicy = 0
{product} {default}
bool ThreadPriorityVerbose = false
{product} {default}
intx ThreadStackSize = 1024
{pd product} {default}
uintx ThresholdTolerance = 10
{product} {default}
intx Tier0BackedgeNotifyFreqLog = 10
{product} {default}
intx Tier0InvokeNotifyFreqLog = 7
{product} {default}
intx Tier0ProfilingStartPercentage = 200
{product} {default}
intx Tier23InlineeNotifyFreqLog = 20
{product} {default}
intx Tier2BackEdgeThreshold = 0
{product} {default}
intx Tier2BackedgeNotifyFreqLog = 14
{product} {default}
intx Tier2CompileThreshold = 0
{product} {default}
intx Tier2InvokeNotifyFreqLog = 11
{product} {default}
intx Tier3AOTBackEdgeThreshold = 120000
{product} {default}
intx Tier3AOTCompileThreshold = 15000
{product} {default}
intx Tier3AOTInvocationThreshold = 10000
{product} {default}
intx Tier3AOTMinInvocationThreshold = 1000
{product} {default}
intx Tier3BackEdgeThreshold = 60000
{product} {default}
intx Tier3BackedgeNotifyFreqLog = 13
{product} {default}
intx Tier3CompileThreshold = 2000
{product} {default}
intx Tier3DelayOff = 2
{product} {default}
intx Tier3DelayOn = 5
{product} {default}
intx Tier3InvocationThreshold = 200
{product} {default}
intx Tier3InvokeNotifyFreqLog = 10
{product} {default}
intx Tier3LoadFeedback = 5
{product} {default}
intx Tier3MinInvocationThreshold = 100
{product} {default}
intx Tier4BackEdgeThreshold = 40000
{product} {default}
intx Tier4CompileThreshold = 15000
{product} {default}
intx Tier4InvocationThreshold = 5000
{product} {default}
intx Tier4LoadFeedback = 3
{product} {default}
intx Tier4MinInvocationThreshold = 600
{product} {default}
bool TieredCompilation = true
{pd product} {default}
intx TieredCompileTaskTimeout = 50
{product} {default}
intx TieredRateUpdateMaxTime = 25
{product} {default}
intx TieredRateUpdateMinTime = 1
{product} {default}
intx TieredStopAtLevel = 4
{product} {default}
bool TimeLinearScan = false
{C1 product} {default}
bool TraceCompilerThreads = false
{diagnostic} {default}
ccstr TraceJVMTI =
{product} {default}
bool TraceJVMTIObjectTagging = false
{diagnostic} {default}
bool TraceNMethodInstalls = false
{diagnostic} {default}
bool TraceSpilling = false
{C2 diagnostic} {default}
bool TraceSuspendWaitFailures = false
{product} {default}
bool TraceTypeProfile = false
{C2 diagnostic} {default}
intx TrackedInitializationLimit = 50
{C2 product} {default}
bool TransmitErrorReport = false
{product} {default}
bool TrapBasedNullChecks = false
{pd product} {default}
bool TrapBasedRangeChecks = false
{C2 pd product} {default}
intx TypeProfileArgsLimit = 2
{product} {default}
uintx TypeProfileLevel = 111
{pd product} {default}
intx TypeProfileMajorReceiverPercent = 90
{C2 product} {default}
intx TypeProfileParmsLimit = 2
{product} {default}
intx TypeProfileWidth = 2
{product} {default}
intx UnguardOnExecutionViolation = 0
{product} {default}
bool UnlinkSymbolsALot = false
{product} {default}
bool UnlockDiagnosticVMOptions = true
{diagnostic} {command line}
bool UseAES = true
{product} {default}
bool UseAESCTRIntrinsics = true
{diagnostic} {default}
bool UseAESIntrinsics = true
{diagnostic} {default}
bool UseAOTStrictLoading = false
{diagnostic} {default}
intx UseAVX = 3
{ARCH product} {default}
bool UseAdaptiveGCBoundary = false
{product} {default}
bool UseAdaptiveGenerationSizePolicyAtMajorCollection = true
{product} {default}
bool UseAdaptiveGenerationSizePolicyAtMinorCollection = true
{product} {default}
bool UseAdaptiveNUMAChunkSizing = true
{product} {default}
bool UseAdaptiveSizeDecayMajorGCCost = true
{product} {default}
bool UseAdaptiveSizePolicy = true
{product} {default}
bool UseAdaptiveSizePolicyFootprintGoal = true
{product} {default}
bool UseAdaptiveSizePolicyWithSystemGC = false
{product} {default}
bool UseAddressNop = true
{ARCH product} {default}
bool UseAdler32Intrinsics = false
{diagnostic} {default}
bool UseBASE64Intrinsics = true
{product} {default}
bool UseBMI1Instructions = true
{ARCH product} {default}
bool UseBMI2Instructions = true
{ARCH product} {default}
bool UseBiasedLocking = true
{product} {default}
bool UseBimorphicInlining = true
{C2 product} {default}
int UseBootstrapCallInfo = 1
{diagnostic} {default}
bool UseCLMUL = true
{ARCH product} {default}
bool UseCMSBestFit = true
{product} {default}
bool UseCMSInitiatingOccupancyOnly = false
{product} {default}
bool UseCMoveUnconditionally = false
{C2 product} {default}
bool UseCRC32CIntrinsics = true
{diagnostic} {default}
bool UseCRC32Intrinsics = true
{diagnostic} {default}
bool UseCharacterCompareIntrinsics = false
{C2 diagnostic} {default}
bool UseCodeAging = true
{product} {default}
bool UseCodeCacheFlushing = true
{product} {default}
bool UseCompiler = true
{product} {default}
bool UseCompressedClassPointers = true
{lp64_product} {ergonomic}
bool UseCompressedOops = true
{lp64_product} {ergonomic}
bool UseConcMarkSweepGC = false
{product} {default}
bool UseCondCardMark = false
{product} {default}
bool UseContainerCpuShares = false
{product} {default}
bool UseContainerSupport = true
{product} {default}
bool UseCopySignIntrinsic = false
{diagnostic} {default}
bool UseCountLeadingZerosInstruction = true
{ARCH product} {default}
bool UseCountTrailingZerosInstruction = true
{ARCH product} {default}
bool UseCountedLoopSafepoints = true
{C2 product} {default}
bool UseCounterDecay = true
{product} {default}
bool UseCpuAllocPath = false
{diagnostic} {default}
bool UseDivMod = true
{C2 product} {default}
bool UseDynamicNumberOfCompilerThreads = true
{product} {default}
bool UseDynamicNumberOfGCThreads = true
{product} {default}
bool UseFMA = true
{product} {default}
bool UseFPUForSpilling = true
{C2 product} {default}
bool UseFastJNIAccessors = true
{product} {default}
bool UseFastStosb = true
{ARCH product} {default}
bool UseG1GC = true
{product} {command line}
bool UseGCOverheadLimit = true
{product} {default}
bool UseGCTaskAffinity = false
{product} {default}
bool UseGHASHIntrinsics = true
{diagnostic} {default}
bool UseHeavyMonitors = false
{product} {default}
bool UseHugeTLBFS = false
{product} {default}
bool UseImplicitStableValues = true
{C2 diagnostic} {default}
bool UseIncDec = true
{ARCH diagnostic} {default}
bool UseInlineCaches = true
{product} {default}
bool UseInlineDepthForSpeculativeTypes = true
{C2 diagnostic} {default}
bool UseInterpreter = true
{product} {default}
bool UseJumpTables = true
{C2 product} {default}
bool UseLWPSynchronization = true
{product} {default}
bool UseLargePages = false
{pd product} {default}
bool UseLargePagesInMetaspace = false
{product} {default}
bool UseLargePagesIndividualAllocation = false {pd product} {default}
bool UseLegacyJNINameEscaping = false {product} {default}
bool UseLibmIntrinsic = true {ARCH diagnostic} {default}
bool UseLinuxPosixThreadCPUClocks = true {product} {default}
bool UseLoopCounter = true {product} {default}
bool UseLoopInvariantCodeMotion = true {C1 product} {default}
bool UseLoopPredicate = true {C2 product} {default}
bool UseMathExactIntrinsics = true {C2 diagnostic} {default}
bool UseMaximumCompactionOnSystemGC = true {product} {default}
bool UseMembar = true {pd product} {default}
bool UseMontgomeryMultiplyIntrinsic = true {C2 diagnostic} {default}
bool UseMontgomerySquareIntrinsic = true {C2 diagnostic} {default}
bool UseMulAddIntrinsic = true {C2 diagnostic} {default}
bool UseMultiplyToLenIntrinsic = true {C2 diagnostic} {default}
bool UseNUMA = false {product} {default}
bool UseNUMAInterleaving = false {product} {default}
bool UseNewCode = false {diagnostic} {default}
bool UseNewCode2 = false {diagnostic} {default}
bool UseNewCode3 = false {diagnostic} {default}
bool UseNewLongLShift = false {ARCH product} {default}
bool UseOSErrorReporting = false {pd product} {default}
bool UseOnStackReplacement = true {pd product} {default}
bool UseOnlyInlinedBimorphic = true {C2 product} {default}
bool UseOprofile = false {product} {default}
bool UseOptoBiasInlining = true {C2 product} {default}
bool UsePSAdaptiveSurvivorSizePolicy = true {product} {default}
bool UseParallelGC = false {product} {default}
bool UseParallelOldGC = false {product} {default}
bool UsePerfData = true {product} {default}
bool UsePopCountInstruction = true {product} {default}
bool UseProfiledLoopPredicate = true {C2 product} {default}
bool UseRDPCForConstantTableBase = false {C2 product} {default}
bool UseRTMDeopt = false {ARCH product} {default}
bool UseRTMLocking = false {ARCH product} {default}
bool UseSHA = true {product} {default}
bool UseSHA1Intrinsics = false {diagnostic} {default}
bool UseSHA256Intrinsics = true {diagnostic} {default}
bool UseSHA512Intrinsics = true {diagnostic} {default}
bool UseSHM = false {product} {default}
intx UseSSE = 4 {product} {default}
bool UseSSE42Intrinsics = true {ARCH product} {default}
bool UseSemaphoreGCThreadsSynchronization = true {diagnostic} {default}
bool UseSerialGC = false {product} {default}
bool UseSharedSpaces = false {product} {default}
bool UseShenandoahGC = false {product} {default}
bool UseSignalChaining = true {product} {default}
bool UseSignumIntrinsic = false {diagnostic} {default}
bool UseSquareToLenIntrinsic = true {C2 diagnostic} {default}
bool UseStoreImmI16 = false {ARCH product} {default}
bool UseStringDeduplication = false {product} {default}
bool UseSubwordForMaxVector = true {C2 product} {default}
bool UseSuperWord = true {C2 product} {default}
bool UseSwitchProfiling = true {diagnostic} {default}
bool UseTLAB = true {pd product} {default}
bool UseThreadPriorities = true {pd product} {default}
bool UseTransparentHugePages = false {product} {default}
bool UseTypeProfile = true {product} {default}
bool UseTypeSpeculation = true {C2 product} {default}
bool UseUnalignedAccesses = true {diagnostic} {default}
bool UseUnalignedLoadStores = true {ARCH product} {default}
bool UseVectorCmov = false {C2 product} {default}
bool UseVectorizedMismatchIntrinsic = true {diagnostic} {default}
bool UseXMMForArrayCopy = true {product} {default}
bool UseXMMForObjInit = false {ARCH product} {default}
bool UseXmmI2D = false {ARCH product} {default}
bool UseXmmI2F = false {ARCH product} {default}
bool UseXmmLoadAndClearUpper = true {ARCH product} {default}
bool UseXmmRegToRegMoveAll = true {ARCH product} {default}
bool VMThreadHintNoPreempt = false {product} {default}
intx VMThreadPriority = -1 {product} {default}
intx VMThreadStackSize = 1024 {pd product} {default}
intx ValueMapInitialSize = 11 {C1 product} {default}
intx ValueMapMaxLoopSize = 8 {C1 product} {default}
intx ValueSearchLimit = 1000 {C2 product} {default}
bool VerifyAdapterCalls = false {diagnostic} {default}
bool VerifyAfterGC = false {diagnostic} {default}
bool VerifyBeforeExit = false {diagnostic} {default}
bool VerifyBeforeGC = false {diagnostic} {default}
bool VerifyBeforeIteration = false {diagnostic} {default}
bool VerifyDuringGC = false {diagnostic} {default}
bool VerifyDuringStartup = false {diagnostic} {default}
intx VerifyGCLevel = 0 {diagnostic} {default}
uintx VerifyGCStartAt = 0 {diagnostic} {default}
ccstrlist VerifyGCType = {diagnostic} {default}
bool VerifyMergedCPBytecodes = true {product} {default}
bool VerifyMethodHandles = false {diagnostic} {default}
bool VerifyObjectStartArray = true {diagnostic} {default}
bool VerifyRememberedSets = false {diagnostic} {default}
bool VerifySharedSpaces = false {product} {default}
bool VerifyStringTableAtExit = false {diagnostic} {default}
ccstrlist VerifySubSet = {diagnostic} {default}
bool WhiteBoxAPI = false {diagnostic} {default}
uintx YoungGenerationSizeIncrement = 20 {product} {default}
uintx YoungGenerationSizeSupplement = 80 {product} {default}
uintx YoungGenerationSizeSupplementDecay = 8 {product} {default}
size_t YoungPLABSize = 4096 {product} {default}
double ZAllocationSpikeTolerance = 2.000000 {product} {default}
uint ZCollectionInterval = 0 {product} {default}
bool ZConcurrentJNIWeakGlobalHandles = true {diagnostic} {default}
bool ZConcurrentStringTable = true {diagnostic} {default}
bool ZConcurrentVMWeakHandles = true {diagnostic} {default}
double ZFragmentationLimit = 25.000000 {product} {default}
size_t ZMarkStacksMax = 8589934592 {product} {default}
bool ZOptimizeLoadBarriers = true {diagnostic} {default}
ccstr ZPath = {product} {default}
bool ZProactive = true {diagnostic} {default}
bool ZStallOnOutOfMemory = true {product} {default}
bool ZStatisticsForceTrace = false {diagnostic} {default}
uint ZStatisticsInterval = 10 {product} {default}
bool ZSymbolTableUnloading = false {diagnostic} {default}
bool ZUnmapBadViews = false {diagnostic} {default}
bool ZVerifyForwarding = false {diagnostic} {default}
bool ZVerifyMarking = false {diagnostic} {default}
bool ZWeakRoots = true {diagnostic} {default}
bool ZeroTLAB = false {product} {default}
log4j:WARN No such property [fields] in org.apache.log4j.EnhancedPatternLayout.
[INFO] 2023-10-10 10:39:20,345 [main] io.confluent.agent.monitoring.DiskUsage premain - DiskUsage Agent: config : /opt/confluentinc/etc/connect/disk-usage-agent.properties
I> No access restrictor found, access to any MBean is allowed
Jolokia: Agent started with URL http://10.53.1.161:7777/jolokia/
[INFO] 2023-10-10 10:39:21,308 [main] org.apache.kafka.connect.runtime.WorkerInfo logAll - WorkerInfo values:
jvm.args = -Dhttps.proxyHost=, -Dhttps.proxyPort=,
-Dhttp.proxyHost=, -Dhttp.proxyPort=, -Dhttp.nonProxyHosts=,
-Dcom.sun.management.jmxremote,
-Dcom.sun.management.jmxremote.authenticate=false,
-Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/var/log/kafka,
-Dlog4j.configuration=file:/opt/confluentinc/etc/connect/log4j.properties,
-Djavax.net.ssl.trustStore=/mnt/sslcerts/truststore.p12,
-Djavax.net.ssl.trustStorePassword=mystorepassword,
-Djavax.net.ssl.keyStore=/mnt/sslcerts/keystore.p12,
-Djavax.net.ssl.keyStorePassword=mystorepassword,
-Djava.rmi.server.hostname=sqlserver-connect-cluster-0.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local,
-Dcom.sun.management.jmxremote=true,
-Dcom.sun.management.jmxremote.authenticate=false,
-Dcom.sun.management.jmxremote.local.only=false,
-Dcom.sun.management.jmxremote.port=7203,
-Dcom.sun.management.jmxremote.rmi.port=7203,
-Dcom.sun.management.jmxremote.ssl=false, -Djava.awt.headless=true,
-Djdk.tls.ephemeralDHKeySize=2048,
-Djdk.tls.server.enableSessionTicketExtension=false,
-XX:+ExplicitGCInvokesConcurrent, -XX:+PrintFlagsFinal,
-XX:+UnlockDiagnosticVMOptions, -XX:+UseG1GC, -XX:ConcGCThreads=1,
-XX:G1HeapRegionSize=16M, -XX:InitiatingHeapOccupancyPercent=35,
-XX:MaxGCPauseMillis=20, -XX:MaxMetaspaceFreeRatio=80,
-XX:MetaspaceSize=96m, -XX:MinMetaspaceFreeRatio=50,
-XX:ParallelGCThreads=1, -Xms1G, -Xmx1G,
-javaagent:/usr/share/java/cp-base-new/disk-usage-agent-7.3.2.jar=/opt/confluentinc/etc/connect/disk-usage-agent.properties,
-javaagent:/usr/share/java/cp-base-new/jolokia-jvm-1.7.1.jar=port=7777,host=0.0.0.0,
-javaagent:/usr/share/java/cp-base-new/jmx_prometheus_javaagent-0.17.2.jar=7778:/mnt/config/shared/jmx-exporter.yaml
jvm.spec = Azul Systems, Inc., OpenJDK 64-Bit Server VM, 11.0.18,
11.0.18+10-LTS
jvm.classpath =
/usr/share/java/confluent-security/connect/activation-1.1.1.jar:/usr/share/java/confluent-security/connect/agrona-1.15.2.jar:/usr/share/java/confluent-security/connect/annotations-13.0.jar:/usr/share/java/confluent-security/connect/aopalliance-repackaged-2.6.1.jar:/usr/share/java/confluent-security/connect/api-common-2.1.5.jar:/usr/share/java/confluent-security/connect/argparse4j-0.7.0.jar:/usr/share/java/confluent-security/connect/asm-9.3.jar:/usr/share/java/confluent-security/connect/asm-analysis-9.3.jar:/usr/share/java/confluent-security/connect/asm-commons-9.3.jar:/usr/share/java/confluent-security/connect/asm-tree-9.3.jar:/usr/share/java/confluent-security/connect/audience-annotations-0.5.0.jar:/usr/share/java/confluent-security/connect/auth-metadata-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/authorizer-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/auto-common-0.10.jar:/usr/share/java/confluent-security/connect/auto-service-1.0-rc7.jar:/usr/share/java/confluent-security/connect/auto-service-annotations-1.0-rc7.jar:/usr/share/java/confluent-security/connect/auto-value-annotations-1.9.jar:/usr/share/java/confluent-security/connect/avro-1.11.0.jar:/usr/share/java/confluent-security/connect/aws-java-sdk-core-1.12.268.jar:/usr/share/java/confluent-security/connect/aws-java-sdk-kms-1.12.268.jar:/usr/share/java/confluent-security/connect/aws-java-sdk-s3-1.12.268.jar:/usr/share/java/confluent-security/connect/aws-java-sdk-sts-1.12.268.jar:/usr/share/java/confluent-security/connect/bc-fips-1.0.2.3.jar:/usr/share/java/confluent-security/connect/bcpkix-fips-1.0.6.jar:/usr/share/java/confluent-security/connect/bctls-fips-1.0.13.jar:/usr/share/java/confluent-security/connect/broker-plugins-7.3.2-ce-test.jar:/usr/share/java/confluent-security/connect/cel-core-0.3.5.jar:/usr/share/java/confluent-security/connect/cel-generated-antlr-0.3.5.jar:/usr/share/java/confluent-security/connect/cel-generated-pb-0.3.5.jar:/usr/share/java/confluent-security/connect/
checker-qual-3.8.0.jar:/usr/share/java/confluent-security/connect/classgraph-4.8.21.jar:/usr/share/java/confluent-security/connect/classmate-1.3.4.jar:/usr/share/java/confluent-security/connect/common-config-7.3.2.jar:/usr/share/java/confluent-security/connect/common-utils-7.3.2.jar:/usr/share/java/confluent-security/connect/commons-cli-1.4.jar:/usr/share/java/confluent-security/connect/commons-codec-1.13.jar:/usr/share/java/confluent-security/connect/commons-collections-3.2.2.jar:/usr/share/java/confluent-security/connect/commons-compress-1.21.jar:/usr/share/java/confluent-security/connect/commons-digester-2.1.jar:/usr/share/java/confluent-security/connect/commons-lang3-3.12.0.jar:/usr/share/java/confluent-security/connect/commons-logging-1.2.jar:/usr/share/java/confluent-security/connect/commons-validator-1.7.jar:/usr/share/java/confluent-security/connect/confluent-connect-secret-registry-plugin-7.3.2.jar:/usr/share/java/confluent-security/connect/confluent-connect-security-plugin-7.3.2.jar:/usr/share/java/confluent-security/connect/confluent-licensing-new-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/confluent-security-plugins-common-7.3.2.jar:/usr/share/java/confluent-security/connect/confluent-serializers-new-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/error_prone_annotations-2.5.1.jar:/usr/share/java/confluent-security/connect/everit-json-schema-1.14.1.jar:/usr/share/java/confluent-security/connect/failureaccess-1.0.1.jar:/usr/share/java/confluent-security/connect/flatbuffers-java-2.0.3.jar:/usr/share/java/confluent-security/connect/gax-2.16.0.jar:/usr/share/java/confluent-security/connect/gax-httpjson-0.101.0.jar:/usr/share/java/confluent-security/connect/google-api-client-1.34.0.jar:/usr/share/java/confluent-security/connect/google-api-services-cloudkms-v1-rev108-1.25.0.jar:/usr/share/java/confluent-security/connect/google-api-services-storage-v1-rev20220401-1.32.1.jar:/usr/share/java/confluent-security/connect/google-auth-library-crede
ntials-1.6.0.jar:/usr/share/java/confluent-security/connect/google-auth-library-oauth2-http-1.6.0.jar:/usr/share/java/confluent-security/connect/google-cloud-core-2.6.0.jar:/usr/share/java/confluent-security/connect/google-cloud-core-http-2.6.0.jar:/usr/share/java/confluent-security/connect/google-cloud-storage-2.6.1.jar:/usr/share/java/confluent-security/connect/google-http-client-1.41.7.jar:/usr/share/java/confluent-security/connect/google-http-client-apache-v2-1.41.7.jar:/usr/share/java/confluent-security/connect/google-http-client-appengine-1.41.7.jar:/usr/share/java/confluent-security/connect/google-http-client-gson-1.41.7.jar:/usr/share/java/confluent-security/connect/google-http-client-jackson2-1.41.7.jar:/usr/share/java/confluent-security/connect/google-oauth-client-1.33.3.jar:/usr/share/java/confluent-security/connect/grpc-context-1.45.1.jar:/usr/share/java/confluent-security/connect/gson-2.9.0.jar:/usr/share/java/confluent-security/connect/guava-30.1.1-jre.jar:/usr/share/java/confluent-security/connect/handy-uri-templates-2.1.8.jar:/usr/share/java/confluent-security/connect/hibernate-validator-6.1.7.Final.jar:/usr/share/java/confluent-security/connect/hk2-api-2.6.1.jar:/usr/share/java/confluent-security/connect/hk2-locator-2.6.1.jar:/usr/share/java/confluent-security/connect/hk2-utils-2.6.1.jar:/usr/share/java/confluent-security/connect/http2-common-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/http2-hpack-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/http2-server-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/httpclient-4.5.13.jar:/usr/share/java/confluent-security/connect/httpcore-4.4.13.jar:/usr/share/java/confluent-security/connect/internal-rest-server-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/ion-java-1.0.2.jar:/usr/share/java/confluent-security/connect/j2objc-annotations-1.3.jar:/usr/share/java/confluent-security/connect/jackson-annotations-2.13.4.jar:/usr/share/java/confluent-securit
y/connect/jackson-core-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-dataformat-cbor-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-dataformat-csv-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-dataformat-properties-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-dataformat-yaml-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-datatype-guava-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-datatype-jdk8-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-datatype-joda-2.13.4.jar:/usr/share/java/confluent-security/connect/jakarta.el-3.0.4.jar:/usr/share/java/confluent-security/connect/jackson-datatype-jsr310-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-jaxrs-base-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-jaxrs-json-provider-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-module-jaxb-annotations-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-module-parameter-names-2.13.4.jar:/usr/share/java/confluent-security/connect/jackson-module-scala_2.13-2.13.4.jar:/usr/share/java/confluent-security/connect/jakarta.activation-api-1.2.1.jar:/usr/share/java/confluent-security/connect/jakarta.annotation-api-1.3.5.jar:/usr/share/java/confluent-security/connect/jakarta.el-api-4.0.0.jar:/usr/share/java/confluent-security/connect/jakarta.inject-2.6.1.jar:/usr/share/java/confluent-security/connect/jakarta.validation-api-2.0.2.jar:/usr/share/java/confluent-security/connect/jakarta.ws.rs-api-2.1.6.jar:/usr/share/java/confluent-security/connect/jakarta.xml.bind-api-2.3.3.jar:/usr/share/java/confluent-security/connect/javapoet-1.13.0.jar:/usr/share/java/confluent-security/connect/javassist-3.25.0-GA.jar:/usr/share/java/confluent-security/connect/javax-websocket-client-impl-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/javax-websocket-server-impl-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/javax.
annotation-api-1.3.2.jar:/usr/share/java/confluent-security/connect/javax.servlet-api-4.0.1.jar:/usr/share/java/confluent-security/connect/javax.websocket-api-1.0.jar:/usr/share/java/confluent-security/connect/javax.websocket-client-api-1.0.jar:/usr/share/java/confluent-security/connect/javax.ws.rs-api-2.1.1.jar:/usr/share/java/confluent-security/connect/jaxb-api-2.3.0.jar:/usr/share/java/confluent-security/connect/jbcrypt-0.4.jar:/usr/share/java/confluent-security/connect/jboss-logging-3.3.2.Final.jar:/usr/share/java/confluent-security/connect/jersey-bean-validation-2.36.jar:/usr/share/java/confluent-security/connect/jersey-client-2.36.jar:/usr/share/java/confluent-security/connect/jersey-common-2.36.jar:/usr/share/java/confluent-security/connect/jersey-container-servlet-2.36.jar:/usr/share/java/confluent-security/connect/jersey-container-servlet-core-2.36.jar:/usr/share/java/confluent-security/connect/jersey-hk2-2.36.jar:/usr/share/java/confluent-security/connect/jersey-server-2.36.jar:/usr/share/java/confluent-security/connect/jetty-alpn-java-server-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-alpn-server-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-annotations-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-client-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-continuation-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-http-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-io-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-jaas-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-jmx-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-jndi-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-plus-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-security-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-server-9.4.
48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-servlet-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-servlets-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-util-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-util-ajax-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-webapp-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jetty-xml-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/jmespath-java-1.12.268.jar:/usr/share/java/confluent-security/connect/joda-time-2.10.8.jar:/usr/share/java/confluent-security/connect/jopt-simple-5.0.4.jar:/usr/share/java/confluent-security/connect/jose4j-0.7.9.jar:/usr/share/java/confluent-security/connect/json-20220320.jar:/usr/share/java/confluent-security/connect/jsr305-3.0.2.jar:/usr/share/java/confluent-security/connect/kafka-avro-serializer-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-client-plugins-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/kafka-connect-avro-converter-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-connect-avro-data-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-connect-json-schema-converter-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-connect-protobuf-converter-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-json-schema-provider-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-json-schema-serializer-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-json-serializer-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-metadata-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/kafka-protobuf-provider-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-protobuf-serializer-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-protobuf-types-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-raft-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/kafka-
schema-converter-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-schema-registry-client-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-schema-serializer-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-secret-registry-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-secret-registry-client-7.3.2.jar:/usr/share/java/confluent-security/connect/kafka-server-common-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/kafka-shell-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/kafka-storage-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/kafka-storage-api-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/kafka_2.13-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/kotlin-reflect-1.7.0.jar:/usr/share/java/confluent-security/connect/kotlin-script-runtime-1.6.0.jar:/usr/share/java/confluent-security/connect/kotlin-scripting-common-1.6.0.jar:/usr/share/java/confluent-security/connect/kotlin-scripting-compiler-embeddable-1.6.0.jar:/usr/share/java/confluent-security/connect/kotlin-scripting-jvm-1.6.0.jar:/usr/share/java/confluent-security/connect/kotlin-scripting-compiler-impl-embeddable-1.6.0.jar:/usr/share/java/confluent-security/connect/kotlin-stdlib-1.6.0.jar:/usr/share/java/confluent-security/connect/kotlin-stdlib-common-1.6.10.jar:/usr/share/java/confluent-security/connect/kotlin-stdlib-jdk7-1.6.10.jar:/usr/share/java/confluent-security/connect/kotlin-stdlib-jdk8-1.6.10.jar:/usr/share/java/confluent-security/connect/kotlinpoet-1.12.0.jar:/usr/share/java/confluent-security/connect/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/share/java/confluent-security/connect/logredactor-1.0.10.jar:/usr/share/java/confluent-security/connect/logredactor-metrics-1.0.10.jar:/usr/share/java/confluent-security/connect/mbknor-jackson-jsonschema_2.13-1.0.39.jar:/usr/share/java/confluent-security/connect/metrics-core-2.2.0.jar:/usr/share/java/confluent-security/connect/metrics-core-4.1.12.1.jar:/usr
/share/java/confluent-security/connect/minimal-json-0.9.5.jar:/usr/share/java/confluent-security/connect/netty-all-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-buffer-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-dns-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-haproxy-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-http-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-http2-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-memcache-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-mqtt-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-redis-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-smtp-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-socks-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-stomp-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-codec-xml-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-common-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-handler-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-handler-proxy-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-resolver-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-resolver-dns-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-resolver-dns-classes-macos-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-resolver-dns-native-macos-4.1.79.Final-osx-aarch_64.jar:/usr/share/java/confluent-security/connect/netty-resolver-dns-native-macos-4.1.79.Final-osx-x86_64.jar:/usr/share/java/confluent-security/connect/netty-tcnative-boringssl-static-2.0.53.Final-linux-aarch_64.jar:/usr/share/java/confluent-security/connect/netty-tcnative-boringssl-static-2.0
.53.Final-linux-x86_64.jar:/usr/share/java/confluent-security/connect/netty-tcnative-boringssl-static-2.0.53.Final-osx-aarch_64.jar:/usr/share/java/confluent-security/connect/netty-tcnative-boringssl-static-2.0.53.Final-osx-x86_64.jar:/usr/share/java/confluent-security/connect/netty-tcnative-boringssl-static-2.0.53.Final-windows-x86_64.jar:/usr/share/java/confluent-security/connect/netty-tcnative-boringssl-static-2.0.53.Final.jar:/usr/share/java/confluent-security/connect/netty-tcnative-classes-2.0.53.Final.jar:/usr/share/java/confluent-security/connect/netty-transport-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-transport-classes-epoll-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-transport-classes-kqueue-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-transport-native-epoll-4.1.79.Final-linux-aarch_64.jar:/usr/share/java/confluent-security/connect/netty-transport-native-epoll-4.1.79.Final-linux-x86_64.jar:/usr/share/java/confluent-security/connect/netty-transport-native-kqueue-4.1.79.Final-osx-aarch_64.jar:/usr/share/java/confluent-security/connect/netty-transport-native-kqueue-4.1.79.Final-osx-x86_64.jar:/usr/share/java/confluent-security/connect/netty-transport-native-unix-common-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-transport-rxtx-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-transport-sctp-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/netty-transport-udt-4.1.79.Final.jar:/usr/share/java/confluent-security/connect/okio-3.0.0.jar:/usr/share/java/confluent-security/connect/okio-jvm-3.0.0.jar:/usr/share/java/confluent-security/connect/opencensus-api-0.31.0.jar:/usr/share/java/confluent-security/connect/opencensus-contrib-http-util-0.31.0.jar:/usr/share/java/confluent-security/connect/osgi-resource-locator-1.0.3.jar:/usr/share/java/confluent-security/connect/paranamer-2.8.jar:/usr/share/java/confluent-security/connect/proto-google-common-protos-2
.5.1.jar:/usr/share/java/confluent-security/connect/proto-google-iam-v1-1.3.1.jar:/usr/share/java/confluent-security/connect/protobuf-java-3.19.6.jar:/usr/share/java/confluent-security/connect/protobuf-java-util-3.19.6.jar:/usr/share/java/confluent-security/connect/rbac-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/re2j-1.6.jar:/usr/share/java/confluent-security/connect/rest-authorizer-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/rest-utils-7.3.2.jar:/usr/share/java/confluent-security/connect/scala-collection-compat_2.13-2.6.0.jar:/usr/share/java/confluent-security/connect/scala-java8-compat_2.13-1.0.2.jar:/usr/share/java/confluent-security/connect/scala-library-2.13.10.jar:/usr/share/java/confluent-security/connect/scala-logging_2.13-3.9.4.jar:/usr/share/java/confluent-security/connect/scala-reflect-2.13.10.jar:/usr/share/java/confluent-security/connect/security-extensions-7.3.2-ce.jar:/usr/share/java/confluent-security/connect/snakeyaml-1.32.jar:/usr/share/java/confluent-security/connect/swagger-annotations-2.1.10.jar:/usr/share/java/confluent-security/connect/threetenbp-1.6.0.jar:/usr/share/java/confluent-security/connect/tink-1.6.0.jar:/usr/share/java/confluent-security/connect/tink-gcpkms-1.6.0.jar:/usr/share/java/confluent-security/connect/validation-api-2.0.1.Final.jar:/usr/share/java/confluent-security/connect/websocket-api-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/websocket-client-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/websocket-common-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/websocket-server-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/websocket-servlet-9.4.48.v20220622.jar:/usr/share/java/confluent-security/connect/wire-runtime-jvm-4.4.3.jar:/usr/share/java/confluent-security/connect/wire-schema-jvm-4.4.3.jar:/usr/share/java/confluent-security/connect/zookeeper-3.6.3.jar:/usr/share/java/confluent-security/connect/zookeeper-jute-3.6.3.jar:/usr/shar
e/java/kafka/KeePassJava2-2.1.4.jar:/usr/share/java/kafka/KeePassJava2-dom-2.1.4.jar:/usr/share/java/kafka/KeePassJava2-jaxb-2.1.4.jar:/usr/share/java/kafka/KeePassJava2-kdb-2.1.4.jar:/usr/share/java/kafka/KeePassJava2-kdbx-2.1.4.jar:/usr/share/java/kafka/KeePassJava2-simple-2.1.4.jar:/usr/share/java/kafka/aalto-xml-1.0.0.jar:/usr/share/java/kafka/accessors-smart-2.4.7.jar:/usr/share/java/kafka/activation-1.1.1.jar:/usr/share/java/kafka/agrona-1.15.2.jar:/usr/share/java/kafka/annotations-13.0.jar:/usr/share/java/kafka/annotations-15.0.jar:/usr/share/java/kafka/annotations-3.0.1.jar:/usr/share/java/kafka/aopalliance-repackaged-2.6.1.jar:/usr/share/java/kafka/api-common-2.1.5.jar:/usr/share/java/kafka/argparse4j-0.7.0.jar:/usr/share/java/kafka/asm-9.1.jar:/usr/share/java/kafka/audience-annotations-0.5.0.jar:/usr/share/java/kafka/auth-metadata-7.3.2-ce.jar:/usr/share/java/kafka/auth-providers-7.3.2-ce.jar:/usr/share/java/kafka/authorizer-7.3.2-ce.jar:/usr/share/java/kafka/auto-common-0.10.jar:/usr/share/java/kafka/auto-service-1.0-rc7.jar:/usr/share/java/kafka/auto-service-annotations-1.0-rc7.jar:/usr/share/java/kafka/auto-value-annotations-1.9.jar:/usr/share/java/kafka/aws-java-sdk-core-1.12.268.jar:/usr/share/java/kafka/aws-java-sdk-kms-1.12.268.jar:/usr/share/java/kafka/aws-java-sdk-s3-1.12.268.jar:/usr/share/java/kafka/aws-java-sdk-sts-1.12.268.jar:/usr/share/java/kafka/azure-core-1.18.0.jar:/usr/share/java/kafka/azure-core-http-netty-1.10.1.jar:/usr/share/java/kafka/azure-identity-1.3.3.jar:/usr/share/java/kafka/azure-storage-blob-12.12.0.jar:/usr/share/java/kafka/azure-storage-common-12.12.0.jar:/usr/share/java/kafka/azure-storage-internal-avro-12.0.5.jar:/usr/share/java/kafka/bc-fips-1.0.2.3.jar:/usr/share/java/kafka/bcpkix-fips-1.0.6.jar:/usr/share/java/kafka/bctls-fips-1.0.13.jar:/usr/share/java/kafka/brave-5.13.3.jar:/usr/share/java/kafka/brave-instrumentation-http-5.13.3.jar:/usr/share/java/kafka/broker-plugins-7.3.2-ce.jar:/usr/share/java/kafka/ce-sbk_2.13-
7.3.2-ce.jar:/usr/share/java/kafka/cel-core-0.3.5.jar:/usr/share/java/kafka/cel-generated-antlr-0.3.5.jar:/usr/share/java/kafka/cel-generated-pb-0.3.5.jar:/usr/share/java/kafka/checker-qual-3.21.4.jar:/usr/share/java/kafka/checker-qual-3.5.0.jar:/usr/share/java/kafka/classgraph-4.8.138.jar:/usr/share/java/kafka/client-java-14.0.0.jar:/usr/share/java/kafka/client-java-api-14.0.0.jar:/usr/share/java/kafka/client-java-proto-14.0.0.jar:/usr/share/java/kafka/cloudevents-api-2.3.0.jar:/usr/share/java/kafka/cloudevents-core-2.3.0.jar:/usr/share/java/kafka/cloudevents-json-jackson-2.3.0.jar:/usr/share/java/kafka/cloudevents-kafka-2.3.0.jar:/usr/share/java/kafka/cloudevents-protobuf-2.3.0.jar:/usr/share/java/kafka/commons-cli-1.4.jar:/usr/share/java/kafka/commons-codec-1.15.jar:/usr/share/java/kafka/commons-collections4-4.4.jar:/usr/share/java/kafka/commons-compress-1.21.jar:/usr/share/java/kafka/commons-io-2.11.0.jar:/usr/share/java/kafka/commons-lang3-3.11.jar:/usr/share/java/kafka/commons-logging-1.2.jar:/usr/share/java/kafka/commons-math3-3.6.1.jar:/usr/share/java/kafka/confluent-audit-7.3.2-ce.jar:/usr/share/java/kafka/confluent-licensing-new-7.3.2-ce.jar:/usr/share/java/kafka/confluent-resource-names-7.3.2-ce.jar:/usr/share/java/kafka/confluent-serializers-new-7.3.2-ce.jar:/usr/share/java/kafka/connect-api-7.3.2-ce.jar:/usr/share/java/kafka/connect-basic-auth-extension-7.3.2-ce.jar:/usr/share/java/kafka/connect-ce-logs-7.3.2-ce.jar:/usr/share/java/kafka/connect-json-7.3.2-ce.jar:/usr/share/java/kafka/connect-mirror-7.3.2-ce.jar:/usr/share/java/kafka/connect-mirror-client-7.3.2-ce.jar:/usr/share/java/kafka/connect-runtime-7.3.2-ce.jar:/usr/share/java/kafka/connect-transforms-7.3.2-ce.jar:/usr/share/java/kafka/connector-datapreview-extension-7.3.2-ce.jar:/usr/share/java/kafka/content-type-2.1.jar:/usr/share/java/kafka/core-1.54.0.0.jar:/usr/share/java/kafka/database-2.1.4.jar:/usr/share/java/kafka/error_prone_annotations-2.3.4.jar:/usr/share/java/kafka/failureaccess-1.0.
[... Kafka broker startup log: CLASSPATH listing of Confluent Platform 7.3.2 jars (/usr/share/java/kafka, /usr/share/java/kafka-serde-tools, /usr/share/java/confluent-common, /usr/share/java/confluent-metadata-service, ...) elided for brevity ...]
e-kafka-http-server-7.3.2.jar:/usr/bin/../share/java/confluent-metadata-service/cel-core-0.3.5.jar:/usr/bin/../share/java/confluent-metadata-service/cel-generated-antlr-0.3.5.jar:/usr/bin/../share/java/confluent-metadata-service/cel-generated-pb-0.3.5.jar:/usr/bin/../share/java/confluent-metadata-service/cloudevents-api-2.3.0.jar:/usr/bin/../share/java/confluent-metadata-service/cloudevents-core-2.3.0.jar:/usr/bin/../share/java/confluent-metadata-service/cloudevents-json-jackson-2.3.0.jar:/usr/bin/../share/java/confluent-metadata-service/cloudevents-kafka-2.3.0.jar:/usr/bin/../share/java/confluent-metadata-service/cloudevents-protobuf-2.3.0.jar:/usr/bin/../share/java/confluent-metadata-service/common-utils-7.3.2.jar:/usr/bin/../share/java/confluent-metadata-service/commons-lang3-3.12.0.jar:/usr/bin/../share/java/confluent-metadata-service/concurrent-trees-2.6.1.jar:/usr/bin/../share/java/confluent-metadata-service/confluent-audit-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/confluent-resource-names-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/confluent-security-plugins-common-7.3.2.jar:/usr/bin/../share/java/confluent-metadata-service/connect-api-7.3.2-ccs.jar:/usr/bin/../share/java/confluent-metadata-service/connect-json-7.3.2-ccs.jar:/usr/bin/../share/java/confluent-metadata-service/connect-runtime-7.3.2-ccs.jar:/usr/bin/../share/java/confluent-metadata-service/connect-transforms-7.3.2-ccs.jar:/usr/bin/../share/java/confluent-metadata-service/gson-2.9.0.jar:/usr/bin/../share/java/confluent-metadata-service/jackson-core-2.13.4.jar:/usr/bin/../share/java/confluent-metadata-service/jackson-dataformat-yaml-2.13.4.jar:/usr/bin/../share/java/confluent-metadata-service/jackson-datatype-jdk8-2.13.4.jar:/usr/bin/../share/java/confluent-metadata-service/jackson-datatype-jsr310-2.13.4.jar:/usr/bin/../share/java/confluent-metadata-service/jackson-datatype-protobuf-0.9.11-jackson2.9.jar:/usr/bin/../share/java/confluent-metadata-service/jakar
ta.annotation-api-1.3.5.jar:/usr/bin/../share/java/confluent-metadata-service/jakarta.el-3.0.4.jar:/usr/bin/../share/java/confluent-metadata-service/jakarta.el-api-4.0.0.jar:/usr/bin/../share/java/confluent-metadata-service/jakarta.inject-2.6.1.jar:/usr/bin/../share/java/confluent-metadata-service/jakarta.validation-api-2.0.2.jar:/usr/bin/../share/java/confluent-metadata-service/jakarta.ws.rs-api-2.1.6.jar:/usr/bin/../share/java/confluent-metadata-service/javax.annotation-api-1.3.2.jar:/usr/bin/../share/java/confluent-metadata-service/javax.servlet-api-4.0.1.jar:/usr/bin/../share/java/confluent-metadata-service/javax.ws.rs-api-2.1.1.jar:/usr/bin/../share/java/confluent-metadata-service/jbcrypt-0.4.jar:/usr/bin/../share/java/confluent-metadata-service/jersey-bean-validation-2.36.jar:/usr/bin/../share/java/confluent-metadata-service/jersey-client-2.36.jar:/usr/bin/../share/java/confluent-metadata-service/jersey-common-2.36.jar:/usr/bin/../share/java/confluent-metadata-service/jersey-server-2.36.jar:/usr/bin/../share/java/confluent-metadata-service/jetty-client-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-metadata-service/jetty-proxy-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-metadata-service/jetty-security-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-metadata-service/jetty-util-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-metadata-service/jose4j-0.7.2.jar:/usr/bin/../share/java/confluent-metadata-service/jul-to-slf4j-1.7.36.jar:/usr/bin/../share/java/confluent-metadata-service/kafka-client-plugins-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/kafka-clients-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/kafka-log4j-appender-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/kafka-server-common-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/kafka-tools-7.3.2-ccs.jar:/usr/bin/../share/java/confluent-metadata-service/lz4-java-1.8.0.jar:/usr/bin/../share/java/confluent-metada
ta-service/maven-artifact-3.8.4.jar:/usr/bin/../share/java/confluent-metadata-service/metrics-core-2.2.0.jar:/usr/bin/../share/java/confluent-metadata-service/netty-buffer-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/netty-codec-http-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/netty-codec-socks-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/netty-common-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/netty-handler-proxy-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/netty-transport-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/netty-transport-classes-epoll-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/netty-transport-classes-kqueue-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/netty-transport-native-epoll-4.1.79.Final-linux-x86_64.jar:/usr/bin/../share/java/confluent-metadata-service/netty-transport-native-kqueue-4.1.79.Final-osx-x86_64.jar:/usr/bin/../share/java/confluent-metadata-service/netty-transport-native-unix-common-4.1.79.Final.jar:/usr/bin/../share/java/confluent-metadata-service/osgi-resource-locator-1.0.3.jar:/usr/bin/../share/java/confluent-metadata-service/plexus-utils-3.3.0.jar:/usr/bin/../share/java/confluent-metadata-service/protobuf-java-3.19.6.jar:/usr/bin/../share/java/confluent-metadata-service/protobuf-java-util-3.19.6.jar:/usr/bin/../share/java/confluent-metadata-service/rbac-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/rbac-api-server-7.3.2.jar:/usr/bin/../share/java/confluent-metadata-service/rbac-common-7.3.2.jar:/usr/bin/../share/java/confluent-metadata-service/reflections-0.9.12.jar:/usr/bin/../share/java/confluent-metadata-service/reload4j-1.2.19.jar:/usr/bin/../share/java/confluent-metadata-service/rest-authorizer-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/security-extensions-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-servi
ce/snakeyaml-1.32.jar:/usr/bin/../share/java/confluent-metadata-service/snappy-java-1.1.8.4.jar:/usr/bin/../share/java/confluent-metadata-service/swagger-annotations-2.2.0.jar:/usr/bin/../share/java/confluent-metadata-service/telemetry-api-3.282.0.jar:/usr/bin/../share/java/confluent-metadata-service/telemetry-client-3.282.0.jar:/usr/bin/../share/java/confluent-metadata-service/telemetry-events-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/telemetry-events-api-7.3.2-ce.jar:/usr/bin/../share/java/confluent-metadata-service/zstd-jni-1.5.2-1.jar:/usr/bin/../share/java/rest-utils/activation-1.1.1.jar:/usr/bin/../share/java/rest-utils/aopalliance-repackaged-2.6.1.jar:/usr/bin/../share/java/rest-utils/asm-9.3.jar:/usr/bin/../share/java/rest-utils/asm-analysis-9.3.jar:/usr/bin/../share/java/rest-utils/asm-commons-9.3.jar:/usr/bin/../share/java/rest-utils/asm-tree-9.3.jar:/usr/bin/../share/java/rest-utils/checker-qual-3.8.0.jar:/usr/bin/../share/java/rest-utils/classmate-1.3.4.jar:/usr/bin/../share/java/rest-utils/error_prone_annotations-2.5.1.jar:/usr/bin/../share/java/rest-utils/failureaccess-1.0.1.jar:/usr/bin/../share/java/rest-utils/guava-30.1.1-jre.jar:/usr/bin/../share/java/rest-utils/hibernate-validator-6.1.7.Final.jar:/usr/bin/../share/java/rest-utils/hk2-api-2.6.1.jar:/usr/bin/../share/java/rest-utils/hk2-locator-2.6.1.jar:/usr/bin/../share/java/rest-utils/hk2-utils-2.6.1.jar:/usr/bin/../share/java/rest-utils/http2-common-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/http2-hpack-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/http2-server-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/j2objc-annotations-1.3.jar:/usr/bin/../share/java/rest-utils/jackson-annotations-2.13.4.jar:/usr/bin/../share/java/rest-utils/jackson-core-2.13.4.jar:/usr/bin/../share/java/rest-utils/jackson-databind-2.13.4.2.jar:/usr/bin/../share/java/rest-utils/jackson-jaxrs-base-2.13.4.jar:/usr/bin/../share/java/rest-utils/jackson-jaxrs-json-provider-2.13.
4.jar:/usr/bin/../share/java/rest-utils/jackson-module-jaxb-annotations-2.13.4.jar:/usr/bin/../share/java/rest-utils/jakarta.activation-api-1.2.1.jar:/usr/bin/../share/java/rest-utils/jakarta.annotation-api-1.3.5.jar:/usr/bin/../share/java/rest-utils/jakarta.el-3.0.4.jar:/usr/bin/../share/java/rest-utils/jakarta.el-api-4.0.0.jar:/usr/bin/../share/java/rest-utils/jakarta.inject-2.6.1.jar:/usr/bin/../share/java/rest-utils/jakarta.validation-api-2.0.2.jar:/usr/bin/../share/java/rest-utils/jakarta.ws.rs-api-2.1.6.jar:/usr/bin/../share/java/rest-utils/jakarta.xml.bind-api-2.3.3.jar:/usr/bin/../share/java/rest-utils/javassist-3.25.0-GA.jar:/usr/bin/../share/java/rest-utils/javax-websocket-client-impl-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/javax-websocket-server-impl-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/javax.annotation-api-1.3.2.jar:/usr/bin/../share/java/rest-utils/javax.servlet-api-3.1.0.jar:/usr/bin/../share/java/rest-utils/javax.websocket-api-1.0.jar:/usr/bin/../share/java/rest-utils/javax.websocket-client-api-1.0.jar:/usr/bin/../share/java/rest-utils/jaxb-api-2.3.0.jar:/usr/bin/../share/java/rest-utils/jboss-logging-3.3.2.Final.jar:/usr/bin/../share/java/rest-utils/jersey-bean-validation-2.36.jar:/usr/bin/../share/java/rest-utils/jersey-client-2.36.jar:/usr/bin/../share/java/rest-utils/jersey-common-2.36.jar:/usr/bin/../share/java/rest-utils/jersey-container-servlet-2.36.jar:/usr/bin/../share/java/rest-utils/jersey-container-servlet-core-2.36.jar:/usr/bin/../share/java/rest-utils/jersey-hk2-2.36.jar:/usr/bin/../share/java/rest-utils/jersey-server-2.36.jar:/usr/bin/../share/java/rest-utils/jetty-alpn-java-server-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-alpn-server-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-annotations-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-client-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-continuation-9.4.48.v20220622.jar:/usr/bin/../share/j
ava/rest-utils/jetty-http-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-io-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-jaas-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-jmx-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-jndi-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-plus-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-security-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-server-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-servlet-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-servlets-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-util-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-util-ajax-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-webapp-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jetty-xml-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/jsr305-3.0.2.jar:/usr/bin/../share/java/rest-utils/kafka-clients-7.3.2-ccs.jar:/usr/bin/../share/java/rest-utils/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/bin/../share/java/rest-utils/lz4-java-1.8.0.jar:/usr/bin/../share/java/rest-utils/osgi-resource-locator-1.0.3.jar:/usr/bin/../share/java/rest-utils/rest-utils-7.3.2.jar:/usr/bin/../share/java/rest-utils/snappy-java-1.1.8.4.jar:/usr/bin/../share/java/rest-utils/websocket-api-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/websocket-client-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/websocket-common-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/websocket-server-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/websocket-servlet-9.4.48.v20220622.jar:/usr/bin/../share/java/rest-utils/zstd-jni-1.5.2-1.jar:/usr/bin/../share/java/confluent-common/build-tools-7.3.2.jar:/usr/bin/../share/java/confluent-common/common-config-7.3.2.jar:/usr/bin/../share/java/confluent-common/common-metrics-7.3.2.jar:/usr/bin/../share/java/confluent-
common/common-utils-7.3.2.jar:/usr/bin/../share/java/confluent-common/slf4j-api-1.7.36.jar:/usr/bin/../share/java/ce-kafka-http-server/activation-1.1.1.jar:/usr/bin/../share/java/ce-kafka-http-server/aopalliance-repackaged-2.6.1.jar:/usr/bin/../share/java/ce-kafka-http-server/asm-9.3.jar:/usr/bin/../share/java/ce-kafka-http-server/asm-analysis-9.3.jar:/usr/bin/../share/java/ce-kafka-http-server/asm-commons-9.3.jar:/usr/bin/../share/java/ce-kafka-http-server/asm-tree-9.3.jar:/usr/bin/../share/java/ce-kafka-http-server/ce-kafka-http-server-7.3.2.jar:/usr/bin/../share/java/ce-kafka-http-server/checker-qual-3.8.0.jar:/usr/bin/../share/java/ce-kafka-http-server/classmate-1.3.4.jar:/usr/bin/../share/java/ce-kafka-http-server/common-utils-7.3.2.jar:/usr/bin/../share/java/ce-kafka-http-server/error_prone_annotations-2.5.1.jar:/usr/bin/../share/java/ce-kafka-http-server/failureaccess-1.0.1.jar:/usr/bin/../share/java/ce-kafka-http-server/guava-30.1.1-jre.jar:/usr/bin/../share/java/ce-kafka-http-server/hibernate-validator-6.1.7.Final.jar:/usr/bin/../share/java/ce-kafka-http-server/hk2-api-2.6.1.jar:/usr/bin/../share/java/ce-kafka-http-server/hk2-locator-2.6.1.jar:/usr/bin/../share/java/ce-kafka-http-server/hk2-utils-2.6.1.jar:/usr/bin/../share/java/ce-kafka-http-server/http2-common-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/http2-hpack-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/http2-server-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/j2objc-annotations-1.3.jar:/usr/bin/../share/java/ce-kafka-http-server/jackson-annotations-2.13.4.jar:/usr/bin/../share/java/ce-kafka-http-server/jackson-core-2.13.4.jar:/usr/bin/../share/java/ce-kafka-http-server/jackson-databind-2.13.4.2.jar:/usr/bin/../share/java/ce-kafka-http-server/jackson-jaxrs-base-2.13.4.jar:/usr/bin/../share/java/ce-kafka-http-server/jackson-jaxrs-json-provider-2.13.4.jar:/usr/bin/../share/java/ce-kafka-http-server/jackson-module-jaxb-annotations-2.13.4.jar:
/usr/bin/../share/java/ce-kafka-http-server/jakarta.activation-api-1.2.1.jar:/usr/bin/../share/java/ce-kafka-http-server/jakarta.annotation-api-1.3.5.jar:/usr/bin/../share/java/ce-kafka-http-server/jakarta.el-3.0.4.jar:/usr/bin/../share/java/ce-kafka-http-server/jakarta.el-api-4.0.0.jar:/usr/bin/../share/java/ce-kafka-http-server/jakarta.inject-2.6.1.jar:/usr/bin/../share/java/ce-kafka-http-server/jakarta.validation-api-2.0.2.jar:/usr/bin/../share/java/ce-kafka-http-server/jakarta.ws.rs-api-2.1.6.jar:/usr/bin/../share/java/ce-kafka-http-server/jakarta.xml.bind-api-2.3.3.jar:/usr/bin/../share/java/ce-kafka-http-server/javassist-3.25.0-GA.jar:/usr/bin/../share/java/ce-kafka-http-server/javax-websocket-client-impl-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/javax-websocket-server-impl-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/javax.annotation-api-1.3.2.jar:/usr/bin/../share/java/ce-kafka-http-server/javax.servlet-api-3.1.0.jar:/usr/bin/../share/java/ce-kafka-http-server/javax.websocket-api-1.0.jar:/usr/bin/../share/java/ce-kafka-http-server/javax.websocket-client-api-1.0.jar:/usr/bin/../share/java/ce-kafka-http-server/jaxb-api-2.3.0.jar:/usr/bin/../share/java/ce-kafka-http-server/jboss-logging-3.3.2.Final.jar:/usr/bin/../share/java/ce-kafka-http-server/jersey-bean-validation-2.36.jar:/usr/bin/../share/java/ce-kafka-http-server/jersey-client-2.36.jar:/usr/bin/../share/java/ce-kafka-http-server/jersey-common-2.36.jar:/usr/bin/../share/java/ce-kafka-http-server/jersey-container-servlet-2.36.jar:/usr/bin/../share/java/ce-kafka-http-server/jersey-container-servlet-core-2.36.jar:/usr/bin/../share/java/ce-kafka-http-server/jersey-hk2-2.36.jar:/usr/bin/../share/java/ce-kafka-http-server/jersey-server-2.36.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-alpn-java-server-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-alpn-server-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-annotat
ions-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-client-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-continuation-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-http-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-io-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-jaas-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-jmx-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-jndi-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-plus-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-security-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-server-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-servlet-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-servlets-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-util-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-util-ajax-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-webapp-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jetty-xml-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/jsr305-3.0.2.jar:/usr/bin/../share/java/ce-kafka-http-server/kafka-clients-7.3.2-ce.jar:/usr/bin/../share/java/ce-kafka-http-server/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/bin/../share/java/ce-kafka-http-server/lz4-java-1.8.0.jar:/usr/bin/../share/java/ce-kafka-http-server/osgi-resource-locator-1.0.3.jar:/usr/bin/../share/java/ce-kafka-http-server/rest-utils-7.3.2.jar:/usr/bin/../share/java/ce-kafka-http-server/slf4j-api-1.7.36.jar:/usr/bin/../share/java/ce-kafka-http-server/snappy-java-1.1.8.4.jar:/usr/bin/../share/java/ce-kafka-http-server/telemetry-events-api-7.3.2-ce.jar:/usr/bin/../share/java/ce-kafka-http-server/websocket-api-9.4.48.v20220622.jar
:/usr/bin/../share/java/ce-kafka-http-server/websocket-client-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/websocket-common-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/websocket-server-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/websocket-servlet-9.4.48.v20220622.jar:/usr/bin/../share/java/ce-kafka-http-server/zstd-jni-1.5.2-1.jar:/usr/bin/../share/java/ce-kafka-rest-servlet/ce-kafka-rest-servlet-7.3.2.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/ce-kafka-rest-extensions-7.3.2.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/common-utils-7.3.2.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/javax.ws.rs-api-2.1.1.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/kafka-clients-7.3.2-ce.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/lz4-java-1.8.0.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/slf4j-api-1.7.36.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/snappy-java-1.1.8.4.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/telemetry-events-api-7.3.2-ce.jar:/usr/bin/../share/java/ce-kafka-rest-extensions/zstd-jni-1.5.2-1.jar:/usr/bin/../share/java/kafka-rest-lib/annotations-13.0.jar:/usr/bin/../share/java/kafka-rest-lib/argparse4j-0.7.0.jar:/usr/bin/../share/java/kafka-rest-lib/audience-annotations-0.5.0.jar:/usr/bin/../share/java/kafka-rest-lib/auto-value-annotations-1.7.2.jar:/usr/bin/../share/java/kafka-rest-lib/avro-1.11.0.jar:/usr/bin/../share/java/kafka-rest-lib/checker-qual-3.8.0.jar:/usr/bin/../share/java/kafka-rest-lib/classgraph-4.8.21.jar:/usr/bin/../share/java/kafka-rest-lib/common-config-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/common-utils-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/commons-cli-1.4.jar:/usr/bin/../share/java/kafka-rest-lib/commons-collections-3.2.2.jar:/usr/bin/../share/java/kafka-rest-lib/commons-compress-1.21.jar:/usr/bin/../share/java/kafka-rest-lib/commons-digester-2.1.jar:/usr/bin/../share/java/kafka-rest-lib/commons-lang3-3.12.
0.jar:/usr/bin/../share/java/kafka-rest-lib/commons-logging-1.2.jar:/usr/bin/../share/java/kafka-rest-lib/commons-validator-1.7.jar:/usr/bin/../share/java/kafka-rest-lib/error_prone_annotations-2.5.1.jar:/usr/bin/../share/java/kafka-rest-lib/everit-json-schema-1.14.1.jar:/usr/bin/../share/java/kafka-rest-lib/failureaccess-1.0.1.jar:/usr/bin/../share/java/kafka-rest-lib/gson-2.9.0.jar:/usr/bin/../share/java/kafka-rest-lib/guava-30.1.1-jre.jar:/usr/bin/../share/java/kafka-rest-lib/handy-uri-templates-2.1.8.jar:/usr/bin/../share/java/kafka-rest-lib/j2objc-annotations-1.3.jar:/usr/bin/../share/java/kafka-rest-lib/jackson-core-2.13.4.jar:/usr/bin/../share/java/kafka-rest-lib/jackson-dataformat-csv-2.13.4.jar:/usr/bin/../share/java/kafka-rest-lib/jackson-datatype-guava-2.13.4.jar:/usr/bin/../share/java/kafka-rest-lib/jackson-datatype-jdk8-2.13.4.jar:/usr/bin/../share/java/kafka-rest-lib/jackson-datatype-joda-2.13.4.jar:/usr/bin/../share/java/kafka-rest-lib/jackson-datatype-jsr310-2.13.4.jar:/usr/bin/../share/java/kafka-rest-lib/jackson-module-parameter-names-2.13.4.jar:/usr/bin/../share/java/kafka-rest-lib/jackson-module-scala_2.13-2.13.4.jar:/usr/bin/../share/java/kafka-rest-lib/javapoet-1.13.0.jar:/usr/bin/../share/java/kafka-rest-lib/joda-time-2.10.8.jar:/usr/bin/../share/java/kafka-rest-lib/jopt-simple-5.0.4.jar:/usr/bin/../share/java/kafka-rest-lib/jose4j-0.7.9.jar:/usr/bin/../share/java/kafka-rest-lib/json-20220320.jar:/usr/bin/../share/java/kafka-rest-lib/jsr305-3.0.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-avro-serializer-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-clients-7.3.2-ccs.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-json-schema-provider-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-json-schema-serializer-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-json-serializer-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-metadata-7.3.2-ccs.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-protobuf-provider-7.3.2.jar:/usr/
bin/../share/java/kafka-rest-lib/kafka-protobuf-serializer-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-protobuf-types-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-raft-7.3.2-ccs.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-rest-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-schema-registry-client-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-schema-serializer-7.3.2.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-server-common-7.3.2-ccs.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-storage-7.3.2-ccs.jar:/usr/bin/../share/java/kafka-rest-lib/kafka-storage-api-7.3.2-ccs.jar:/usr/bin/../share/java/kafka-rest-lib/kafka_2.13-7.3.2-ccs.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-reflect-1.7.0.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-script-runtime-1.6.0.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-scripting-common-1.6.0.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-scripting-compiler-embeddable-1.6.0.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-scripting-compiler-impl-embeddable-1.6.0.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-scripting-jvm-1.6.0.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-stdlib-1.6.0.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-stdlib-common-1.6.0.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-stdlib-jdk7-1.6.10.jar:/usr/bin/../share/java/kafka-rest-lib/kotlin-stdlib-jdk8-1.6.10.jar:/usr/bin/../share/java/kafka-rest-lib/kotlinpoet-1.12.0.jar:/usr/bin/../share/java/kafka-rest-lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/bin/../share/java/kafka-rest-lib/logredactor-1.0.10.jar:/usr/bin/../share/java/kafka-rest-lib/logredactor-metrics-1.0.10.jar:/usr/bin/../share/java/kafka-rest-lib/lz4-java-1.8.0.jar:/usr/bin/../share/java/kafka-rest-lib/mbknor-jackson-jsonschema_2.13-1.0.39.jar:/usr/bin/../share/java/kafka-rest-lib/metrics-core-2.2.0.jar:/usr/bin/../share/java/kafka-rest-lib/metrics-core-4.1.12.1.jar:/usr/bin/../share/java/kafka-rest-lib/minimal-json-
0.9.5.jar:/usr/bin/../share/java/kafka-rest-lib/okio-3.0.0.jar:/usr/bin/../share/java/kafka-rest-lib/okio-jvm-3.0.0.jar:/usr/bin/../share/java/kafka-rest-lib/paranamer-2.8.jar:/usr/bin/../share/java/kafka-rest-lib/proto-google-common-protos-2.5.1.jar:/usr/bin/../share/java/kafka-rest-lib/protobuf-java-3.19.6.jar:/usr/bin/../share/java/kafka-rest-lib/protobuf-java-util-3.19.6.jar:/usr/bin/../share/java/kafka-rest-lib/re2j-1.6.jar:/usr/bin/../share/java/kafka-rest-lib/resilience4j-core-1.7.1.jar:/usr/bin/../share/java/kafka-rest-lib/resilience4j-ratelimiter-1.7.1.jar:/usr/bin/../share/java/kafka-rest-lib/scala-collection-compat_2.13-2.6.0.jar:/usr/bin/../share/java/kafka-rest-lib/scala-java8-compat_2.13-1.0.2.jar:/usr/bin/../share/java/kafka-rest-lib/scala-library-2.13.10.jar:/usr/bin/../share/java/kafka-rest-lib/scala-logging_2.13-3.9.4.jar:/usr/bin/../share/java/kafka-rest-lib/scala-reflect-2.13.10.jar:/usr/bin/../share/java/kafka-rest-lib/snappy-java-1.1.8.4.jar:/usr/bin/../share/java/kafka-rest-lib/spotbugs-annotations-4.7.1.jar:/usr/bin/../share/java/kafka-rest-lib/swagger-annotations-2.1.10.jar:/usr/bin/../share/java/kafka-rest-lib/validation-api-2.0.1.Final.jar:/usr/bin/../share/java/kafka-rest-lib/vavr-0.10.2.jar:/usr/bin/../share/java/kafka-rest-lib/vavr-match-0.10.2.jar:/usr/bin/../share/java/kafka-rest-lib/wire-runtime-jvm-4.4.3.jar:/usr/bin/../share/java/kafka-rest-lib/wire-schema-jvm-4.4.3.jar:/usr/bin/../share/java/kafka-rest-lib/zookeeper-3.6.3.jar:/usr/bin/../share/java/kafka-rest-lib/zookeeper-jute-3.6.3.jar:/usr/bin/../share/java/kafka-rest-lib/zstd-jni-1.5.2-1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/activation-1.1.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/agrona-1.15.2.jar:/usr/bin/../share/java/confluent-security/kafka-rest/annotations-3.0.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/aopalliance-repackaged-2.6.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/argparse4j-0.7.0.jar:/usr/bin/../
share/java/confluent-security/kafka-rest/auth-metadata-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/authorizer-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/avro-1.11.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/bc-fips-1.0.2.3.jar:/usr/bin/../share/java/confluent-security/kafka-rest/bcpkix-fips-1.0.6.jar:/usr/bin/../share/java/confluent-security/kafka-rest/bctls-fips-1.0.13.jar:/usr/bin/../share/java/confluent-security/kafka-rest/broker-plugins-7.3.2-ce-test.jar:/usr/bin/../share/java/confluent-security/kafka-rest/cel-core-0.3.5.jar:/usr/bin/../share/java/confluent-security/kafka-rest/cel-generated-antlr-0.3.5.jar:/usr/bin/../share/java/confluent-security/kafka-rest/cel-generated-pb-0.3.5.jar:/usr/bin/../share/java/confluent-security/kafka-rest/checker-qual-3.8.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/classgraph-4.8.138.jar:/usr/bin/../share/java/confluent-security/kafka-rest/cloudevents-api-2.3.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/cloudevents-core-2.3.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/cloudevents-json-jackson-2.3.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/cloudevents-kafka-2.3.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/cloudevents-protobuf-2.3.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/commons-compress-1.21.jar:/usr/bin/../share/java/confluent-security/kafka-rest/commons-lang3-3.12.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/confluent-kafka-rest-security-plugin-7.3.2.jar:/usr/bin/../share/java/confluent-security/kafka-rest/confluent-licensing-new-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/confluent-resource-names-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/confluent-security-plugins-common-7.3.2.jar:/usr/bin/../share/java/confluent-security/kafka-rest/confluent-serializers-new-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-res
t/connect-api-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/connect-ce-logs-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/connect-json-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/connect-runtime-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/connect-transforms-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/error_prone_annotations-2.5.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/failureaccess-1.0.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/gson-2.9.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/guava-30.1.1-jre.jar:/usr/bin/../share/java/confluent-security/kafka-rest/hk2-api-2.6.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/hk2-locator-2.6.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/hk2-utils-2.6.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/j2objc-annotations-1.3.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-annotations-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-core-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-databind-2.13.4.2.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-dataformat-properties-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-dataformat-yaml-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-datatype-jdk8-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-datatype-jsr310-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-datatype-protobuf-0.9.11-jackson2.9.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-jaxrs-base-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-jaxrs-json-provider-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jackson-module-jaxb-annotations-2.13.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jakarta.activat
ion-api-1.2.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jakarta.annotation-api-1.3.5.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jakarta.inject-2.6.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jakarta.validation-api-2.0.2.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jakarta.ws.rs-api-2.1.6.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jakarta.xml.bind-api-2.3.3.jar:/usr/bin/../share/java/confluent-security/kafka-rest/javassist-3.25.0-GA.jar:/usr/bin/../share/java/confluent-security/kafka-rest/javax.annotation-api-1.3.2.jar:/usr/bin/../share/java/confluent-security/kafka-rest/javax.servlet-api-4.0.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/javax.ws.rs-api-2.1.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jaxb-api-2.3.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jbcrypt-0.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jersey-client-2.36.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jersey-common-2.36.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jersey-container-servlet-2.36.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jersey-container-servlet-core-2.36.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jersey-hk2-2.36.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jersey-server-2.36.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-client-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-continuation-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-http-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-io-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-security-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-server-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-servlet-9.4.48.v20220622.jar:/usr/bin/../s
hare/java/confluent-security/kafka-rest/jetty-servlets-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-util-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jetty-util-ajax-9.4.48.v20220622.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jose4j-0.7.9.jar:/usr/bin/../share/java/confluent-security/kafka-rest/jsr305-3.0.2.jar:/usr/bin/../share/java/confluent-security/kafka-rest/kafka-client-plugins-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/kafka-clients-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/kafka-log4j-appender-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/kafka-schema-registry-client-7.3.2.jar:/usr/bin/../share/java/confluent-security/kafka-rest/kafka-server-common-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/kafka-tools-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/bin/../share/java/confluent-security/kafka-rest/lz4-java-1.8.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/maven-artifact-3.8.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/metrics-core-2.2.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/metrics-core-4.1.12.1.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-buffer-4.1.79.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-codec-4.1.79.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-codec-http-4.1.79.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-codec-socks-4.1.78.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-common-4.1.79.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-handler-4.1.79.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-handler-proxy-4.1.78.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-resolver-4.1.79.Final.
jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-transport-4.1.79.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-transport-classes-epoll-4.1.78.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-transport-classes-kqueue-4.1.78.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-transport-native-epoll-4.1.78.Final-linux-x86_64.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-transport-native-kqueue-4.1.78.Final-osx-x86_64.jar:/usr/bin/../share/java/confluent-security/kafka-rest/netty-transport-native-unix-common-4.1.79.Final.jar:/usr/bin/../share/java/confluent-security/kafka-rest/osgi-resource-locator-1.0.3.jar:/usr/bin/../share/java/confluent-security/kafka-rest/plexus-utils-3.3.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/protobuf-java-3.19.6.jar:/usr/bin/../share/java/confluent-security/kafka-rest/protobuf-java-util-3.19.6.jar:/usr/bin/../share/java/confluent-security/kafka-rest/rbac-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/reflections-0.9.12.jar:/usr/bin/../share/java/confluent-security/kafka-rest/rest-authorizer-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/security-extensions-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/snakeyaml-1.31.jar:/usr/bin/../share/java/confluent-security/kafka-rest/snappy-java-1.1.8.4.jar:/usr/bin/../share/java/confluent-security/kafka-rest/swagger-annotations-2.2.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/swagger-core-2.2.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/swagger-integration-2.2.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/swagger-jaxrs2-2.2.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/swagger-models-2.2.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/telemetry-api-3.282.0.jar:/usr/bin/../share/java/confluent-security/kafka-rest/telemetry-client-3.282.0.jar:/usr/bin/../share/java/confluent-sec
urity/kafka-rest/telemetry-events-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/telemetry-events-api-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/kafka-rest/zstd-jni-1.5.2-1.jar:/usr/bin/../share/java/confluent-security/schema-validator/annotations-13.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/avro-1.11.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/bc-fips-1.0.2.3.jar:/usr/bin/../share/java/confluent-security/schema-validator/bcpkix-fips-1.0.6.jar:/usr/bin/../share/java/confluent-security/schema-validator/bctls-fips-1.0.13.jar:/usr/bin/../share/java/confluent-security/schema-validator/caffeine-2.8.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/checker-qual-3.8.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/classgraph-4.8.21.jar:/usr/bin/../share/java/confluent-security/schema-validator/common-utils-7.3.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/commons-collections-3.2.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/commons-compress-1.21.jar:/usr/bin/../share/java/confluent-security/schema-validator/commons-digester-2.1.jar:/usr/bin/../share/java/confluent-security/schema-validator/commons-lang3-3.12.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/commons-logging-1.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/commons-validator-1.7.jar:/usr/bin/../share/java/confluent-security/schema-validator/confluent-schema-registry-validator-plugin-7.3.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/error_prone_annotations-2.5.1.jar:/usr/bin/../share/java/confluent-security/schema-validator/everit-json-schema-1.14.1.jar:/usr/bin/../share/java/confluent-security/schema-validator/failureaccess-1.0.1.jar:/usr/bin/../share/java/confluent-security/schema-validator/gson-2.9.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/guava-30.1.1-jre.jar:/usr/bin/../share/java/confluent
-security/schema-validator/handy-uri-templates-2.1.8.jar:/usr/bin/../share/java/confluent-security/schema-validator/j2objc-annotations-1.3.jar:/usr/bin/../share/java/confluent-security/schema-validator/jackson-annotations-2.13.4.jar:/usr/bin/../share/java/confluent-security/schema-validator/jackson-core-2.13.4.jar:/usr/bin/../share/java/confluent-security/schema-validator/jackson-databind-2.13.4.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/jackson-datatype-guava-2.13.4.jar:/usr/bin/../share/java/confluent-security/schema-validator/jackson-datatype-jdk8-2.13.4.jar:/usr/bin/../share/java/confluent-security/schema-validator/jackson-datatype-joda-2.13.4.jar:/usr/bin/../share/java/confluent-security/schema-validator/jackson-datatype-jsr310-2.13.4.jar:/usr/bin/../share/java/confluent-security/schema-validator/jackson-module-parameter-names-2.13.4.jar:/usr/bin/../share/java/confluent-security/schema-validator/javapoet-1.13.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/joda-time-2.10.8.jar:/usr/bin/../share/java/confluent-security/schema-validator/json-20220320.jar:/usr/bin/../share/java/confluent-security/schema-validator/jsr305-3.0.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/kafka-avro-serializer-7.3.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/kafka-clients-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/schema-validator/kafka-json-schema-provider-7.3.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/kafka-protobuf-provider-7.3.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/kafka-protobuf-types-7.3.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/kafka-schema-registry-client-7.3.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/kafka-schema-serializer-7.3.2.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-reflect-1.7.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-script-runtime
-1.6.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-scripting-common-1.6.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-scripting-compiler-embeddable-1.6.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-scripting-compiler-impl-embeddable-1.6.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-scripting-jvm-1.6.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-stdlib-1.6.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-stdlib-common-1.6.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-stdlib-jdk7-1.6.10.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlin-stdlib-jdk8-1.6.10.jar:/usr/bin/../share/java/confluent-security/schema-validator/kotlinpoet-1.12.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/bin/../share/java/confluent-security/schema-validator/logredactor-1.0.10.jar:/usr/bin/../share/java/confluent-security/schema-validator/logredactor-metrics-1.0.10.jar:/usr/bin/../share/java/confluent-security/schema-validator/lz4-java-1.8.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/mbknor-jackson-jsonschema_2.13-1.0.39.jar:/usr/bin/../share/java/confluent-security/schema-validator/minimal-json-0.9.5.jar:/usr/bin/../share/java/confluent-security/schema-validator/okio-3.0.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/okio-jvm-3.0.0.jar:/usr/bin/../share/java/confluent-security/schema-validator/proto-google-common-protos-2.5.1.jar:/usr/bin/../share/java/confluent-security/schema-validator/protobuf-java-3.19.6.jar:/usr/bin/../share/java/confluent-security/schema-validator/protobuf-java-util-3.19.6.jar:/usr/bin/../share/java/confluent-security/schema-validator/re2j-1.6.jar:/usr/bin/../share/java/confluent-security/schema-validator/scala-library-2.13.10.jar:/usr/bin/../share/java/c
onfluent-security/schema-validator/snappy-java-1.1.8.4.jar:/usr/bin/../share/java/confluent-security/schema-validator/swagger-annotations-2.1.10.jar:/usr/bin/../share/java/confluent-security/schema-validator/telemetry-events-api-7.3.2-ce.jar:/usr/bin/../share/java/confluent-security/schema-validator/validation-api-2.0.1.Final.jar:/usr/bin/../share/java/confluent-security/schema-validator/wire-runtime-jvm-4.4.3.jar:/usr/bin/../share/java/confluent-security/schema-validator/wire-schema-jvm-4.4.3.jar:/usr/bin/../share/java/confluent-security/schema-validator/zstd-jni-1.5.2-1.jar:/usr/bin/../support-metrics-client/build/dependant-libs-2.13.10/*:/usr/bin/../support-metrics-client/build/libs/*:/usr/bin/../share/java/confluent-telemetry/confluent-metrics-7.3.2-ce.jar:/usr/share/java/support-metrics-client/*
os.spec = Linux, amd64, 5.10.184-175.749.amzn2.x86_64
os.vcpus = 4

[INFO] 2023-10-10 10:39:21,321 [main] org.apache.kafka.connect.cli.ConnectDistributed startConnect - Scanning for plugin classes. This might take a moment ...
[INFO] 2023-10-10 10:39:21,347 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/cp-base-new
[INFO] 2023-10-10 10:39:22,884 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/cp-base-new/}
[INFO] 2023-10-10 10:39:22,885 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider'
[INFO] 2023-10-10 10:39:22,885 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.common.config.provider.DirectoryConfigProvider'
[INFO] 2023-10-10 10:39:22,885 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy'
[INFO] 2023-10-10 10:39:22,885 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy'
[INFO] 2023-10-10 10:39:22,885 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy'
[INFO] 2023-10-10 10:39:22,887 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/acl
[INFO] 2023-10-10 10:39:29,447 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/acl/}
[INFO] 2023-10-10 10:39:29,448 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector'
[INFO] 2023-10-10 10:39:29,448 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector'
[INFO] 2023-10-10 10:39:29,448 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector'
[INFO] 2023-10-10 10:39:29,448 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector'
[INFO] 2023-10-10 10:39:29,448 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector'
[INFO] 2023-10-10 10:39:29,448 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.converters.DoubleConverter'
[INFO] 2023-10-10 10:39:29,448 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.converters.FloatConverter'
[INFO] 2023-10-10 10:39:29,448 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.confluent.connect.avro.AvroConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.converters.IntegerConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.converters.LongConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.confluent.connect.json.JsonSchemaConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.confluent.connect.protobuf.ProtobufConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.json.JsonConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.storage.StringConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.converters.ShortConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter'
[INFO] 2023-10-10 10:39:29,449 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.Filter'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.HeaderFrom$Key'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.RegexRouter'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value'
[INFO] 2023-10-10 10:39:29,450 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.ValueToKey'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.DropHeaders'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.Cast$Key'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.Cast$Value'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.InsertHeader'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.HeaderFrom$Value'
[INFO] 2023-10-10 10:39:29,451 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key'
[INFO] 2023-10-10 10:39:29,452 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value'
[INFO] 2023-10-10 10:39:29,452 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value'
[INFO] 2023-10-10 10:39:29,452 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key'
[INFO] 2023-10-10 10:39:29,452 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey'
[INFO] 2023-10-10 10:39:29,452 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone'
[INFO] 2023-10-10 10:39:29,452 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches'
[INFO] 2023-10-10 10:39:29,452 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.confluent.kafka.secretregistry.client.config.provider.SecretConfigProvider'
[INFO] 2023-10-10 10:39:29,452 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.confluent.connect.security.ConnectSecurityExtension'
[INFO] 2023-10-10 10:39:29,460 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/confluent-control-center
[INFO] 2023-10-10 10:39:33,946 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-control-center/}
[INFO] 2023-10-10 10:39:33,950 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/confluent-hub-client
[INFO] 2023-10-10 10:39:34,181 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-hub-client/}
[INFO] 2023-10-10 10:39:34,182 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/kafka-serde-tools
[INFO] 2023-10-10 10:39:34,914 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-serde-tools/}
[INFO] 2023-10-10 10:39:34,914 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/monitoring-interceptors
[INFO] 2023-10-10 10:39:35,144 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/monitoring-interceptors/}
[INFO] 2023-10-10 10:39:35,146 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/schema-registry
[INFO] 2023-10-10 10:39:36,485 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/schema-registry/}
[INFO] 2023-10-10 10:39:36,486 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/ce-kafka-http-server
[INFO] 2023-10-10 10:39:37,021 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/ce-kafka-http-server/}
[INFO] 2023-10-10 10:39:37,023 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/ce-kafka-rest-extensions
[INFO] 2023-10-10 10:39:37,227 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/ce-kafka-rest-extensions/}
[INFO] 2023-10-10 10:39:37,228 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/ce-kafka-rest-servlet
[INFO] 2023-10-10 10:39:37,234 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/ce-kafka-rest-servlet/}
[INFO] 2023-10-10 10:39:37,235 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/confluent-common
[INFO] 2023-10-10 10:39:37,273 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-common/}
[INFO] 2023-10-10 10:39:37,274 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/confluent-metadata-service
[INFO] 2023-10-10 10:39:38,252 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-metadata-service/}
[INFO] 2023-10-10 10:39:38,269 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/confluent-rebalancer
[INFO] 2023-10-10 10:39:40,536 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-rebalancer/}
[INFO] 2023-10-10 10:39:40,538 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/confluent-security
[INFO] 2023-10-10 10:39:47,443 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-security/}
[INFO] 2023-10-10 10:39:47,445 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/confluent-telemetry
[INFO] 2023-10-10 10:39:48,030 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/confluent-telemetry/}
[INFO] 2023-10-10 10:39:48,031 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/kafka
[INFO] 2023-10-10 10:39:50,724 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka/}
[INFO] 2023-10-10 10:39:50,724 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector'
[INFO] 2023-10-10 10:39:50,724 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector'
[INFO] 2023-10-10 10:39:50,724 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector'
[INFO] 2023-10-10 10:39:50,724 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.confluent.connect.rest.datapreview.extension.util.PreviewRecordTransformer'
[INFO] 2023-10-10 10:39:50,724 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension'
[INFO] 2023-10-10 10:39:50,724 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.confluent.connect.rest.datapreview.extension.ConnectorDataPreviewRestExtension'
[INFO] 2023-10-10 10:39:50,725 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/kafka-rest-bin
[INFO] 2023-10-10 10:39:50,744 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-rest-bin/}
[INFO] 2023-10-10 10:39:50,745 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/kafka-rest-lib
[INFO] 2023-10-10 10:39:51,806 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka-rest-lib/}
[INFO] 2023-10-10 10:39:51,807 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/java/rest-utils
[INFO] 2023-10-10 10:39:52,241 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/rest-utils/}
[INFO] 2023-10-10 10:39:52,243 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader registerPlugin - Loading plugin from: /usr/share/confluent-hub-components/debezium-debezium-connector-sqlserver
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader scanUrlsAndAddPlugins - Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/confluent-hub-components/debezium-debezium-connector-sqlserver/}
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.connector.sqlserver.SqlServerConnector'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.converters.ByteArrayConverter'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.converters.BinaryDataConverter'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.converters.CloudEventsConverter'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.transforms.ExtractNewRecordState'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.transforms.outbox.EventRouter'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.transforms.ExtractChangedRecordState'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.transforms.HeaderToValue'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.transforms.partitions.PartitionRouting'
[INFO] 2023-10-10 10:39:52,411 [main] org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins - Added plugin 'io.debezium.transforms.ByLogicalTableRouter'
[INFO] 2023-10-10 10:39:52,412 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins
- Added plugin 'io.debezium.transforms.partitions.ComputePartition'
[INFO] 2023-10-10 10:39:52,412 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addPlugins
- Added plugin 'io.debezium.transforms.tracing.ActivateTracingSpan'
[INFO] 2023-10-10 10:40:02,945 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader
scanUrlsAndAddPlugins - Registered loader:
jdk.internal.loader.ClassLoaders$AppClassLoader@2c13da15
[INFO] 2023-10-10 10:40:02,947 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'SqlServerConnector' and 'SqlServer' to plugin
'io.debezium.connector.sqlserver.SqlServerConnector'
[INFO] 2023-10-10 10:40:02,947 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to
plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector'
[INFO] 2023-10-10 10:40:02,948 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin
'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector'
[INFO] 2023-10-10 10:40:02,948 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin
'org.apache.kafka.connect.mirror.MirrorSourceConnector'
[INFO] 2023-10-10 10:40:02,948 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'MockSinkConnector' and 'MockSink' to plugin
'org.apache.kafka.connect.tools.MockSinkConnector'
[INFO] 2023-10-10 10:40:02,948 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'MockSourceConnector' and 'MockSource' to plugin
'org.apache.kafka.connect.tools.MockSourceConnector'
[INFO] 2023-10-10 10:40:02,948 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'SchemaSourceConnector' and 'SchemaSource' to plugin
'org.apache.kafka.connect.tools.SchemaSourceConnector'
[INFO] 2023-10-10 10:40:02,948 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin
'org.apache.kafka.connect.tools.VerifiableSinkConnector'
[INFO] 2023-10-10 10:40:02,948 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to
plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector'
[INFO] 2023-10-10 10:40:02,949 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'AvroConverter' and 'Avro' to plugin
'io.confluent.connect.avro.AvroConverter'
[INFO] 2023-10-10 10:40:02,949 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'JsonSchemaConverter' and 'JsonSchema' to plugin
'io.confluent.connect.json.JsonSchemaConverter'
[INFO] 2023-10-10 10:40:02,949 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'ProtobufConverter' and 'Protobuf' to plugin
'io.confluent.connect.protobuf.ProtobufConverter'
[INFO] 2023-10-10 10:40:02,949 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'BinaryDataConverter' and 'BinaryData' to plugin
'io.debezium.converters.BinaryDataConverter'
[INFO] 2023-10-10 10:40:02,949 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'CloudEventsConverter' and 'CloudEvents' to plugin
'io.debezium.converters.CloudEventsConverter'
[INFO] 2023-10-10 10:40:02,949 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'DoubleConverter' and 'Double' to plugin
'org.apache.kafka.connect.converters.DoubleConverter'
[INFO] 2023-10-10 10:40:02,949 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'FloatConverter' and 'Float' to plugin
'org.apache.kafka.connect.converters.FloatConverter'
[INFO] 2023-10-10 10:40:02,950 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'IntegerConverter' and 'Integer' to plugin
'org.apache.kafka.connect.converters.IntegerConverter'
[INFO] 2023-10-10 10:40:02,950 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'LongConverter' and 'Long' to plugin
'org.apache.kafka.connect.converters.LongConverter'
[INFO] 2023-10-10 10:40:02,950 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'ShortConverter' and 'Short' to plugin
'org.apache.kafka.connect.converters.ShortConverter'
[INFO] 2023-10-10 10:40:02,950 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'JsonConverter' and 'Json' to plugin
'org.apache.kafka.connect.json.JsonConverter'
[INFO] 2023-10-10 10:40:02,950 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'StringConverter' and 'String' to plugin
'org.apache.kafka.connect.storage.StringConverter'
[INFO] 2023-10-10 10:40:02,950 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'BinaryDataConverter' and 'BinaryData' to plugin
'io.debezium.converters.BinaryDataConverter'
[INFO] 2023-10-10 10:40:02,950 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'DoubleConverter' and 'Double' to plugin
'org.apache.kafka.connect.converters.DoubleConverter'
[INFO] 2023-10-10 10:40:02,951 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'FloatConverter' and 'Float' to plugin
'org.apache.kafka.connect.converters.FloatConverter'
[INFO] 2023-10-10 10:40:02,951 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'IntegerConverter' and 'Integer' to plugin
'org.apache.kafka.connect.converters.IntegerConverter'
[INFO] 2023-10-10 10:40:02,951 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'LongConverter' and 'Long' to plugin
'org.apache.kafka.connect.converters.LongConverter'
[INFO] 2023-10-10 10:40:02,951 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'ShortConverter' and 'Short' to plugin
'org.apache.kafka.connect.converters.ShortConverter'
[INFO] 2023-10-10 10:40:02,951 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'JsonConverter' and 'Json' to plugin
'org.apache.kafka.connect.json.JsonConverter'
[INFO] 2023-10-10 10:40:02,951 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'SimpleHeaderConverter' and 'Simple' to plugin
'org.apache.kafka.connect.storage.SimpleHeaderConverter'
[INFO] 2023-10-10 10:40:02,951 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'StringConverter' and 'String' to plugin
'org.apache.kafka.connect.storage.StringConverter'
[INFO] 2023-10-10 10:40:02,951 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'PreviewRecordTransformer' to plugin
'io.confluent.connect.rest.datapreview.extension.util.PreviewRecordTransformer'
[INFO] 2023-10-10 10:40:02,952 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'ByLogicalTableRouter' to plugin
'io.debezium.transforms.ByLogicalTableRouter'
[INFO] 2023-10-10 10:40:02,952 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'ExtractChangedRecordState' to plugin
'io.debezium.transforms.ExtractChangedRecordState'
[INFO] 2023-10-10 10:40:02,952 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'ExtractNewRecordState' to plugin
'io.debezium.transforms.ExtractNewRecordState'
[INFO] 2023-10-10 10:40:02,952 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'HeaderToValue' to plugin
'io.debezium.transforms.HeaderToValue'
[INFO] 2023-10-10 10:40:02,952 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'EventRouter' to plugin
'io.debezium.transforms.outbox.EventRouter'
[INFO] 2023-10-10 10:40:02,952 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'ComputePartition' to plugin
'io.debezium.transforms.partitions.ComputePartition'
[INFO] 2023-10-10 10:40:02,952 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'PartitionRouting' to plugin
'io.debezium.transforms.partitions.PartitionRouting'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'ActivateTracingSpan' to plugin
'io.debezium.transforms.tracing.ActivateTracingSpan'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'PredicatedTransformation' and 'Predicated' to plugin
'org.apache.kafka.connect.runtime.PredicatedTransformation'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'DropHeaders' to plugin
'org.apache.kafka.connect.transforms.DropHeaders'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'Filter' to plugin
'org.apache.kafka.connect.transforms.Filter'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'InsertHeader' to plugin
'org.apache.kafka.connect.transforms.InsertHeader'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'RegexRouter' to plugin
'org.apache.kafka.connect.transforms.RegexRouter'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'TimestampRouter' to plugin
'org.apache.kafka.connect.transforms.TimestampRouter'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'ValueToKey' to plugin
'org.apache.kafka.connect.transforms.ValueToKey'
[INFO] 2023-10-10 10:40:02,953 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'HasHeaderKey' to plugin
'org.apache.kafka.connect.transforms.predicates.HasHeaderKey'
[INFO] 2023-10-10 10:40:02,954 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'RecordIsTombstone' to plugin
'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone'
[INFO] 2023-10-10 10:40:02,954 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'TopicNameMatches' to plugin
'org.apache.kafka.connect.transforms.predicates.TopicNameMatches'
[INFO] 2023-10-10 10:40:02,954 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'ConnectorDataPreviewRestExtension' to plugin
'io.confluent.connect.rest.datapreview.extension.ConnectorDataPreviewRestExtension'
[INFO] 2023-10-10 10:40:02,954 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'ConnectSecurityExtension' to plugin
'io.confluent.connect.security.ConnectSecurityExtension'
[INFO] 2023-10-10 10:40:02,954 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added alias 'BasicAuthSecurityRestExtension' to plugin
'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension'
[INFO] 2023-10-10 10:40:02,954 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to
plugin
'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy'
[INFO] 2023-10-10 10:40:02,954 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to
plugin
'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy'
[INFO] 2023-10-10 10:40:02,954 [main]
org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader addAliases
- Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and
'Principal' to plugin
'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy'
[INFO] 2023-10-10 10:40:03,073 [main]
org.apache.kafka.connect.runtime.distributed.DistributedConfig logAll -
DistributedConfig values:
access.control.allow.methods =
access.control.allow.origin =
admin.listeners = null
bootstrap.servers = [b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093, b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093, b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
client.dns.lookup = use_all_dns_ips
client.id =
config.providers = [file]
config.storage.replication.factor = 3
config.storage.topic = stc.dev.sql.src.config.storage.topic
confluent.connector.task.status.metrics = false
confluent.license = [hidden]
confluent.license.inject.into.connectors = true
confluent.topic = _confluent-command
confluent.topic.bootstrap.servers = []
confluent.topic.client.dns.lookup = use_all_dns_ips
confluent.topic.client.id =
confluent.topic.connections.max.idle.ms = 540000
confluent.topic.consumer.allow.auto.create.topics = true
confluent.topic.consumer.auto.commit.interval.ms = 5000
confluent.topic.consumer.auto.offset.reset = latest
confluent.topic.consumer.check.crcs = true
confluent.topic.consumer.client.dns.lookup = use_all_dns_ips
confluent.topic.consumer.client.id =
confluent.topic.consumer.client.rack =
confluent.topic.consumer.connections.max.idle.ms = 540000
confluent.topic.consumer.default.api.timeout.ms = 60000
confluent.topic.consumer.enable.auto.commit = true
confluent.topic.consumer.exclude.internal.topics = true
confluent.topic.consumer.fetch.max.bytes = 52428800
confluent.topic.consumer.fetch.max.wait.ms = 500
confluent.topic.consumer.fetch.min.bytes = 1
confluent.topic.consumer.group.id = null
confluent.topic.consumer.group.instance.id = null
confluent.topic.consumer.heartbeat.interval.ms = 3000
confluent.topic.consumer.interceptor.classes = []
confluent.topic.consumer.internal.leave.group.on.close = true
confluent.topic.consumer.internal.throw.on.fetch.stable.offset.unsupported = false
confluent.topic.consumer.isolation.level = read_uncommitted
confluent.topic.consumer.max.partition.fetch.bytes = 1048576
confluent.topic.consumer.max.poll.interval.ms = 300000
confluent.topic.consumer.max.poll.records = 500
confluent.topic.consumer.metadata.max.age.ms = 300000
confluent.topic.consumer.metric.reporters = []
confluent.topic.consumer.metrics.num.samples = 2
confluent.topic.consumer.metrics.recording.level = INFO
confluent.topic.consumer.metrics.sample.window.ms = 30000
confluent.topic.consumer.partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
confluent.topic.consumer.receive.buffer.bytes = 65536
confluent.topic.consumer.reconnect.backoff.max.ms = 1000
confluent.topic.consumer.reconnect.backoff.ms = 50
confluent.topic.consumer.request.timeout.ms = 30000
confluent.topic.consumer.retry.backoff.ms = 100
confluent.topic.consumer.sasl.client.callback.handler.class = null
confluent.topic.consumer.sasl.jaas.config = null
confluent.topic.consumer.sasl.kerberos.kinit.cmd = /usr/bin/kinit
confluent.topic.consumer.sasl.kerberos.min.time.before.relogin = 60000
confluent.topic.consumer.sasl.kerberos.service.name = null
confluent.topic.consumer.sasl.kerberos.ticket.renew.jitter = 0.05
confluent.topic.consumer.sasl.kerberos.ticket.renew.window.factor = 0.8
confluent.topic.consumer.sasl.login.callback.handler.class = null
confluent.topic.consumer.sasl.login.class = null
confluent.topic.consumer.sasl.login.connect.timeout.ms = null
confluent.topic.consumer.sasl.login.read.timeout.ms = null
confluent.topic.consumer.sasl.login.refresh.buffer.seconds = 300
confluent.topic.consumer.sasl.login.refresh.min.period.seconds = 60
confluent.topic.consumer.sasl.login.refresh.window.factor = 0.8
confluent.topic.consumer.sasl.login.refresh.window.jitter = 0.05
confluent.topic.consumer.sasl.login.retry.backoff.max.ms = 10000
confluent.topic.consumer.sasl.login.retry.backoff.ms = 100
confluent.topic.consumer.sasl.mechanism = GSSAPI
confluent.topic.consumer.sasl.oauthbearer.clock.skew.seconds = 30
confluent.topic.consumer.sasl.oauthbearer.expected.audience = null
confluent.topic.consumer.sasl.oauthbearer.expected.issuer = null
confluent.topic.consumer.sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
confluent.topic.consumer.sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
confluent.topic.consumer.sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
confluent.topic.consumer.sasl.oauthbearer.jwks.endpoint.url = null
confluent.topic.consumer.sasl.oauthbearer.scope.claim.name = scope
confluent.topic.consumer.sasl.oauthbearer.sub.claim.name = sub
confluent.topic.consumer.sasl.oauthbearer.token.endpoint.url = null
confluent.topic.consumer.security.protocol = PLAINTEXT
confluent.topic.consumer.security.providers = null
confluent.topic.consumer.send.buffer.bytes = 131072
confluent.topic.consumer.session.timeout.ms = 45000
confluent.topic.consumer.socket.connection.setup.timeout.max.ms = 30000
confluent.topic.consumer.socket.connection.setup.timeout.ms = 10000
confluent.topic.consumer.ssl.cipher.suites = null
confluent.topic.consumer.ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
confluent.topic.consumer.ssl.endpoint.identification.algorithm = https
confluent.topic.consumer.ssl.engine.factory.class = null
confluent.topic.consumer.ssl.key.password = null
confluent.topic.consumer.ssl.keymanager.algorithm = SunX509
confluent.topic.consumer.ssl.keystore.certificate.chain = null
confluent.topic.consumer.ssl.keystore.key = null
confluent.topic.consumer.ssl.keystore.location = null
confluent.topic.consumer.ssl.keystore.password = null
confluent.topic.consumer.ssl.keystore.type = JKS
confluent.topic.consumer.ssl.protocol = TLSv1.3
confluent.topic.consumer.ssl.provider = null
confluent.topic.consumer.ssl.secure.random.implementation = null
confluent.topic.consumer.ssl.trustmanager.algorithm = PKIX
confluent.topic.consumer.ssl.truststore.certificates = null
confluent.topic.consumer.ssl.truststore.location = null
confluent.topic.consumer.ssl.truststore.password = null
confluent.topic.consumer.ssl.truststore.type = JKS
confluent.topic.interceptor.classes = []
confluent.topic.metadata.max.age.ms = 300000
confluent.topic.metric.reporters = []
confluent.topic.metrics.num.samples = 2
confluent.topic.metrics.recording.level = INFO
confluent.topic.metrics.sample.window.ms = 30000
confluent.topic.producer.acks = all
confluent.topic.producer.batch.size = 16384
confluent.topic.producer.buffer.memory = 33554432
confluent.topic.producer.client.dns.lookup = use_all_dns_ips
confluent.topic.producer.client.id =
confluent.topic.producer.compression.type = none
confluent.topic.producer.connections.max.idle.ms = 540000
confluent.topic.producer.delivery.timeout.ms = 120000
confluent.topic.producer.enable.idempotence = true
confluent.topic.producer.interceptor.classes = []
confluent.topic.producer.linger.ms = 0
confluent.topic.producer.max.block.ms = 60000
confluent.topic.producer.max.in.flight.requests.per.connection = 5
confluent.topic.producer.max.request.size = 1048576
confluent.topic.producer.metadata.max.age.ms = 300000
confluent.topic.producer.metadata.max.idle.ms = 300000
confluent.topic.producer.metric.reporters = []
confluent.topic.producer.metrics.num.samples = 2
confluent.topic.producer.metrics.recording.level = INFO
confluent.topic.producer.metrics.sample.window.ms = 30000
confluent.topic.producer.partitioner.adaptive.partitioning.enable = true
confluent.topic.producer.partitioner.availability.timeout.ms = 0
confluent.topic.producer.partitioner.class = null
confluent.topic.producer.partitioner.ignore.keys = false
confluent.topic.producer.receive.buffer.bytes = 32768
confluent.topic.producer.reconnect.backoff.max.ms = 1000
confluent.topic.producer.reconnect.backoff.ms = 50
confluent.topic.producer.request.timeout.ms = 30000
confluent.topic.producer.retry.backoff.ms = 100
confluent.topic.producer.sasl.client.callback.handler.class = null
confluent.topic.producer.sasl.jaas.config = null
confluent.topic.producer.sasl.kerberos.kinit.cmd = /usr/bin/kinit
confluent.topic.producer.sasl.kerberos.min.time.before.relogin = 60000
confluent.topic.producer.sasl.kerberos.service.name = null
confluent.topic.producer.sasl.kerberos.ticket.renew.jitter = 0.05
confluent.topic.producer.sasl.kerberos.ticket.renew.window.factor = 0.8
confluent.topic.producer.sasl.login.callback.handler.class = null
confluent.topic.producer.sasl.login.class = null
confluent.topic.producer.sasl.login.connect.timeout.ms = null
confluent.topic.producer.sasl.login.read.timeout.ms = null
confluent.topic.producer.sasl.login.refresh.buffer.seconds = 300
confluent.topic.producer.sasl.login.refresh.min.period.seconds = 60
confluent.topic.producer.sasl.login.refresh.window.factor = 0.8
confluent.topic.producer.sasl.login.refresh.window.jitter = 0.05
confluent.topic.producer.sasl.login.retry.backoff.max.ms = 10000
confluent.topic.producer.sasl.login.retry.backoff.ms = 100
confluent.topic.producer.sasl.mechanism = GSSAPI
confluent.topic.producer.sasl.oauthbearer.clock.skew.seconds = 30
confluent.topic.producer.sasl.oauthbearer.expected.audience = null
confluent.topic.producer.sasl.oauthbearer.expected.issuer = null
confluent.topic.producer.sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
confluent.topic.producer.sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
confluent.topic.producer.sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
confluent.topic.producer.sasl.oauthbearer.jwks.endpoint.url = null
confluent.topic.producer.sasl.oauthbearer.scope.claim.name = scope
confluent.topic.producer.sasl.oauthbearer.sub.claim.name = sub
confluent.topic.producer.sasl.oauthbearer.token.endpoint.url = null
confluent.topic.producer.security.protocol = PLAINTEXT
confluent.topic.producer.security.providers = null
confluent.topic.producer.send.buffer.bytes = 131072
confluent.topic.producer.socket.connection.setup.timeout.max.ms = 30000
confluent.topic.producer.socket.connection.setup.timeout.ms = 10000
confluent.topic.producer.ssl.cipher.suites = null
confluent.topic.producer.ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
confluent.topic.producer.ssl.endpoint.identification.algorithm = https
confluent.topic.producer.ssl.engine.factory.class = null
confluent.topic.producer.ssl.key.password = null
confluent.topic.producer.ssl.keymanager.algorithm = SunX509
confluent.topic.producer.ssl.keystore.certificate.chain = null
confluent.topic.producer.ssl.keystore.key = null
confluent.topic.producer.ssl.keystore.location = null
confluent.topic.producer.ssl.keystore.password = null
confluent.topic.producer.ssl.keystore.type = JKS
confluent.topic.producer.ssl.protocol = TLSv1.3
confluent.topic.producer.ssl.provider = null
confluent.topic.producer.ssl.secure.random.implementation = null
confluent.topic.producer.ssl.trustmanager.algorithm = PKIX
confluent.topic.producer.ssl.truststore.certificates = null
confluent.topic.producer.ssl.truststore.location = null
confluent.topic.producer.ssl.truststore.password = null
confluent.topic.producer.ssl.truststore.type = JKS
confluent.topic.producer.transaction.timeout.ms = 60000
confluent.topic.producer.transactional.id = null
confluent.topic.receive.buffer.bytes = 32768
confluent.topic.reconnect.backoff.max.ms = 1000
confluent.topic.reconnect.backoff.ms = 50
confluent.topic.replication.factor = 3
confluent.topic.request.timeout.ms = 30000
confluent.topic.retry.backoff.ms = 100
confluent.topic.sasl.client.callback.handler.class = null
confluent.topic.sasl.jaas.config = null
confluent.topic.sasl.kerberos.kinit.cmd = /usr/bin/kinit
confluent.topic.sasl.kerberos.min.time.before.relogin = 60000
confluent.topic.sasl.kerberos.service.name = null
confluent.topic.sasl.kerberos.ticket.renew.jitter = 0.05
confluent.topic.sasl.kerberos.ticket.renew.window.factor = 0.8
confluent.topic.sasl.login.callback.handler.class = null
confluent.topic.sasl.login.class = null
confluent.topic.sasl.login.connect.timeout.ms = null
confluent.topic.sasl.login.read.timeout.ms = null
confluent.topic.sasl.login.refresh.buffer.seconds = 300
confluent.topic.sasl.login.refresh.min.period.seconds = 60
confluent.topic.sasl.login.refresh.window.factor = 0.8
confluent.topic.sasl.login.refresh.window.jitter = 0.05
confluent.topic.sasl.login.retry.backoff.max.ms = 10000
confluent.topic.sasl.login.retry.backoff.ms = 100
confluent.topic.sasl.mechanism = GSSAPI
confluent.topic.sasl.oauthbearer.clock.skew.seconds = 30
confluent.topic.sasl.oauthbearer.expected.audience = null
confluent.topic.sasl.oauthbearer.expected.issuer = null
confluent.topic.sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
confluent.topic.sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
confluent.topic.sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
confluent.topic.sasl.oauthbearer.jwks.endpoint.url = null
confluent.topic.sasl.oauthbearer.scope.claim.name = scope
confluent.topic.sasl.oauthbearer.sub.claim.name = sub
confluent.topic.sasl.oauthbearer.token.endpoint.url = null
confluent.topic.security.protocol = PLAINTEXT
confluent.topic.security.providers = null
confluent.topic.send.buffer.bytes = 131072
confluent.topic.socket.connection.setup.timeout.max.ms = 30000
confluent.topic.socket.connection.setup.timeout.ms = 10000
confluent.topic.ssl.cipher.suites = null
confluent.topic.ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
confluent.topic.ssl.endpoint.identification.algorithm = https
confluent.topic.ssl.engine.factory.class = null
confluent.topic.ssl.key.password = null
confluent.topic.ssl.keymanager.algorithm = SunX509
confluent.topic.ssl.keystore.certificate.chain = null
confluent.topic.ssl.keystore.key = null
confluent.topic.ssl.keystore.location = null
confluent.topic.ssl.keystore.password = null
confluent.topic.ssl.keystore.type = JKS
confluent.topic.ssl.protocol = TLSv1.3
confluent.topic.ssl.provider = null
confluent.topic.ssl.secure.random.implementation = null
confluent.topic.ssl.trustmanager.algorithm = PKIX
confluent.topic.ssl.truststore.certificates = null
confluent.topic.ssl.truststore.location = null
confluent.topic.ssl.truststore.password = null
confluent.topic.ssl.truststore.type = JKS
connect.protocol = sessioned
connections.max.idle.ms = 540000
connector.client.config.override.policy = All
exactly.once.source.support = disabled
group.id = stc.dev.connect.cluster.consumer.7
header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
heartbeat.interval.ms = 3000
inter.worker.key.generation.algorithm = HmacSHA256
inter.worker.key.size = null
inter.worker.key.ttl.ms = 3600000
inter.worker.signature.algorithm = HmacSHA256
inter.worker.verification.algorithms = [HmacSHA256]
key.converter = class org.apache.kafka.connect.json.JsonConverter
listeners = [https://0.0.0.0:8083]
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
offset.flush.interval.ms = 10000
offset.flush.timeout.ms = 5000
offset.storage.partitions = 25
offset.storage.replication.factor = 3
offset.storage.topic = stc.dev.sql.src.offset.storage.topic
plugin.path = [/usr/share/java, /usr/share/confluent-hub-components]
rebalance.timeout.ms = 60000
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
response.http.headers.config =
rest.advertised.host.name = sqlserver-connect-cluster-0.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local
rest.advertised.listener = https
rest.advertised.port = null
rest.extension.classes = []
rest.servlet.initializor.classes = []
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
scheduled.rebalance.max.delay.ms = 300000
security.protocol = SSL
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.client.auth = none
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
status.storage.partitions = 5
status.storage.replication.factor = 3
status.storage.topic = stc.dev.sql.src.status.storage.topic
task.shutdown.graceful.timeout.ms = 5000
topic.creation.enable = true
topic.tracking.allow.reset = true
topic.tracking.enable = true
value.converter = class org.apache.kafka.connect.json.JsonConverter
worker.sync.timeout.ms = 3000
worker.unsync.backoff.ms = 300000

[INFO] 2023-10-10 10:40:03,077 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Creating
Kafka admin client
[INFO] 2023-10-10 10:40:03,083 [main]
org.apache.kafka.clients.admin.AdminClientConfig logAll - AdminClientConfig
values:
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
client.dns.lookup = use_all_dns_ips
client.id =
confluent.use.controller.listener = false
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
host.resolver.class = class
org.apache.kafka.clients.DefaultHostResolver
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
retries = 2147483647
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS

[WARN] 2023-10-10 10:40:03,331 [main]
org.apache.kafka.clients.admin.AdminClientConfig logUnused - These
configurations '[producer.ssl.truststore.password,
listeners.https.ssl.truststore.password, config.providers,
admin.ssl.key.password, listeners.https.ssl.key.password,
confluent.topic.replication.factor, status.storage.replication.factor,
config.providers.file.class, consumer.ssl.key.password,
consumer.security.protocol, offset.storage.topic,
consumer.ssl.keystore.location, rest.advertised.listener, key.converter,
consumer.ssl.truststore.password, listeners.https.ssl.enabled.protocols,
config.storage.topic, consumer.group.id, rest.advertised.host.name,
rest.advertised.host.port, consumer.bootstrap.servers,
config.storage.replication.factor, key.converter.schemas.enable,
value.converter.schema.registry.url, producer.ssl.key.password,
value.converter.schemas.enable, admin.ssl.truststore.location,
admin.bootstrap.servers, producer.bootstrap.servers, group.id, plugin.path,
admin.security.protocol, consumer.ssl.truststore.location,
producer.ssl.truststore.location, value.converter,
producer.ssl.keystore.location, admin.ssl.keystore.location,
admin.ssl.truststore.password, expose.internal.connect.endpoints,
listeners, producer.security.protocol, status.storage.topic,
listeners.https.ssl.keystore.location,
listeners.https.ssl.keystore.password, offset.flush.interval.ms,
producer.ssl.keystore.password, listeners.https.ssl.truststore.location,
offset.storage.replication.factor, consumer.ssl.keystore.password,
key.converter.schema.registry.url, admin.ssl.keystore.password]' were
supplied but are not used yet.
[INFO] 2023-10-10 10:40:03,332 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:03,332 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:03,332 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934403331
[INFO] 2023-10-10 10:40:04,257 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Kafka
cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:04,258 [kafka-admin-client-thread | adminclient-1]
org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info
kafka.admin.client for adminclient-1 unregistered
[INFO] 2023-10-10 10:40:04,267 [kafka-admin-client-thread | adminclient-1]
org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:04,267 [kafka-admin-client-thread | adminclient-1]
org.apache.kafka.common.metrics.Metrics close - Closing reporter
org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:04,267 [kafka-admin-client-thread | adminclient-1]
org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:04,282 [main] org.eclipse.jetty.util.log
initialized - Logging initialized @45009ms to
org.eclipse.jetty.util.log.Slf4jLog
[INFO] 2023-10-10 10:40:04,341 [main]
org.apache.kafka.connect.runtime.rest.RestServer createConnectors - Added
connector for https://0.0.0.0:8083
[INFO] 2023-10-10 10:40:04,342 [main]
org.apache.kafka.connect.runtime.rest.RestServer initializeServer -
Initializing REST server
[INFO] 2023-10-10 10:40:04,349 [main] org.eclipse.jetty.server.Server
doStart - jetty-9.4.48.v20220622; built: 2022-06-21T20:42:25.880Z; git:
6b67c5719d1f4371b33655ff2d047d24e171e49a; jvm 11.0.18+10-LTS
[INFO] 2023-10-10 10:40:04,381 [main]
org.eclipse.jetty.util.ssl.SslContextFactory load -
x509=X509@3c87f534(1,h=[service-stcsqlt,
sqlserver-connect-cluster-0.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local,
sqlserver-connect-cluster-1.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local,
sqlserver-connect-cluster-2.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local],a=[],w=[])
for
Server@52b546d8[provider=null,keyStore=file:///mnt/sslcerts/keystore.p12,trustStore=file:///mnt/sslcerts/truststore.p12]
[INFO] 2023-10-10 10:40:04,401 [main]
org.eclipse.jetty.server.AbstractConnector doStart - Started
https_0.0.0.0:8083@1e977740{SSL, (ssl, http/1.1)}{0.0.0.0:8083}
[INFO] 2023-10-10 10:40:04,401 [main] org.eclipse.jetty.server.Server
doStart - Started @45128ms
[INFO] 2023-10-10 10:40:04,430 [main]
org.apache.kafka.connect.runtime.rest.RestServer advertisedUrl - Advertised
URI:
https://sqlserver-connect-cluster-0.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local:8083/
[INFO] 2023-10-10 10:40:04,430 [main]
org.apache.kafka.connect.runtime.rest.RestServer initializeServer - REST
server listening at https://0.0.0.0:8083/, advertising URL
https://sqlserver-connect-cluster-0.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local:8083/
[INFO] 2023-10-10 10:40:04,431 [main]
org.apache.kafka.connect.runtime.rest.RestServer advertisedUrl - Advertised
URI:
https://sqlserver-connect-cluster-0.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local:8083/
[INFO] 2023-10-10 10:40:04,431 [main]
org.apache.kafka.connect.runtime.rest.RestServer initializeServer - REST
admin endpoints at
https://sqlserver-connect-cluster-0.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local:8083/
[INFO] 2023-10-10 10:40:04,431 [main]
org.apache.kafka.connect.runtime.rest.RestServer advertisedUrl - Advertised
URI:
https://sqlserver-connect-cluster-0.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local:8083/
[INFO] 2023-10-10 10:40:04,436 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Creating
Kafka admin client
[INFO] 2023-10-10 10:40:04,437 [main]
org.apache.kafka.clients.admin.AdminClientConfig logAll - AdminClientConfig
values: (identical to the dump at 10:40:03,083 above; omitted)

[WARN] 2023-10-10 10:40:04,455 [main]
org.apache.kafka.clients.admin.AdminClientConfig logUnused - (same
"supplied but are not used yet" warning as at 10:40:03,331 above; omitted)
[INFO] 2023-10-10 10:40:04,455 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:04,455 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:04,455 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934404455
[INFO] 2023-10-10 10:40:04,516 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Kafka
cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:04,517 [kafka-admin-client-thread | adminclient-2]
org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info
kafka.admin.client for adminclient-2 unregistered
[INFO] 2023-10-10 10:40:04,521 [kafka-admin-client-thread | adminclient-2]
org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:04,521 [kafka-admin-client-thread | adminclient-2]
org.apache.kafka.common.metrics.Metrics close - Closing reporter
org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:04,521 [kafka-admin-client-thread | adminclient-2]
org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:04,527 [main]
org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy
configure - Setting up All Policy for ConnectorClientConfigOverride. This
will allow all client configurations to be overridden
[INFO] 2023-10-10 10:40:04,538 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Creating
Kafka admin client
[INFO] 2023-10-10 10:40:04,538 [main]
org.apache.kafka.clients.admin.AdminClientConfig logAll - AdminClientConfig
values: (identical to the dump at 10:40:03,083 above; omitted)

[WARN] 2023-10-10 10:40:04,558 [main]
org.apache.kafka.clients.admin.AdminClientConfig logUnused - (same
"supplied but are not used yet" warning as at 10:40:03,331 above; omitted)
[INFO] 2023-10-10 10:40:04,559 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:04,559 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:04,559 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934404558
[INFO] 2023-10-10 10:40:04,671 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Kafka
cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:04,672 [kafka-admin-client-thread | adminclient-3]
org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info
kafka.admin.client for adminclient-3 unregistered
[INFO] 2023-10-10 10:40:04,675 [kafka-admin-client-thread | adminclient-3]
org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:04,675 [kafka-admin-client-thread | adminclient-3]
org.apache.kafka.common.metrics.Metrics close - Closing reporter
org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:04,675 [kafka-admin-client-thread | adminclient-3]
org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:04,680 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:04,680 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:04,680 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934404680
[INFO] 2023-10-10 10:40:04,858 [main]
org.apache.kafka.connect.json.JsonConverterConfig logAll -
JsonConverterConfig values:
converter.type = key
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = false

[INFO] 2023-10-10 10:40:04,859 [main]
org.apache.kafka.connect.json.JsonConverterConfig logAll -
JsonConverterConfig values:
converter.type = value
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = false

[INFO] 2023-10-10 10:40:04,860 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Creating
Kafka admin client
[INFO] 2023-10-10 10:40:04,860 [main]
org.apache.kafka.clients.admin.AdminClientConfig logAll - AdminClientConfig
values: (identical to the dump at 10:40:03,083 above; omitted)

[WARN] 2023-10-10 10:40:04,873 [main]
org.apache.kafka.clients.admin.AdminClientConfig logUnused - (same
"supplied but are not used yet" warning as at 10:40:03,331 above; omitted)
[INFO] 2023-10-10 10:40:04,874 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:04,874 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:04,874 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934404874
[INFO] 2023-10-10 10:40:04,926 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Kafka
cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:04,926 [kafka-admin-client-thread | adminclient-4]
org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info
kafka.admin.client for adminclient-4 unregistered
[INFO] 2023-10-10 10:40:04,929 [kafka-admin-client-thread | adminclient-4]
org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:04,929 [kafka-admin-client-thread | adminclient-4]
org.apache.kafka.common.metrics.Metrics close - Closing reporter
org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:04,929 [kafka-admin-client-thread | adminclient-4]
org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:04,941 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Creating
Kafka admin client
[INFO] 2023-10-10 10:40:04,941 [main]
org.apache.kafka.clients.admin.AdminClientConfig logAll - AdminClientConfig
values: (identical to the dump at 10:40:03,083 above; omitted)

[WARN] 2023-10-10 10:40:04,954 [main]
org.apache.kafka.clients.admin.AdminClientConfig logUnused - (same
"supplied but are not used yet" warning as at 10:40:03,331 above; omitted)
[INFO] 2023-10-10 10:40:04,954 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:04,954 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:04,954 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934404954
[INFO] 2023-10-10 10:40:05,047 [main]
org.apache.kafka.connect.util.ConnectUtils lookupKafkaClusterId - Kafka
cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:05,048 [kafka-admin-client-thread | adminclient-5]
org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info
kafka.admin.client for adminclient-5 unregistered
[INFO] 2023-10-10 10:40:05,050 [kafka-admin-client-thread | adminclient-5]
org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:05,050 [kafka-admin-client-thread | adminclient-5]
org.apache.kafka.common.metrics.Metrics close - Closing reporter
org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:05,050 [kafka-admin-client-thread | adminclient-5]
org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[... snip: the worker then creates and closes five more admin clients (adminclient-6 through adminclient-10), each logging an identical AdminClientConfig dump, the same "supplied but are not used yet" warning, and the same Kafka cluster ID YGTL-peLQIeAIy9hwXCoHg ...]
[INFO] 2023-10-10 10:40:05,605 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:05,605 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:05,605 [main]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934405605
[INFO] 2023-10-10 10:40:05,610 [main]
org.apache.kafka.connect.cli.ConnectDistributed startConnect - Kafka
Connect distributed worker initialization took 44289ms
[INFO] 2023-10-10 10:40:05,610 [main]
org.apache.kafka.connect.runtime.Connect start - Kafka Connect starting
[INFO] 2023-10-10 10:40:05,613 [main]
org.apache.kafka.connect.runtime.rest.RestServer initializeResources -
Initializing REST resources
[INFO] 2023-10-10 10:40:05,613 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder run -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Herder starting
[INFO] 2023-10-10 10:40:05,614 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.Worker start - Worker starting
[INFO] 2023-10-10 10:40:05,614 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaOffsetBackingStore start - Starting
KafkaOffsetBackingStore
[INFO] 2023-10-10 10:40:05,614 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Starting KafkaBasedLog
with topic stc.dev.sql.src.offset.storage.topic
[INFO] 2023-10-10 10:40:05,617 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.admin.AdminClientConfig logAll - AdminClientConfig
values:
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
client.dns.lookup = use_all_dns_ips
client.id =
confluent.use.controller.listener = false
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
host.resolver.class = class
org.apache.kafka.clients.DefaultHostResolver
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
retries = 2147483647
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS

[WARN] 2023-10-10 10:40:05,632 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.admin.AdminClientConfig logUnused - These
configurations '[producer.ssl.truststore.password,
listeners.https.ssl.truststore.password, config.providers,
admin.ssl.key.password, listeners.https.ssl.key.password,
confluent.topic.replication.factor, status.storage.replication.factor,
config.providers.file.class, consumer.ssl.key.password,
consumer.security.protocol, offset.storage.topic,
consumer.ssl.keystore.location, rest.advertised.listener, key.converter,
consumer.ssl.truststore.password, listeners.https.ssl.enabled.protocols,
config.storage.topic, consumer.group.id, metrics.context.connect.group.id,
rest.advertised.host.name, rest.advertised.host.port,
consumer.bootstrap.servers, config.storage.replication.factor,
key.converter.schemas.enable, value.converter.schema.registry.url,
producer.ssl.key.password, value.converter.schemas.enable,
admin.ssl.truststore.location, admin.bootstrap.servers,
producer.bootstrap.servers, group.id, plugin.path, admin.security.protocol,
consumer.ssl.truststore.location, producer.ssl.truststore.location,
metrics.context.connect.kafka.cluster.id, value.converter,
producer.ssl.keystore.location, admin.ssl.keystore.location,
admin.ssl.truststore.password, expose.internal.connect.endpoints,
listeners, producer.security.protocol, status.storage.topic,
listeners.https.ssl.keystore.location,
listeners.https.ssl.keystore.password, offset.flush.interval.ms,
producer.ssl.keystore.password, listeners.https.ssl.truststore.location,
offset.storage.replication.factor, consumer.ssl.keystore.password,
key.converter.schema.registry.url, admin.ssl.keystore.password]' were
supplied but are not used yet.
[INFO] 2023-10-10 10:40:05,633 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:05,633 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:05,633 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934405633
[INFO] 2023-10-10 10:40:05,683 [main]
org.apache.kafka.connect.runtime.rest.RestServer initializeResources -
Adding admin resources to main listener
[INFO] 2023-10-10 10:40:05,775 [main] org.eclipse.jetty.server.session
doStart - DefaultSessionIdManager workerName=node0
[INFO] 2023-10-10 10:40:05,775 [main] org.eclipse.jetty.server.session
doStart - No SessionScavenger set, using defaults
[INFO] 2023-10-10 10:40:05,777 [main] org.eclipse.jetty.server.session
startScavenging - node0 Scavenging every 660000ms
[INFO] 2023-10-10 10:40:05,802 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.TopicAdmin
verifyTopicCleanupPolicyOnlyCompact - Unable to use admin client to verify
the cleanup policy of 'stc.dev.sql.src.offset.storage.topic' topic is
'compact', either because the broker is an older version or because the
Kafka principal used for Connect internal topics does not have the required
permission to describe topic configurations.
[INFO] 2023-10-10 10:40:05,812 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.producer.ProducerConfig logAll - ProducerConfig
values:
acks = -1
batch.size = 16384
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
buffer.memory = 33554432
client.dns.lookup = use_all_dns_ips
client.id = producer-1
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 2147483647
enable.idempotence = false
interceptor.classes = []
key.serializer = class
org.apache.kafka.common.serialization.ByteArraySerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 1
max.request.size = 1048576
metadata.max.age.ms = 300000
metadata.max.idle.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.adaptive.partitioning.enable = true
partitioner.availability.timeout.ms = 0
partitioner.class = null
partitioner.ignore.keys = false
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
retries = 2147483647
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class
org.apache.kafka.common.serialization.ByteArraySerializer

[WARN] 2023-10-10 10:40:05,855 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.producer.ProducerConfig logUnused - These
configurations '[metrics.context.resource.version,
producer.ssl.truststore.password, listeners.https.ssl.truststore.password,
metrics.context.resource.type, metrics.context.resource.commit.id,
config.providers, admin.ssl.key.password, listeners.https.ssl.key.password,
confluent.topic.replication.factor, status.storage.replication.factor,
config.providers.file.class, consumer.ssl.key.password,
consumer.security.protocol, offset.storage.topic,
consumer.ssl.keystore.location, rest.advertised.listener, key.converter,
consumer.ssl.truststore.password, listeners.https.ssl.enabled.protocols,
config.storage.topic, consumer.group.id, metrics.context.connect.group.id,
rest.advertised.host.name, rest.advertised.host.port,
consumer.bootstrap.servers, config.storage.replication.factor,
key.converter.schemas.enable, value.converter.schema.registry.url,
producer.ssl.key.password, value.converter.schemas.enable,
admin.ssl.truststore.location, admin.bootstrap.servers,
producer.bootstrap.servers, group.id, plugin.path, admin.security.protocol,
consumer.ssl.truststore.location, producer.ssl.truststore.location,
metrics.context.connect.kafka.cluster.id, value.converter,
producer.ssl.keystore.location, admin.ssl.keystore.location,
admin.ssl.truststore.password, expose.internal.connect.endpoints,
listeners, producer.security.protocol, status.storage.topic,
listeners.https.ssl.keystore.location,
listeners.https.ssl.keystore.password, offset.flush.interval.ms,
producer.ssl.keystore.password, listeners.https.ssl.truststore.location,
offset.storage.replication.factor, consumer.ssl.keystore.password,
key.converter.schema.registry.url, admin.ssl.keystore.password]' were
supplied but are not used yet.
[INFO] 2023-10-10 10:40:05,855 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:05,856 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:05,857 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934405855
[INFO] 2023-10-10 10:40:05,869 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.ConsumerConfig logAll - ConsumerConfig
values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = consumer-stc.dev.connect.cluster.consumer.7-1
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = stc.dev.connect.cluster.consumer.7
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class
org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class
org.apache.kafka.clients.consumer.RangeAssignor, class
org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
value.deserializer = class
org.apache.kafka.common.serialization.ByteArrayDeserializer

[WARN] 2023-10-10 10:40:05,942 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.ConsumerConfig logUnused - These
configurations '[producer.ssl.truststore.password,
listeners.https.ssl.truststore.password, config.providers,
admin.ssl.key.password, listeners.https.ssl.key.password,
confluent.topic.replication.factor, status.storage.replication.factor,
config.providers.file.class, consumer.ssl.key.password,
consumer.security.protocol, offset.storage.topic,
consumer.ssl.keystore.location, rest.advertised.listener, key.converter,
consumer.ssl.truststore.password, listeners.https.ssl.enabled.protocols,
config.storage.topic, consumer.group.id, metrics.context.connect.group.id,
rest.advertised.host.name, rest.advertised.host.port,
consumer.bootstrap.servers, config.storage.replication.factor,
key.converter.schemas.enable, value.converter.schema.registry.url,
producer.ssl.key.password, value.converter.schemas.enable,
admin.ssl.truststore.location, admin.bootstrap.servers,
producer.bootstrap.servers, plugin.path, admin.security.protocol,
consumer.ssl.truststore.location, producer.ssl.truststore.location,
metrics.context.connect.kafka.cluster.id, value.converter,
producer.ssl.keystore.location, admin.ssl.keystore.location,
admin.ssl.truststore.password, expose.internal.connect.endpoints,
listeners, producer.security.protocol, status.storage.topic,
listeners.https.ssl.keystore.location,
listeners.https.ssl.keystore.password, offset.flush.interval.ms,
producer.ssl.keystore.password, listeners.https.ssl.truststore.location,
offset.storage.replication.factor, consumer.ssl.keystore.password,
key.converter.schema.registry.url, admin.ssl.keystore.password]' were
supplied but are not used yet.
[INFO] 2023-10-10 10:40:05,942 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:05,943 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:05,943 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934405942
[INFO] 2023-10-10 10:40:05,946 [kafka-producer-network-thread | producer-1]
org.apache.kafka.clients.Metadata update - [Producer clientId=producer-1]
Cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:05,971 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.Metadata update - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Cluster ID:
YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:05,975 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.KafkaConsumer assign - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Assigned to partition(s):
stc.dev.sql.src.offset.storage.topic-0,
stc.dev.sql.src.offset.storage.topic-5,
stc.dev.sql.src.offset.storage.topic-10,
stc.dev.sql.src.offset.storage.topic-13,
stc.dev.sql.src.offset.storage.topic-8,
stc.dev.sql.src.offset.storage.topic-2,
stc.dev.sql.src.offset.storage.topic-12,
stc.dev.sql.src.offset.storage.topic-14,
stc.dev.sql.src.offset.storage.topic-9,
stc.dev.sql.src.offset.storage.topic-11,
stc.dev.sql.src.offset.storage.topic-1,
stc.dev.sql.src.offset.storage.topic-4,
stc.dev.sql.src.offset.storage.topic-6,
stc.dev.sql.src.offset.storage.topic-7,
stc.dev.sql.src.offset.storage.topic-3
[INFO] 2023-10-10 10:40:05,983 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-0
[INFO] 2023-10-10 10:40:05,983 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-5
[INFO] 2023-10-10 10:40:05,983 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-10
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-13
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-8
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-2
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-12
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-14
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-9
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-11
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-1
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-4
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-6
[INFO] 2023-10-10 10:40:05,984 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-7
[INFO] 2023-10-10 10:40:05,985 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-1,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.offset.storage.topic-3
[INFO] 2023-10-10 10:40:06,305 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Finished reading
KafkaBasedLog for topic stc.dev.sql.src.offset.storage.topic
[INFO] 2023-10-10 10:40:06,305 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Started KafkaBasedLog
for topic stc.dev.sql.src.offset.storage.topic
[INFO] 2023-10-10 10:40:06,305 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaOffsetBackingStore start - Finished
reading offsets topic and starting KafkaOffsetBackingStore
[INFO] 2023-10-10 10:40:06,341 [DistributedHerder-connect-1-1]
io.confluent.logevents.connect.LogEventsConfig logAll - LogEventsConfig
values:
confluent.event.logger.cloudevent.codec = binary
confluent.event.logger.enable = false
confluent.event.logger.exporter.class = class
io.confluent.telemetry.events.exporter.kafka.EventKafkaExporter
confluent.event.logger.exporter.kafka.producer.bootstrap.servers =
confluent.event.logger.exporter.kafka.producer.client.id =
confluent-connect-log-events-emitter-stc.dev.connect.cluster.consumer.7
confluent.event.logger.exporter.kafka.topic.create = true
confluent.event.logger.exporter.kafka.topic.name =
confluent-connect-log-events
confluent.event.logger.exporter.kafka.type = kafka

[INFO] 2023-10-10 10:40:06,341 [DistributedHerder-connect-1-1]
io.confluent.logevents.connect.LogEventsKafkaEmitter start - Connect Log
Events aren't enabled.
[INFO] 2023-10-10 10:40:06,342 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.Worker start - Worker started
[INFO] 2023-10-10 10:40:06,342 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Starting KafkaBasedLog
with topic stc.dev.sql.src.status.storage.topic
Oct 10, 2023 10:40:06 AM org.glassfish.jersey.internal.inject.Providers
checkProviderRuntime
WARNING: A provider
org.apache.kafka.connect.runtime.rest.resources.RootResource registered in
SERVER runtime does not implement any provider interfaces applicable in the
SERVER runtime. Due to constraint configuration problems the provider
org.apache.kafka.connect.runtime.rest.resources.RootResource will be
ignored.
Oct 10, 2023 10:40:06 AM org.glassfish.jersey.internal.inject.Providers
checkProviderRuntime
WARNING: A provider
org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource
registered in SERVER runtime does not implement any provider interfaces
applicable in the SERVER runtime. Due to constraint configuration problems
the provider
org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be
ignored.
Oct 10, 2023 10:40:06 AM org.glassfish.jersey.internal.inject.Providers
checkProviderRuntime
WARNING: A provider
org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource
registered in SERVER runtime does not implement any provider interfaces
applicable in the SERVER runtime. Due to constraint configuration problems
the provider
org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource
will be ignored.
Oct 10, 2023 10:40:06 AM org.glassfish.jersey.internal.inject.Providers
checkProviderRuntime
WARNING: A provider
org.apache.kafka.connect.runtime.rest.resources.ConfluentV1MetadataResource
registered in SERVER runtime does not implement any provider interfaces
applicable in the SERVER runtime. Due to constraint configuration problems
the provider
org.apache.kafka.connect.runtime.rest.resources.ConfluentV1MetadataResource
will be ignored.
Oct 10, 2023 10:40:06 AM org.glassfish.jersey.internal.inject.Providers
checkProviderRuntime
WARNING: A provider
org.apache.kafka.connect.runtime.rest.resources.LoggingResource registered
in SERVER runtime does not implement any provider interfaces applicable in
the SERVER runtime. Due to constraint configuration problems the provider
org.apache.kafka.connect.runtime.rest.resources.LoggingResource will be
ignored.
[INFO] 2023-10-10 10:40:06,404 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.TopicAdmin
verifyTopicCleanupPolicyOnlyCompact - Unable to use admin client to verify
the cleanup policy of 'stc.dev.sql.src.status.storage.topic' topic is
'compact', either because the broker is an older version or because the
Kafka principal used for Connect internal topics does not have the required
permission to describe topic configurations.
[INFO] 2023-10-10 10:40:06,405 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.producer.ProducerConfig logAll - ProducerConfig
values:
acks = -1
batch.size = 16384
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
buffer.memory = 33554432
client.dns.lookup = use_all_dns_ips
client.id = producer-2
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 120000
enable.idempotence = false
interceptor.classes = []
key.serializer = class
org.apache.kafka.common.serialization.StringSerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 1
max.request.size = 1048576
metadata.max.age.ms = 300000
metadata.max.idle.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.adaptive.partitioning.enable = true
partitioner.availability.timeout.ms = 0
partitioner.class = null
partitioner.ignore.keys = false
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
retries = 0
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class
org.apache.kafka.common.serialization.ByteArraySerializer

[WARN] 2023-10-10 10:40:06,424 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.producer.ProducerConfig logUnused - These
configurations '[metrics.context.resource.version,
producer.ssl.truststore.password, listeners.https.ssl.truststore.password,
metrics.context.resource.type, metrics.context.resource.commit.id,
config.providers, admin.ssl.key.password, listeners.https.ssl.key.password,
confluent.topic.replication.factor, status.storage.replication.factor,
config.providers.file.class, consumer.ssl.key.password,
consumer.security.protocol, offset.storage.topic,
consumer.ssl.keystore.location, rest.advertised.listener, key.converter,
consumer.ssl.truststore.password, listeners.https.ssl.enabled.protocols,
config.storage.topic, consumer.group.id, metrics.context.connect.group.id,
rest.advertised.host.name, rest.advertised.host.port,
consumer.bootstrap.servers, config.storage.replication.factor,
key.converter.schemas.enable, value.converter.schema.registry.url,
producer.ssl.key.password, value.converter.schemas.enable,
admin.ssl.truststore.location, admin.bootstrap.servers,
producer.bootstrap.servers, group.id, plugin.path, admin.security.protocol,
consumer.ssl.truststore.location, producer.ssl.truststore.location,
metrics.context.connect.kafka.cluster.id, value.converter,
producer.ssl.keystore.location, admin.ssl.keystore.location,
admin.ssl.truststore.password, expose.internal.connect.endpoints,
listeners, producer.security.protocol, status.storage.topic,
listeners.https.ssl.keystore.location,
listeners.https.ssl.keystore.password, offset.flush.interval.ms,
producer.ssl.keystore.password, listeners.https.ssl.truststore.location,
offset.storage.replication.factor, consumer.ssl.keystore.password,
key.converter.schema.registry.url, admin.ssl.keystore.password]' were
supplied but are not used yet.
[INFO] 2023-10-10 10:40:06,425 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:06,426 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:06,426 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934406425
[INFO] 2023-10-10 10:40:06,427 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.ConsumerConfig logAll - ConsumerConfig
values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = consumer-stc.dev.connect.cluster.consumer.7-2
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = stc.dev.connect.cluster.consumer.7
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class
org.apache.kafka.clients.consumer.RangeAssignor, class
org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
value.deserializer = class
org.apache.kafka.common.serialization.ByteArrayDeserializer

[WARN] 2023-10-10 10:40:06,451 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.ConsumerConfig logUnused - These
configurations '[metrics.context.resource.version,
producer.ssl.truststore.password, listeners.https.ssl.truststore.password,
metrics.context.resource.type, metrics.context.resource.commit.id,
config.providers, admin.ssl.key.password, listeners.https.ssl.key.password,
confluent.topic.replication.factor, status.storage.replication.factor,
config.providers.file.class, consumer.ssl.key.password,
consumer.security.protocol, offset.storage.topic,
consumer.ssl.keystore.location, rest.advertised.listener, key.converter,
consumer.ssl.truststore.password, listeners.https.ssl.enabled.protocols,
config.storage.topic, consumer.group.id, metrics.context.connect.group.id,
rest.advertised.host.name, rest.advertised.host.port,
consumer.bootstrap.servers, config.storage.replication.factor,
key.converter.schemas.enable, value.converter.schema.registry.url,
producer.ssl.key.password, value.converter.schemas.enable,
admin.ssl.truststore.location, admin.bootstrap.servers,
producer.bootstrap.servers, plugin.path, admin.security.protocol,
consumer.ssl.truststore.location, producer.ssl.truststore.location,
metrics.context.connect.kafka.cluster.id, value.converter,
producer.ssl.keystore.location, admin.ssl.keystore.location,
admin.ssl.truststore.password, expose.internal.connect.endpoints,
listeners, producer.security.protocol, status.storage.topic,
listeners.https.ssl.keystore.location,
listeners.https.ssl.keystore.password, offset.flush.interval.ms,
producer.ssl.keystore.password, listeners.https.ssl.truststore.location,
offset.storage.replication.factor, consumer.ssl.keystore.password,
key.converter.schema.registry.url, admin.ssl.keystore.password]' were
supplied but are not used yet.
[INFO] 2023-10-10 10:40:06,452 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:06,452 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:06,452 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934406452
[INFO] 2023-10-10 10:40:06,454 [kafka-producer-network-thread | producer-2]
org.apache.kafka.clients.Metadata update - [Producer clientId=producer-2]
Cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:06,483 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.Metadata update - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-2,
groupId=stc.dev.connect.cluster.consumer.7] Cluster ID:
YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:06,484 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.KafkaConsumer assign - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-2,
groupId=stc.dev.connect.cluster.consumer.7] Assigned to partition(s):
stc.dev.sql.src.status.storage.topic-0,
stc.dev.sql.src.status.storage.topic-1,
stc.dev.sql.src.status.storage.topic-4,
stc.dev.sql.src.status.storage.topic-2,
stc.dev.sql.src.status.storage.topic-3
[INFO] 2023-10-10 10:40:06,484 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-2,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.status.storage.topic-0
[INFO] 2023-10-10 10:40:06,485 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-2,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.status.storage.topic-1
[INFO] 2023-10-10 10:40:06,485 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-2,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.status.storage.topic-4
[INFO] 2023-10-10 10:40:06,486 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-2,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.status.storage.topic-2
[INFO] 2023-10-10 10:40:06,486 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-2,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.status.storage.topic-3
[INFO] 2023-10-10 10:40:06,577 [main]
org.hibernate.validator.internal.util.Version <clinit> - HV000001:
Hibernate Validator 6.1.7.Final
[INFO] 2023-10-10 10:40:06,612 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Finished reading
KafkaBasedLog for topic stc.dev.sql.src.status.storage.topic
[INFO] 2023-10-10 10:40:06,612 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Started KafkaBasedLog
for topic stc.dev.sql.src.status.storage.topic
[INFO] 2023-10-10 10:40:06,615 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore start - Starting
KafkaConfigBackingStore
[INFO] 2023-10-10 10:40:06,615 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Starting KafkaBasedLog
with topic stc.dev.sql.src.config.storage.topic
[INFO] 2023-10-10 10:40:06,623 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.TopicAdmin
verifyTopicCleanupPolicyOnlyCompact - Unable to use admin client to verify
the cleanup policy of 'stc.dev.sql.src.config.storage.topic' topic is
'compact', either because the broker is an older version or because the
Kafka principal used for Connect internal topics does not have the required
permission to describe topic configurations.
[INFO] 2023-10-10 10:40:06,624 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.producer.ProducerConfig logAll - ProducerConfig
values:
acks = -1
batch.size = 16384
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
buffer.memory = 33554432
client.dns.lookup = use_all_dns_ips
client.id = producer-3
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 2147483647
enable.idempotence = false
interceptor.classes = []
key.serializer = class
org.apache.kafka.common.serialization.StringSerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 1
max.request.size = 1048576
metadata.max.age.ms = 300000
metadata.max.idle.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.adaptive.partitioning.enable = true
partitioner.availability.timeout.ms = 0
partitioner.class = null
partitioner.ignore.keys = false
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
retries = 2147483647
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class
org.apache.kafka.common.serialization.ByteArraySerializer

[WARN] 2023-10-10 10:40:06,650 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.producer.ProducerConfig logUnused - These
configurations '[metrics.context.resource.version,
producer.ssl.truststore.password, listeners.https.ssl.truststore.password,
metrics.context.resource.type, metrics.context.resource.commit.id,
config.providers, admin.ssl.key.password, listeners.https.ssl.key.password,
confluent.topic.replication.factor, status.storage.replication.factor,
config.providers.file.class, consumer.ssl.key.password,
consumer.security.protocol, offset.storage.topic,
consumer.ssl.keystore.location, rest.advertised.listener, key.converter,
consumer.ssl.truststore.password, listeners.https.ssl.enabled.protocols,
config.storage.topic, consumer.group.id, metrics.context.connect.group.id,
rest.advertised.host.name, rest.advertised.host.port,
consumer.bootstrap.servers, config.storage.replication.factor,
key.converter.schemas.enable, value.converter.schema.registry.url,
producer.ssl.key.password, value.converter.schemas.enable,
admin.ssl.truststore.location, admin.bootstrap.servers,
producer.bootstrap.servers, group.id, plugin.path, admin.security.protocol,
consumer.ssl.truststore.location, producer.ssl.truststore.location,
metrics.context.connect.kafka.cluster.id, value.converter,
producer.ssl.keystore.location, admin.ssl.keystore.location,
admin.ssl.truststore.password, expose.internal.connect.endpoints,
listeners, producer.security.protocol, status.storage.topic,
listeners.https.ssl.keystore.location,
listeners.https.ssl.keystore.password, offset.flush.interval.ms,
producer.ssl.keystore.password, listeners.https.ssl.truststore.location,
offset.storage.replication.factor, consumer.ssl.keystore.password,
key.converter.schema.registry.url, admin.ssl.keystore.password]' were
supplied but are not used yet.
[INFO] 2023-10-10 10:40:06,652 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:06,652 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:06,652 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934406652
[INFO] 2023-10-10 10:40:06,653 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.ConsumerConfig logAll - ConsumerConfig
values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = consumer-stc.dev.connect.cluster.consumer.7-3
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = stc.dev.connect.cluster.consumer.7
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class
org.apache.kafka.clients.consumer.RangeAssignor, class
org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 20000
retry.backoff.ms = 500
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
value.deserializer = class
org.apache.kafka.common.serialization.ByteArrayDeserializer

[WARN] 2023-10-10 10:40:06,675 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.ConsumerConfig logUnused - These
configurations '[metrics.context.resource.version,
producer.ssl.truststore.password, listeners.https.ssl.truststore.password,
metrics.context.resource.type, metrics.context.resource.commit.id,
config.providers, admin.ssl.key.password, listeners.https.ssl.key.password,
confluent.topic.replication.factor, status.storage.replication.factor,
config.providers.file.class, consumer.ssl.key.password,
consumer.security.protocol, offset.storage.topic,
consumer.ssl.keystore.location, rest.advertised.listener, key.converter,
consumer.ssl.truststore.password, listeners.https.ssl.enabled.protocols,
config.storage.topic, consumer.group.id, metrics.context.connect.group.id,
rest.advertised.host.name, rest.advertised.host.port,
consumer.bootstrap.servers, config.storage.replication.factor,
key.converter.schemas.enable, value.converter.schema.registry.url,
producer.ssl.key.password, value.converter.schemas.enable,
admin.ssl.truststore.location, admin.bootstrap.servers,
producer.bootstrap.servers, plugin.path, admin.security.protocol,
consumer.ssl.truststore.location, producer.ssl.truststore.location,
metrics.context.connect.kafka.cluster.id, value.converter,
producer.ssl.keystore.location, admin.ssl.keystore.location,
admin.ssl.truststore.password, expose.internal.connect.endpoints,
listeners, producer.security.protocol, status.storage.topic,
listeners.https.ssl.keystore.location,
listeners.https.ssl.keystore.password, offset.flush.interval.ms,
producer.ssl.keystore.password, listeners.https.ssl.truststore.location,
offset.storage.replication.factor, consumer.ssl.keystore.password,
key.converter.schema.registry.url, admin.ssl.keystore.password]' were
supplied but are not used yet.
[INFO] 2023-10-10 10:40:06,684 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:06,684 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:06,684 [DistributedHerder-connect-1-1]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934406684
[INFO] 2023-10-10 10:40:06,697 [kafka-producer-network-thread | producer-3]
org.apache.kafka.clients.Metadata update - [Producer clientId=producer-3]
Cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:06,716 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.Metadata update - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-3,
groupId=stc.dev.connect.cluster.consumer.7] Cluster ID:
YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:06,717 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.KafkaConsumer assign - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-3,
groupId=stc.dev.connect.cluster.consumer.7] Assigned to partition(s):
stc.dev.sql.src.config.storage.topic-0
[INFO] 2023-10-10 10:40:06,718 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.consumer.internals.SubscriptionState
lambda$requestOffsetReset$3 - [Consumer
clientId=consumer-stc.dev.connect.cluster.consumer.7-3,
groupId=stc.dev.connect.cluster.consumer.7] Seeking to earliest offset of
partition stc.dev.sql.src.config.storage.topic-0
Oct 10, 2023 10:40:06 AM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The
(sub)resource method listLoggers in
org.apache.kafka.connect.runtime.rest.resources.LoggingResource contains
empty path annotation.
WARNING: The (sub)resource method createConnector in
org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains
empty path annotation.
WARNING: The (sub)resource method listConnectors in
org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains
empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in
org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource
contains empty path annotation.
WARNING: The (sub)resource method serverInfo in
org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty
path annotation.

[INFO] 2023-10-10 10:40:06,933 [main]
org.eclipse.jetty.server.handler.ContextHandler doStart - Started
o.e.j.s.ServletContextHandler@3ef71c16{/,null,AVAILABLE}
[INFO] 2023-10-10 10:40:06,933 [main]
org.apache.kafka.connect.runtime.rest.RestServer initializeResources - REST
resources initialized; server is started and ready to handle requests
[INFO] 2023-10-10 10:40:06,933 [main]
org.apache.kafka.connect.runtime.Connect start - Kafka Connect started
[INFO] 2023-10-10 10:40:07,030 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore
processConnectorConfigRecord - Successfully processed removal of connector
'cdna-sql-party-source-connector'
[INFO] 2023-10-10 10:40:07,059 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore
processConnectorConfigRecord - Successfully processed removal of connector
'cdna-sql-party-source-connector'
[INFO] 2023-10-10 10:40:07,064 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore
processConnectorConfigRecord - Successfully processed removal of connector
'cdna-sql-party-type-source-connector'
[INFO] 2023-10-10 10:40:07,065 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore
processConnectorConfigRecord - Successfully processed removal of connector
'cdna-sql-party-type-source-connector'
[INFO] 2023-10-10 10:40:07,071 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore
processConnectorConfigRecord - Successfully processed removal of connector
'cdna-sql-party-source-connector'
[INFO] 2023-10-10 10:40:07,072 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore
processConnectorConfigRecord - Successfully processed removal of connector
'cdna-sql-party-type-source-connector'
[INFO] 2023-10-10 10:40:07,072 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore
processConnectorConfigRecord - Successfully processed removal of connector
'cdna-sql-party-source-connector'
[INFO] 2023-10-10 10:40:07,073 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore
processConnectorConfigRecord - Successfully processed removal of connector
'cdna-sql-party-type-source-connector'
[INFO] 2023-10-10 10:40:07,073 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Finished reading
KafkaBasedLog for topic stc.dev.sql.src.config.storage.topic
[INFO] 2023-10-10 10:40:07,074 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.util.KafkaBasedLog start - Started KafkaBasedLog
for topic stc.dev.sql.src.config.storage.topic
[INFO] 2023-10-10 10:40:07,074 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.storage.KafkaConfigBackingStore start - Started
KafkaConfigBackingStore
[INFO] 2023-10-10 10:40:07,074 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder run -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Herder started
[INFO] 2023-10-10 10:40:07,291 [DistributedHerder-connect-1-1]
org.apache.kafka.clients.Metadata update - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] Cluster ID:
YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:07,293 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.WorkerCoordinator onSuccess -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Discovered group coordinator
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093 (id: 2147483645 rack:
null)
[INFO] 2023-10-10 10:40:07,307 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.WorkerCoordinator
onJoinPrepare - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] Rebalance started
[INFO] 2023-10-10 10:40:07,308 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.WorkerCoordinator
sendJoinGroupRequest - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] (Re-)joining group
[INFO] 2023-10-10 10:40:07,331 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.WorkerCoordinator
requestRejoin - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] Request joining group due to:
rebalance failed due to 'The group member needs to have a valid member id
before actually entering a consumer group.' (MemberIdRequiredException)
[INFO] 2023-10-10 10:40:07,331 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.WorkerCoordinator
sendJoinGroupRequest - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] (Re-)joining group
[INFO] 2023-10-10 10:40:13,262 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.WorkerCoordinator handle -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Successfully joined group with generation Generation{generationId=1,
memberId='connect-1-6344c64e-1ae1-497b-aa8b-8e45a369d71b',
protocol='sessioned'}
[INFO] 2023-10-10 10:40:13,285 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.WorkerCoordinator handle -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Successfully synced group in generation Generation{generationId=1,
memberId='connect-1-6344c64e-1ae1-497b-aa8b-8e45a369d71b',
protocol='sessioned'}
[INFO] 2023-10-10 10:40:13,286 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder onAssigned -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Joined group at generation 1 with protocol version 2 and got assignment:
Assignment{error=0,
leader='connect-1-9df69d2c-774d-4e34-aff2-5a2291a2210d',
leaderUrl='https://sqlserver-connect-cluster-1.sqlserver-connect-cluster.sqlserver-connect-cluster.svc.cluster.local:8083/',
offset=1076, connectorIds=[cdna-sql-party-source-connector],
taskIds=[cdna-sql-party-source-connector-0], revokedConnectorIds=[],
revokedTaskIds=[], delay=0} with rebalance delay: 0
[WARN] 2023-10-10 10:40:13,287 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder
handleRebalanceCompleted - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] Catching up to assignment's
config offset.
[INFO] 2023-10-10 10:40:13,287 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder
readConfigToEnd - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] Current config state offset -1
is behind group assignment 1076, reading to end of config log
[INFO] 2023-10-10 10:40:13,292 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder
refreshConfigSnapshot - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] Finished reading to end of log
and updated config snapshot, new config log offset: 1076
[INFO] 2023-10-10 10:40:13,292 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder startWork -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Starting connectors and tasks using config offset 1076
[INFO] 2023-10-10 10:40:13,293 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.distributed.DistributedHerder startTask -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Starting task cdna-sql-party-source-connector-0
[INFO] 2023-10-10 10:40:13,294 [StartAndStopExecutor-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder
startConnector - [Worker clientId=connect-1,
groupId=stc.dev.connect.cluster.consumer.7] Starting connector
cdna-sql-party-source-connector
[INFO] 2023-10-10 10:40:13,305 [StartAndStopExecutor-connect-1-1]
org.apache.kafka.connect.runtime.Worker startConnector - Creating connector
cdna-sql-party-source-connector of type
io.debezium.connector.sqlserver.SqlServerConnector
[INFO] 2023-10-10 10:40:13,307 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.Worker startTask - Creating task
cdna-sql-party-source-connector-0
[INFO] 2023-10-10 10:40:13,313 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.ConnectorConfig logAll - ConnectorConfig
values:
config.action.reload = restart
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
errors.log.enable = true
errors.log.include.messages = true
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = all
header.converter = null
key.converter = class
org.apache.kafka.connect.storage.StringConverter
name = cdna-sql-party-source-connector
predicates = []
tasks.max = 1
transforms = []
value.converter = class org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,315 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig
logAll - EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
errors.log.enable = true
errors.log.include.messages = true
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = all
header.converter = null
key.converter = class
org.apache.kafka.connect.storage.StringConverter
name = cdna-sql-party-source-connector
predicates = []
tasks.max = 1
transforms = []
value.converter = class org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,320 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.TaskConfig logAll - TaskConfig values:
task.class = class
io.debezium.connector.sqlserver.SqlServerConnectorTask

[INFO] 2023-10-10 10:40:13,321 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.Worker startTask - Instantiated task
cdna-sql-party-source-connector-0 with version 2.2.1.Final of type
io.debezium.connector.sqlserver.SqlServerConnectorTask
[INFO] 2023-10-10 10:40:13,322 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.storage.StringConverterConfig logAll -
StringConverterConfig values:
converter.encoding = UTF-8
converter.type = key

[INFO] 2023-10-10 10:40:13,322 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.json.JsonConverterConfig logAll -
JsonConverterConfig values:
converter.type = value
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = false

[INFO] 2023-10-10 10:40:13,323 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.Worker startTask - Set up the key
converter class org.apache.kafka.connect.storage.StringConverter for task
cdna-sql-party-source-connector-0 using the connector config
[INFO] 2023-10-10 10:40:13,323 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.Worker startTask - Set up the value
converter class org.apache.kafka.connect.json.JsonConverter for task
cdna-sql-party-source-connector-0 using the connector config
[INFO] 2023-10-10 10:40:13,324 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.Worker startTask - Set up the header
converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for
task cdna-sql-party-source-connector-0 using the worker config
[INFO] 2023-10-10 10:40:13,331 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.SourceConnectorConfig logAll -
SourceConnectorConfig values:
config.action.reload = restart
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
errors.log.enable = true
errors.log.include.messages = true
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = all
exactly.once.support = requested
header.converter = null
key.converter = class
org.apache.kafka.connect.storage.StringConverter
name = cdna-sql-party-source-connector
offsets.storage.topic = null
predicates = []
tasks.max = 1
topic.creation.groups = []
transaction.boundary = poll
transaction.boundary.interval.ms = null
transforms = []
value.converter = class org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,331 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig
logAll - EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
errors.log.enable = true
errors.log.include.messages = true
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = all
exactly.once.support = requested
header.converter = null
key.converter = class
org.apache.kafka.connect.storage.StringConverter
name = cdna-sql-party-source-connector
offsets.storage.topic = null
predicates = []
tasks.max = 1
topic.creation.groups = []
transaction.boundary = poll
transaction.boundary.interval.ms = null
transforms = []
value.converter = class org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,335 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.clients.producer.ProducerConfig logAll - ProducerConfig
values:
acks = -1
batch.size = 16384
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
buffer.memory = 33554432
client.dns.lookup = use_all_dns_ips
client.id = connector-producer-cdna-sql-party-source-connector-0
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 2147483647
enable.idempotence = false
interceptor.classes = []
key.serializer = class
org.apache.kafka.common.serialization.ByteArraySerializer
linger.ms = 0
max.block.ms = 9223372036854775807
max.in.flight.requests.per.connection = 1
max.request.size = 1048576
metadata.max.age.ms = 300000
metadata.max.idle.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.adaptive.partitioning.enable = true
partitioner.availability.timeout.ms = 0
partitioner.class = null
partitioner.ignore.keys = false
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class
org.apache.kafka.common.serialization.ByteArraySerializer

[WARN] 2023-10-10 10:40:13,351 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.clients.producer.ProducerConfig logUnused - These
configurations '[metrics.context.resource.connector,
metrics.context.resource.version, metrics.context.connect.group.id,
metrics.context.resource.type, metrics.context.resource.commit.id,
metrics.context.resource.task, metrics.context.connect.kafka.cluster.id]'
were supplied but are not used yet.
[INFO] 2023-10-10 10:40:13,351 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:13,351 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:13,351 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934413351
[INFO] 2023-10-10 10:40:13,357 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.common.config.AbstractConfig logAll - AbstractConfig
values:
trace.records.enable = false
trace.records.header.converter = class
org.apache.kafka.connect.storage.SimpleHeaderConverter
trace.records.key.converter = class
org.apache.kafka.connect.json.JsonConverter
trace.records.predicates = []
trace.records.topic = connect-traces
trace.records.topic.partition = 1
trace.records.topic.replication.factor = 3
trace.records.transforms = []
trace.records.value.converter = class
org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,357 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.tracing.TracerConfig logAll - TracerConfig
values:
trace.records.enable = false
trace.records.header.converter = class
org.apache.kafka.connect.storage.SimpleHeaderConverter
trace.records.key.converter = class
org.apache.kafka.connect.json.JsonConverter
trace.records.predicates = []
trace.records.topic = connect-traces
trace.records.topic.partition = 1
trace.records.topic.replication.factor = 3
trace.records.transforms = []
trace.records.value.converter = class
org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,361 [StartAndStopExecutor-connect-1-2]
org.apache.kafka.connect.runtime.Worker doBuild - Initializing:
org.apache.kafka.connect.runtime.TransformationChain{}
[DEBUG] 2023-10-10 10:40:13,391
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask setTaskState - Setting task
state to 'INITIAL', previous state was 'INITIAL'
[INFO] 2023-10-10 10:40:13,395 [kafka-producer-network-thread |
connector-producer-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.Metadata update - [Producer
clientId=connector-producer-cdna-sql-party-source-connector-0] Cluster ID:
YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:13,399 [StartAndStopExecutor-connect-1-1]
org.apache.kafka.connect.runtime.SourceConnectorConfig logAll -
SourceConnectorConfig values:
config.action.reload = restart
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
errors.log.enable = true
errors.log.include.messages = true
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = all
exactly.once.support = requested
header.converter = null
key.converter = class
org.apache.kafka.connect.storage.StringConverter
name = cdna-sql-party-source-connector
offsets.storage.topic = null
predicates = []
tasks.max = 1
topic.creation.groups = []
transaction.boundary = poll
transaction.boundary.interval.ms = null
transforms = []
value.converter = class org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,401 [StartAndStopExecutor-connect-1-1]
org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig
logAll - EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
errors.log.enable = true
errors.log.include.messages = true
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = all
exactly.once.support = requested
header.converter = null
key.converter = class
org.apache.kafka.connect.storage.StringConverter
name = cdna-sql-party-source-connector
offsets.storage.topic = null
predicates = []
tasks.max = 1
topic.creation.groups = []
transaction.boundary = poll
transaction.boundary.interval.ms = null
transforms = []
value.converter = class org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,406 [StartAndStopExecutor-connect-1-1]
org.apache.kafka.connect.runtime.Worker startConnector - Instantiated
connector cdna-sql-party-source-connector with version 2.2.1.Final of type
class io.debezium.connector.sqlserver.SqlServerConnector
[INFO] 2023-10-10 10:40:13,410
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask start - Starting
SqlServerConnectorTask with configuration:
[INFO] 2023-10-10 10:40:13,416 [StartAndStopExecutor-connect-1-1]
org.apache.kafka.connect.runtime.Worker startConnector - Finished creating
connector cdna-sql-party-source-connector
[INFO] 2023-10-10 10:40:13,418
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
[INFO] 2023-10-10 10:40:13,421
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
errors.log.include.messages = true
[INFO] 2023-10-10 10:40:13,421
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 - tasks.max =
1
[INFO] 2023-10-10 10:40:13,422
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
errors.deadletterqueue.context.headers.enable = true
[INFO] 2023-10-10 10:40:13,422
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
topic.prefix = stc-con-grp-1
[INFO] 2023-10-10 10:40:13,422
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.consumer.ssl.truststore.location =
/mnt/sslcerts/truststore.p12
[INFO] 2023-10-10 10:40:13,422
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.kafka.topic = stc-sqlconnector-schemahistory
[INFO] 2023-10-10 10:40:13,422
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.producer.security.protocol = SSL
[INFO] 2023-10-10 10:40:13,422
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.consumer.ssl.key.password = ********
[INFO] 2023-10-10 10:40:13,422
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.producer.ssl.truststore.password = ********
[INFO] 2023-10-10 10:40:13,422
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
errors.deadletterqueue.topic.replication.factor = 1
[INFO] 2023-10-10 10:40:13,423
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.consumer.ssl.keystore.password = ********
[INFO] 2023-10-10 10:40:13,423
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
value.converter = org.apache.kafka.connect.json.JsonConverter
[INFO] 2023-10-10 10:40:13,423
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
errors.log.enable = true
[INFO] 2023-10-10 10:40:13,423
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.producer.ssl.keystore.password = ********
[INFO] 2023-10-10 10:40:13,423
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
key.converter = org.apache.kafka.connect.storage.StringConverter
[INFO] 2023-10-10 10:40:13,423
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
database.encrypt = false
[INFO] 2023-10-10 10:40:13,423
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
database.user = STC-CDNA-Ambit
[INFO] 2023-10-10 10:40:13,423
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.consumer.ssl.keystore.location =
/mnt/sslcerts/keystore.p12
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
database.names = AAFNFTE1
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.consumer.ssl.truststore.password = ********
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.kafka.bootstrap.servers =
b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
database.port = 15010
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.producer.ssl.key.password = ********
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
key.converter.schemas.enable = false
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
topic.delimiter = -
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 - task.class
= io.debezium.connector.sqlserver.SqlServerConnectorTask
[INFO] 2023-10-10 10:40:13,424
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
database.hostname = 10.135.170.165
[INFO] 2023-10-10 10:40:13,425
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
errors.deadletterqueue.topic.name = stc.dlq.connectorpoc.cdnadevstc
[INFO] 2023-10-10 10:40:13,425
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
database.password = ********
[INFO] 2023-10-10 10:40:13,425
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 - name =
cdna-sql-party-source-connector
[INFO] 2023-10-10 10:40:13,425
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.producer.ssl.keystore.location =
/mnt/sslcerts/keystore.p12
[INFO] 2023-10-10 10:40:13,425
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
value.converter.schemas.enable = false
[INFO] 2023-10-10 10:40:13,425
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
errors.tolerance = all
[INFO] 2023-10-10 10:40:13,426
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.producer.ssl.truststore.location =
/mnt/sslcerts/truststore.p12
[INFO] 2023-10-10 10:40:13,426
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 - task.id = 0
[INFO] 2023-10-10 10:40:13,426
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
table.include.list = dbo.party
[INFO] 2023-10-10 10:40:13,426
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask lambda$start$0 -
schema.history.internal.consumer.security.protocol = SSL
[INFO] 2023-10-10 10:40:13,426
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.connect.runtime.AbstractWorkerSourceTask
initializeAndStart - WorkerSourceTask{id=cdna-sql-party-source-connector-0}
Source task finished initialization and start
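
For readability, the connector properties echoed by `BaseSourceTask` above correspond to a registration payload roughly like the following. This is only a reconstruction from the logged values (secrets redacted as in the log; any property not shown in the log is omitted, and the actual JSON the poster submitted may differ):

```json
{
  "name": "cdna-sql-party-source-connector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "tasks.max": "1",
    "topic.prefix": "stc-con-grp-1",
    "database.hostname": "10.135.170.165",
    "database.port": "15010",
    "database.user": "STC-CDNA-Ambit",
    "database.password": "********",
    "database.names": "AAFNFTE1",
    "database.encrypt": "false",
    "table.include.list": "dbo.party",
    "schema.history.internal.kafka.topic": "stc-sqlconnector-schemahistory",
    "schema.history.internal.kafka.bootstrap.servers": "b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "key.converter.schemas.enable": "false",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "errors.tolerance": "all",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "errors.deadletterqueue.topic.name": "stc.dlq.connectorpoc.cdnadevstc",
    "errors.deadletterqueue.topic.replication.factor": "1",
    "errors.deadletterqueue.context.headers.enable": "true"
  }
}
```

With `topic.prefix = stc-con-grp-1` and `database.names = AAFNFTE1`, the connector will write change events to topics named `stc-con-grp-1-AAFNFTE1-dbo-party` (prefix, database, schema, table, joined by the configured delimiter), which is the naming to match when pre-creating topics.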
[INFO] 2023-10-10 10:40:13,428 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.distributed.DistributedHerder startWork -
[Worker clientId=connect-1, groupId=stc.dev.connect.cluster.consumer.7]
Finished starting connectors and tasks
[INFO] 2023-10-10 10:40:13,482
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask startIfNeededAndPossible -
Attempting to start task
[INFO] 2023-10-10 10:40:13,488 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.SourceConnectorConfig logAll -
SourceConnectorConfig values:
config.action.reload = restart
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
errors.log.enable = true
errors.log.include.messages = true
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = all
exactly.once.support = requested
header.converter = null
key.converter = class
org.apache.kafka.connect.storage.StringConverter
name = cdna-sql-party-source-connector
offsets.storage.topic = null
predicates = []
tasks.max = 1
topic.creation.groups = []
transaction.boundary = poll
transaction.boundary.interval.ms = null
transforms = []
value.converter = class org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,494 [DistributedHerder-connect-1-1]
org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig
logAll - EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = io.debezium.connector.sqlserver.SqlServerConnector
errors.log.enable = true
errors.log.include.messages = true
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = all
exactly.once.support = requested
header.converter = null
key.converter = class
org.apache.kafka.connect.storage.StringConverter
name = cdna-sql-party-source-connector
offsets.storage.topic = null
predicates = []
tasks.max = 1
topic.creation.groups = []
transaction.boundary = poll
transaction.boundary.interval.ms = null
transforms = []
value.converter = class org.apache.kafka.connect.json.JsonConverter

[INFO] 2023-10-10 10:40:13,533
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.config.CommonConnectorConfig getTopicNamingStrategy - Loading
the custom topic naming strategy plugin:
io.debezium.schema.SchemaTopicNamingStrategy
[INFO] 2023-10-10 10:40:13,586
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.storage.kafka.history.KafkaSchemaHistory configure -
KafkaSchemaHistory Consumer config:
{key.deserializer=org.apache.kafka.common.serialization.StringDeserializer,
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer,
group.id=stc-con-grp-1-schemahistory,
ssl.keystore.location=/mnt/sslcerts/keystore.p12,
bootstrap.servers=b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
security.protocol=SSL, enable.auto.commit=false,
ssl.truststore.location=/mnt/sslcerts/truststore.p12,
ssl.keystore.password=********, ssl.key.password=********,
fetch.min.bytes=1, ssl.truststore.password=********,
session.timeout.ms=10000, auto.offset.reset=earliest,
client.id=stc-con-grp-1-schemahistory}
[INFO] 2023-10-10 10:40:13,587
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.storage.kafka.history.KafkaSchemaHistory configure -
KafkaSchemaHistory Producer config: {acks=1, batch.size=32768,
ssl.keystore.location=/mnt/sslcerts/keystore.p12,
bootstrap.servers=b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
buffer.memory=1048576,
key.serializer=org.apache.kafka.common.serialization.StringSerializer,
security.protocol=SSL, retries=1,
ssl.truststore.location=/mnt/sslcerts/truststore.p12,
value.serializer=org.apache.kafka.common.serialization.StringSerializer,
ssl.keystore.password=********, ssl.key.password=********,
max.block.ms=10000, ssl.truststore.password=********,
client.id=stc-con-grp-1-schemahistory, linger.ms=0}
[INFO] 2023-10-10 10:40:13,589
[task-thread-cdna-sql-party-source-connector-0] io.debezium.util.Threads
threadFactory - Requested thread factory for connector SqlServerConnector,
id = stc-con-grp-1 named = db-history-config-check
[INFO] 2023-10-10 10:40:13,591
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.producer.ProducerConfig
postProcessAndValidateIdempotenceConfigs - Idempotence will be disabled
because acks is set to 1, not set to 'all'.
[INFO] 2023-10-10 10:40:13,591
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.producer.ProducerConfig logAll - ProducerConfig
values:
acks = 1
batch.size = 32768
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
buffer.memory = 1048576
client.dns.lookup = use_all_dns_ips
client.id = stc-con-grp-1-schemahistory
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 120000
enable.idempotence = false
interceptor.classes = []
key.serializer = class
org.apache.kafka.common.serialization.StringSerializer
linger.ms = 0
max.block.ms = 10000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metadata.max.idle.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.adaptive.partitioning.enable = true
partitioner.availability.timeout.ms = 0
partitioner.class = null
partitioner.ignore.keys = false
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 1
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class
org.apache.kafka.common.serialization.StringSerializer

[INFO] 2023-10-10 10:40:13,622
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:13,625
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:13,625
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934413622
[INFO] 2023-10-10 10:40:13,636
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.ConsumerConfig logAll - ConsumerConfig
values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = stc-con-grp-1-schemahistory
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = stc-con-grp-1-schemahistory
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class
org.apache.kafka.clients.consumer.RangeAssignor, class
org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
value.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer

[INFO] 2023-10-10 10:40:13,680 [kafka-producer-network-thread |
stc-con-grp-1-schemahistory] org.apache.kafka.clients.Metadata update -
[Producer clientId=stc-con-grp-1-schemahistory] Cluster ID:
YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:13,691
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:13,691
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:13,691
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934413691
[INFO] 2023-10-10 10:40:13,720
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.Metadata update - [Consumer
clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory]
Cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:13,848
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.internals.ConsumerCoordinator
resetStateAndGeneration - [Consumer clientId=stc-con-grp-1-schemahistory,
groupId=stc-con-grp-1-schemahistory] Resetting generation and member id due
to: consumer pro-actively leaving the group
[INFO] 2023-10-10 10:40:13,849
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.internals.ConsumerCoordinator
requestRejoin - [Consumer clientId=stc-con-grp-1-schemahistory,
groupId=stc-con-grp-1-schemahistory] Request joining group due to: consumer
pro-actively leaving the group
[INFO] 2023-10-10 10:40:13,849
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:13,849
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Closing reporter
org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:13,849
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:13,851
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info
kafka.consumer for stc-con-grp-1-schemahistory unregistered
[INFO] 2023-10-10 10:40:13,873
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask getPreviousOffsets - Found
previous partition offset SqlServerPartition
[sourcePartition={server=stc-con-grp-1, database=AAFNFTE1}]:
{transaction_id=null, event_serial_no=0, commit_lsn=00006b12:000008b3:0002,
change_lsn=NULL}
[INFO] 2023-10-10 10:40:13,873
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask getPreviousOffsets - Found
previous partition offset SqlServerPartition
[sourcePartition={server=stc-con-grp-1, database=AAFNFTE1}]:
{transaction_id=null, event_serial_no=0, commit_lsn=00006b12:000008b3:0002,
change_lsn=NULL}
[INFO] 2023-10-10 10:40:13,875
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.ConsumerConfig logAll - ConsumerConfig
values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = stc-con-grp-1-schemahistory
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = stc-con-grp-1-schemahistory
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class
org.apache.kafka.clients.consumer.RangeAssignor, class
org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
value.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer

[INFO] 2023-10-10 10:40:13,889
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:13,889
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:13,890
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934413889
[INFO] 2023-10-10 10:40:13,928
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.Metadata update - [Consumer
clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory]
Cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:14,025
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.internals.ConsumerCoordinator
resetStateAndGeneration - [Consumer clientId=stc-con-grp-1-schemahistory,
groupId=stc-con-grp-1-schemahistory] Resetting generation and member id due
to: consumer pro-actively leaving the group
[INFO] 2023-10-10 10:40:14,025
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.internals.ConsumerCoordinator
requestRejoin - [Consumer clientId=stc-con-grp-1-schemahistory,
groupId=stc-con-grp-1-schemahistory] Request joining group due to: consumer
pro-actively leaving the group
[INFO] 2023-10-10 10:40:14,026
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:14,026
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Closing reporter
org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:14,026
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:14,028
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info
kafka.consumer for stc-con-grp-1-schemahistory unregistered
[INFO] 2023-10-10 10:40:14,029
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.ConsumerConfig logAll - ConsumerConfig
values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = stc-con-grp-1-schemahistory
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = stc-con-grp-1-schemahistory
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class
org.apache.kafka.clients.consumer.RangeAssignor, class
org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
value.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer

[INFO] 2023-10-10 10:40:14,034 [pool-7-thread-1]
io.debezium.jdbc.JdbcConnection lambda$doClose$4 - Connection gracefully
closed
[INFO] 2023-10-10 10:40:14,048
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:14,048
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:14,049
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934414048
[INFO] 2023-10-10 10:40:14,049
[task-thread-cdna-sql-party-source-connector-0] io.debezium.util.Threads
newThread - Creating thread
debezium-sqlserverconnector-stc-con-grp-1-db-history-config-check
[INFO] 2023-10-10 10:40:14,051
[debezium-sqlserverconnector-stc-con-grp-1-db-history-config-check]
org.apache.kafka.clients.admin.AdminClientConfig logAll - AdminClientConfig
values:
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
client.dns.lookup = use_all_dns_ips
client.id = stc-con-grp-1-schemahistory-topic-check
confluent.use.controller.listener = false
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
host.resolver.class = class
org.apache.kafka.clients.DefaultHostResolver
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 1
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS

[WARN] 2023-10-10 10:40:14,067
[debezium-sqlserverconnector-stc-con-grp-1-db-history-config-check]
org.apache.kafka.clients.admin.AdminClientConfig logUnused - These
configurations '[acks, batch.size, buffer.memory, key.serializer,
value.serializer, max.block.ms, linger.ms]' were supplied but are not used
yet.
[INFO] 2023-10-10 10:40:14,067
[debezium-sqlserverconnector-stc-con-grp-1-db-history-config-check]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:14,067
[debezium-sqlserverconnector-stc-con-grp-1-db-history-config-check]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId:
6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:14,067
[debezium-sqlserverconnector-stc-con-grp-1-db-history-config-check]
org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs:
1696934414067
[INFO] 2023-10-10 10:40:14,078
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.Metadata update - [Consumer
clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory]
Cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:14,108
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.internals.ConsumerCoordinator
resetStateAndGeneration - [Consumer clientId=stc-con-grp-1-schemahistory,
groupId=stc-con-grp-1-schemahistory] Resetting generation and member id due
to: consumer pro-actively leaving the group
[INFO] 2023-10-10 10:40:14,108
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.internals.ConsumerCoordinator
requestRejoin - [Consumer clientId=stc-con-grp-1-schemahistory,
groupId=stc-con-grp-1-schemahistory] Request joining group due to: consumer
pro-actively leaving the group
[INFO] 2023-10-10 10:40:14,109
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:14,109
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Closing reporter
org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:14,109
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:14,111
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info
kafka.consumer for stc-con-grp-1-schemahistory unregistered
[INFO] 2023-10-10 10:40:14,112
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.relational.history.SchemaHistoryMetrics recoveryStarted -
Started database schema history recovery
[INFO] 2023-10-10 10:40:14,180
[task-thread-cdna-sql-party-source-connector-0]
org.apache.kafka.clients.consumer.ConsumerConfig logAll - ConsumerConfig
values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers =
[b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = stc-con-grp-1-schemahistory
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = stc-con-grp-1-schemahistory
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class
org.apache.kafka.clients.consumer.RangeAssignor, class
org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = [hidden]
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = /mnt/sslcerts/keystore.p12
ssl.keystore.password = [hidden]
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = /mnt/sslcerts/truststore.p12
ssl.truststore.password = [hidden]
ssl.truststore.type = JKS
value.deserializer = class
org.apache.kafka.common.serialization.StringDeserializer

[INFO] 2023-10-10 10:40:14,197 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.common.utils.AppInfoParser <init> - Kafka version: 7.3.2-ce
[INFO] 2023-10-10 10:40:14,197 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.common.utils.AppInfoParser <init> - Kafka commitId: 6fa4ddf4260ddd15
[INFO] 2023-10-10 10:40:14,197 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.common.utils.AppInfoParser <init> - Kafka startTimeMs: 1696934414197
[INFO] 2023-10-10 10:40:14,198 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.KafkaConsumer subscribe - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Subscribed to topic(s): stc-sqlconnector-schemahistory
[INFO] 2023-10-10 10:40:14,222 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.Metadata update - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Cluster ID: YGTL-peLQIeAIy9hwXCoHg
[INFO] 2023-10-10 10:40:14,233 [kafka-admin-client-thread | stc-con-grp-1-schemahistory-topic-check] org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info kafka.admin.client for stc-con-grp-1-schemahistory-topic-check unregistered
[INFO] 2023-10-10 10:40:14,235 [kafka-admin-client-thread | stc-con-grp-1-schemahistory-topic-check] org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:14,235 [kafka-admin-client-thread | stc-con-grp-1-schemahistory-topic-check] org.apache.kafka.common.metrics.Metrics close - Closing reporter org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:14,235 [kafka-admin-client-thread | stc-con-grp-1-schemahistory-topic-check] org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:14,237 [debezium-sqlserverconnector-stc-con-grp-1-db-history-config-check] io.debezium.storage.kafka.history.KafkaSchemaHistory lambda$checkTopicSettings$0 - Attempted to validate database schema history topic but failed
java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TopicAuthorizationException: Topic authorization failed.
	at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
	at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2022)
	at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:180)
	at io.debezium.storage.kafka.history.KafkaSchemaHistory.lambda$checkTopicSettings$0(KafkaSchemaHistory.java:423)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.kafka.common.errors.TopicAuthorizationException: Topic authorization failed.
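The TopicAuthorizationException in the schema history topic check above usually means the connector's Kafka principal lacks ACLs (at least Describe, Read, and Write) on the stc-sqlconnector-schemahistory topic, or that the schema history producer/consumer are not inheriting the worker's SSL settings. A minimal sketch of the relevant Debezium connector properties, reusing the bootstrap host and keystore paths that appear elsewhere in this log (passwords omitted; exact values are deployment-specific assumptions):

```json
{
  "schema.history.internal.kafka.topic": "stc-sqlconnector-schemahistory",
  "schema.history.internal.kafka.bootstrap.servers": "b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093",
  "schema.history.internal.producer.security.protocol": "SSL",
  "schema.history.internal.producer.ssl.keystore.location": "/mnt/sslcerts/keystore.p12",
  "schema.history.internal.producer.ssl.truststore.location": "/mnt/sslcerts/truststore.p12",
  "schema.history.internal.consumer.security.protocol": "SSL",
  "schema.history.internal.consumer.ssl.keystore.location": "/mnt/sslcerts/keystore.p12",
  "schema.history.internal.consumer.ssl.truststore.location": "/mnt/sslcerts/truststore.p12"
}
```

If ACLs are the cause, the principal also needs Read on the schema history consumer group (stc-con-grp-1-schemahistory here).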
[INFO] 2023-10-10 10:40:14,251 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator onSuccess - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Discovered group coordinator b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093 (id: 2147483647 rack: null)
[INFO] 2023-10-10 10:40:14,252 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator sendJoinGroupRequest - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] (Re-)joining group
[INFO] 2023-10-10 10:40:14,276 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator requestRejoin - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Request joining group due to: need to re-join with the given member-id: stc-con-grp-1-schemahistory-2f898fd1-b91a-4171-b0f2-ae641b3e2516
[INFO] 2023-10-10 10:40:14,276 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator requestRejoin - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' (MemberIdRequiredException)
[INFO] 2023-10-10 10:40:14,276 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator sendJoinGroupRequest - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] (Re-)joining group
[INFO] 2023-10-10 10:40:17,280 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator handle - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Successfully joined group with generation Generation{generationId=1, memberId='stc-con-grp-1-schemahistory-2f898fd1-b91a-4171-b0f2-ae641b3e2516', protocol='range'}
[INFO] 2023-10-10 10:40:17,283 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator onLeaderElected - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Finished assignment for group at generation 1: {stc-con-grp-1-schemahistory-2f898fd1-b91a-4171-b0f2-ae641b3e2516=Assignment(partitions=[stc-sqlconnector-schemahistory-0])}
[INFO] 2023-10-10 10:40:17,292 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator handle - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Successfully synced group in generation Generation{generationId=1, memberId='stc-con-grp-1-schemahistory-2f898fd1-b91a-4171-b0f2-ae641b3e2516', protocol='range'}
[INFO] 2023-10-10 10:40:17,292 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator invokeOnAssignment - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Notifying assignor about the new Assignment(partitions=[stc-sqlconnector-schemahistory-0])
[INFO] 2023-10-10 10:40:17,292 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator invokePartitionsAssigned - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Adding newly assigned partitions: stc-sqlconnector-schemahistory-0
[INFO] 2023-10-10 10:40:17,298 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator handle - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Found no committed offset for partition stc-sqlconnector-schemahistory-0
[INFO] 2023-10-10 10:40:17,352 [task-thread-cdna-sql-party-source-connector-0] io.debezium.relational.history.SchemaHistoryMetrics onChangeFromHistory - Database schema history recovery in progress, recovered 1 records
[INFO] 2023-10-10 10:40:17,394 [task-thread-cdna-sql-party-source-connector-0] io.debezium.relational.history.SchemaHistoryMetrics onChangeApplied - Already applied 1 database changes
[INFO] 2023-10-10 10:40:17,416 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator invokePartitionsRevoked - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Revoke previously assigned partitions stc-sqlconnector-schemahistory-0
[INFO] 2023-10-10 10:40:17,417 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator maybeLeaveGroup - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Member stc-con-grp-1-schemahistory-2f898fd1-b91a-4171-b0f2-ae641b3e2516 sending LeaveGroup request to coordinator b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093 (id: 2147483647 rack: null) due to the consumer is being closed
[INFO] 2023-10-10 10:40:17,418 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator resetStateAndGeneration - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Resetting generation and member id due to: consumer pro-actively leaving the group
[INFO] 2023-10-10 10:40:17,418 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.clients.consumer.internals.ConsumerCoordinator requestRejoin - [Consumer clientId=stc-con-grp-1-schemahistory, groupId=stc-con-grp-1-schemahistory] Request joining group due to: consumer pro-actively leaving the group
[INFO] 2023-10-10 10:40:17,426 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.common.metrics.Metrics close - Metrics scheduler closed
[INFO] 2023-10-10 10:40:17,426 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.common.metrics.Metrics close - Closing reporter org.apache.kafka.common.metrics.JmxReporter
[INFO] 2023-10-10 10:40:17,426 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.common.metrics.Metrics close - Metrics reporters closed
[INFO] 2023-10-10 10:40:17,428 [task-thread-cdna-sql-party-source-connector-0] org.apache.kafka.common.utils.AppInfoParser unregisterAppInfo - App info kafka.consumer for stc-con-grp-1-schemahistory unregistered
[INFO] 2023-10-10 10:40:17,429 [task-thread-cdna-sql-party-source-connector-0] io.debezium.relational.history.SchemaHistoryMetrics recoveryStopped - Finished database schema history recovery of 6 change(s) in 3316 ms
[TRACE] 2023-10-10 10:40:17,437 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,438 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@375fb548
[TRACE] 2023-10-10 10:40:17,441 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,442 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,442 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@1f9c79f
[TRACE] 2023-10-10 10:40:17,442 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,442 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,442 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@59de658d
[TRACE] 2023-10-10 10:40:17,443 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,443 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,444 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@16549416
[TRACE] 2023-10-10 10:40:17,444 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,444 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,444 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@28cc3b45
[TRACE] 2023-10-10 10:40:17,445 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,446 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,446 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@205c773f
[TRACE] 2023-10-10 10:40:17,446 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[WARN] 2023-10-10 10:40:17,447 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_rate' not found.
[TRACE] 2023-10-10 10:40:17,449 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,449 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@4a6b3a9d
[TRACE] 2023-10-10 10:40:17,449 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,450 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,450 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@5d1224ed
[TRACE] 2023-10-10 10:40:17,450 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,450 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,451 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@38a0dd5c
[TRACE] 2023-10-10 10:40:17,451 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,451 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,452 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@2405b760
[TRACE] 2023-10-10 10:40:17,452 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,452 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,452 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@37a9ceaf
[TRACE] 2023-10-10 10:40:17,453 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[WARN] 2023-10-10 10:40:17,652 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Cannot parse column default value '(CONVERT([datetime],'19000101',(112)))' to type 'datetime'. Expression evaluation is not supported.
[DEBUG] 2023-10-10 10:40:17,653 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Parsing failed due to error
com.microsoft.sqlserver.jdbc.SQLServerException: Error converting string value 'ONVERT([datetime],'19000101',(112)' into data type datetime using culture ''.
	at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet$FetchBuffer.nextRow(SQLServerResultSet.java:5479)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet.fetchBufferNext(SQLServerResultSet.java:1798)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1056)
	at io.debezium.jdbc.JdbcConnection.querySingleValue(JdbcConnection.java:1474)
	at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$22(SqlServerDefaultValueConverter.java:134)
	at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:73)
	at io.debezium.relational.TableSchemaBuilder.lambda$addField$9(TableSchemaBuilder.java:378)
	at java.base/java.util.Optional.flatMap(Optional.java:294)
	at io.debezium.relational.TableSchemaBuilder.addField(TableSchemaBuilder.java:378)
	at io.debezium.relational.TableSchemaBuilder.lambda$create$2(TableSchemaBuilder.java:148)
	at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
	at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
	at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
	at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
	at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:146)
	at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:122)
	at io.debezium.relational.HistorizedRelationalDatabaseSchema.recover(HistorizedRelationalDatabaseSchema.java:74)
	at io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:88)
	at io.debezium.connector.common.BaseSourceTask.startIfNeededAndPossible(BaseSourceTask.java:244)
	at io.debezium.connector.common.BaseSourceTask.poll(BaseSourceTask.java:153)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.poll(AbstractWorkerSourceTask.java:470)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:349)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:201)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:256)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:75)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
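The "Cannot parse column default value" warning comes from a default constraint stored as an expression, which Debezium does not evaluate; it logs the warning, omits the default from the emitted schema, and continues. An illustrative T-SQL shape that would produce exactly this stored default (table and column names are hypothetical):

```sql
-- SQL Server persists the default as the expression text
-- (CONVERT([datetime],'19000101',(112))), i.e. 1900-01-01 parsed
-- with style 112 (yyyymmdd). Hypothetical column definition:
ALTER TABLE dbo.party ADD effective_date datetime NOT NULL
    CONSTRAINT DF_party_effective_date
    DEFAULT (CONVERT([datetime], '19000101', (112)));
```

Since the expression cannot be evaluated, the warning is noisy but not fatal; the task proceeds past it.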
[TRACE] 2023-10-10 10:40:17,654 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 4700 ***
[TRACE] 2023-10-10 10:40:17,654 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@2c4bb386
[TRACE] 2023-10-10 10:40:17,654 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 4700]
[WARN] 2023-10-10 10:40:17,654 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_extname' not found.
[WARN] 2023-10-10 10:40:17,654 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_encrypted_30' not found.
[WARN] 2023-10-10 10:40:17,654 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_encrypted_30' not found.
[WARN] 2023-10-10 10:40:17,654 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_encrypted_30' not found.
[WARN] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_encrypted_30' not found.
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 5406 ***
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@4df30d73
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 5406]
[WARN] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_name' not found.
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@11354dc6
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@622991b4
[TRACE] 2023-10-10 10:40:17,655 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@130fd9f4
[TRACE] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[WARN] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_code' not found.
[WARN] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_name' not found.
[WARN] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_amount' not found.
[WARN] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_extname' not found.
[TRACE] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@4642d673
[TRACE] 2023-10-10 10:40:17,656 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = false]
[WARN] 2023-10-10 10:40:17,682 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Cannot parse column default value '(CONVERT([datetime],'19000101',(112)))' to type 'datetime'. Expression evaluation is not supported.
[DEBUG] 2023-10-10 10:40:17,682 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Parsing failed due to error
com.microsoft.sqlserver.jdbc.SQLServerException: Error converting string value 'ONVERT([datetime],'19000101',(112)' into data type datetime using culture ''.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet$FetchBuffer.nextRow(SQLServerResultSet.java:5479)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.fetchBufferNext(SQLServerResultSet.java:1798)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1056)
at io.debezium.jdbc.JdbcConnection.querySingleValue(JdbcConnection.java:1474)
at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$22(SqlServerDefaultValueConverter.java:134)
at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:73)
at io.debezium.relational.TableSchemaBuilder.lambda$addField$9(TableSchemaBuilder.java:378)
at java.base/java.util.Optional.flatMap(Optional.java:294)
at io.debezium.relational.TableSchemaBuilder.addField(TableSchemaBuilder.java:378)
at io.debezium.relational.TableSchemaBuilder.lambda$create$2(TableSchemaBuilder.java:148)
at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:146)
at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:122)
at io.debezium.relational.HistorizedRelationalDatabaseSchema.recover(HistorizedRelationalDatabaseSchema.java:74)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:88)
at io.debezium.connector.common.BaseSourceTask.startIfNeededAndPossible(BaseSourceTask.java:244)
at io.debezium.connector.common.BaseSourceTask.poll(BaseSourceTask.java:153)
at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.poll(AbstractWorkerSourceTask.java:470)
at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:349)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:201)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:256)
at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:75)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
[TRACE] 2023-10-10 10:40:17,683 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 15800 ***
[TRACE] 2023-10-10 10:40:17,683 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@53286221
[TRACE] 2023-10-10 10:40:17,683 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 15800]
[WARN] 2023-10-10 10:40:17,683 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_short_comment' not found.
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@3e85d1c9
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@6ea46924
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@42cfe18
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@1405add6
[TRACE] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = false]
[WARN] 2023-10-10 10:40:17,684 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_name' not found.
[WARN] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_short_comment' not found.
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@41e3d06
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@493e99e8
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = false]
[WARN] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_short_comment' not found.
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@2a8162e9
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from data object: *** 17801 ***
[TRACE] 2023-10-10 10:40:17,685 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Callback is: io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@13d30172
[TRACE] 2023-10-10 10:40:17,686 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerValueConverters convertValue - Value from ResultReceiver: [received = true, object = 17801]
[WARN] 2023-10-10 10:40:17,686 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Mapper for type 'axdt_extname' not found.
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@66d0e803
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@66d0e803
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@5b1ccce8
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@5b1ccce8
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,686
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[WARN] 2023-10-10 10:40:17,712 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Cannot parse column default value '(CONVERT([datetime],'19000101',(112)))' to type 'datetime'. Expression evaluation is not supported.
[DEBUG] 2023-10-10 10:40:17,712 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Parsing failed due to error
com.microsoft.sqlserver.jdbc.SQLServerException: Error converting string value 'ONVERT([datetime],'19000101',(112)' into data type datetime using culture ''.
	at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet$FetchBuffer.nextRow(SQLServerResultSet.java:5479)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet.fetchBufferNext(SQLServerResultSet.java:1798)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1056)
	at io.debezium.jdbc.JdbcConnection.querySingleValue(JdbcConnection.java:1474)
	at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$22(SqlServerDefaultValueConverter.java:134)
	at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:73)
	at io.debezium.relational.TableSchemaBuilder.lambda$addField$9(TableSchemaBuilder.java:378)
	at java.base/java.util.Optional.flatMap(Optional.java:294)
	at io.debezium.relational.TableSchemaBuilder.addField(TableSchemaBuilder.java:378)
	at io.debezium.relational.TableSchemaBuilder.lambda$create$2(TableSchemaBuilder.java:148)
	at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
	at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
	at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
	at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
	at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:146)
	at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:122)
	at io.debezium.relational.HistorizedRelationalDatabaseSchema.recover(HistorizedRelationalDatabaseSchema.java:74)
	at io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:88)
	at io.debezium.connector.common.BaseSourceTask.startIfNeededAndPossible(BaseSourceTask.java:244)
	at io.debezium.connector.common.BaseSourceTask.poll(BaseSourceTask.java:153)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.poll(AbstractWorkerSourceTask.java:470)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:349)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:201)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:256)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:75)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
[WARN] 2023-10-10 10:40:17,738 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Cannot parse column default value '(CONVERT([datetime],'19000101',(112)))' to type 'datetime'. Expression evaluation is not supported.
[DEBUG] 2023-10-10 10:40:17,738 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Parsing failed due to error
com.microsoft.sqlserver.jdbc.SQLServerException: Error converting string value 'ONVERT([datetime],'19000101',(112)' into data type datetime using culture ''.
	at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet$FetchBuffer.nextRow(SQLServerResultSet.java:5479)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet.fetchBufferNext(SQLServerResultSet.java:1798)
	at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1056)
	at io.debezium.jdbc.JdbcConnection.querySingleValue(JdbcConnection.java:1474)
	at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$22(SqlServerDefaultValueConverter.java:134)
	at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:73)
	at io.debezium.relational.TableSchemaBuilder.lambda$addField$9(TableSchemaBuilder.java:378)
	at java.base/java.util.Optional.flatMap(Optional.java:294)
	at io.debezium.relational.TableSchemaBuilder.addField(TableSchemaBuilder.java:378)
	at io.debezium.relational.TableSchemaBuilder.lambda$create$2(TableSchemaBuilder.java:148)
	at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
	at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
	at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
	at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
	at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:146)
	at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:122)
	at io.debezium.relational.HistorizedRelationalDatabaseSchema.recover(HistorizedRelationalDatabaseSchema.java:74)
	at io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:88)
	at io.debezium.connector.common.BaseSourceTask.startIfNeededAndPossible(BaseSourceTask.java:244)
	at io.debezium.connector.common.BaseSourceTask.poll(BaseSourceTask.java:153)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.poll(AbstractWorkerSourceTask.java:470)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:349)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:201)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:256)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:75)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
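A note on the warning and stack trace above: Debezium's SqlServerDefaultValueConverter can only parse literal column defaults, not T-SQL expressions such as `(CONVERT([datetime],'19000101',(112)))`, so it logs the failure and falls back to no default; the connector keeps running. If the noise is a concern, one possible workaround (a sketch only — the table, constraint, and column names below are hypothetical, not taken from this log) is to replace the expression-based default with a plain literal:

```sql
-- Hypothetical names: dbo.party, DF_party_created, created_dt.
-- Drop the expression-based default and re-add it as a literal,
-- which the Debezium default-value parser can read directly.
ALTER TABLE dbo.party DROP CONSTRAINT DF_party_created;
ALTER TABLE dbo.party ADD CONSTRAINT DF_party_created
    DEFAULT '19000101' FOR created_dt;
```

Whether this is worth doing depends on whether consumers of the change events actually rely on the default value being present in the emitted schema.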
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@10895322
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@10895322
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@2d4fa657
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@2d4fa657
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,739
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@c1f798d
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@c1f798d
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@3cb8b2bd
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@3cb8b2bd
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@66f2e64b
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@66f2e64b
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 23100 ***
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 23100 ***
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@365f89dc
[TRACE] 2023-10-10 10:40:17,740
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@365f89dc
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 23100]
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 23100]
[WARN] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Mapper for type 'axdt_amount' not found.
[WARN] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Mapper for type 'axdt_amount' not found.
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@417b5d8b
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@417b5d8b
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@38af2243
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@38af2243
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@499e4ba3
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@499e4ba3
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@3c53a25
[TRACE] 2023-10-10 10:40:17,741
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@3c53a25
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@18c9a8d3
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[WARN] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Mapper for type 'axdt_encrypted_30' not found.
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@59b53cb3
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,742
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@62cda775
[TRACE] 2023-10-10 10:40:17,743
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,743
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,743
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@48f3bd1b
[TRACE] 2023-10-10 10:40:17,743
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[WARN] 2023-10-10 10:40:17,769
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Cannot parse column default value
'(CONVERT([datetime],'19000101',(112)))' to type 'datetime'. Expression
evaluation is not supported.
[DEBUG] 2023-10-10 10:40:17,769 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Parsing failed due to error
com.microsoft.sqlserver.jdbc.SQLServerException: Error converting string value 'ONVERT([datetime],'19000101',(112' into data type datetime using culture ''.
    at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)
    at com.microsoft.sqlserver.jdbc.SQLServerResultSet$FetchBuffer.nextRow(SQLServerResultSet.java:5479)
    at com.microsoft.sqlserver.jdbc.SQLServerResultSet.fetchBufferNext(SQLServerResultSet.java:1798)
    at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1056)
    at io.debezium.jdbc.JdbcConnection.querySingleValue(JdbcConnection.java:1474)
    at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$22(SqlServerDefaultValueConverter.java:134)
    at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:73)
    at io.debezium.relational.TableSchemaBuilder.lambda$addField$9(TableSchemaBuilder.java:378)
    at java.base/java.util.Optional.flatMap(Optional.java:294)
    at io.debezium.relational.TableSchemaBuilder.addField(TableSchemaBuilder.java:378)
    at io.debezium.relational.TableSchemaBuilder.lambda$create$2(TableSchemaBuilder.java:148)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
    at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
    at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
    at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:146)
    at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:122)
    at io.debezium.relational.HistorizedRelationalDatabaseSchema.recover(HistorizedRelationalDatabaseSchema.java:74)
    at io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:88)
    at io.debezium.connector.common.BaseSourceTask.startIfNeededAndPossible(BaseSourceTask.java:244)
    at io.debezium.connector.common.BaseSourceTask.poll(BaseSourceTask.java:153)
    at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.poll(AbstractWorkerSourceTask.java:470)
    at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:349)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:201)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:256)
    at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:75)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
[TRACE] 2023-10-10 10:40:17,769
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@6d09ec44
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 90 ***
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@6270089f
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 90]
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 39100 ***
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@373e7aaf
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 39100]
[TRACE] 2023-10-10 10:40:17,770
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@621d2185
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@40d48a4e
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@2202563b
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@662dbbd6
[TRACE] 2023-10-10 10:40:17,771
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[WARN] 2023-10-10 10:40:17,772
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Mapper for type 'axdt_name' not found.
[WARN] 2023-10-10 10:40:17,772
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Mapper for type 'axdt_name' not found.
[TRACE] 2023-10-10 10:40:17,772
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** false ***
[TRACE] 2023-10-10 10:40:17,772
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@6db7bf9f
[TRACE] 2023-10-10 10:40:17,772
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = false]
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@955ce9d
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 64000 ***
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@39d6f202
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 64000]
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@32ae0de5
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@62351768
[TRACE] 2023-10-10 10:40:17,773
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,774
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** true ***
[TRACE] 2023-10-10 10:40:17,774
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@45b31c95
[TRACE] 2023-10-10 10:40:17,774
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = true]
[WARN] 2023-10-10 10:40:17,799
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Cannot parse column default value
'(CONVERT([datetime],'19000101',(112)))' to type 'datetime'. Expression
evaluation is not supported.
[DEBUG] 2023-10-10 10:40:17,800 [task-thread-cdna-sql-party-source-connector-0] io.debezium.connector.sqlserver.SqlServerDefaultValueConverter parseDefaultValue - Parsing failed due to error
com.microsoft.sqlserver.jdbc.SQLServerException: Error converting string value 'ONVERT([datetime],'19000101',(112' into data type datetime using culture ''.
    at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)
    at com.microsoft.sqlserver.jdbc.SQLServerResultSet$FetchBuffer.nextRow(SQLServerResultSet.java:5479)
    at com.microsoft.sqlserver.jdbc.SQLServerResultSet.fetchBufferNext(SQLServerResultSet.java:1798)
    at com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1056)
    at io.debezium.jdbc.JdbcConnection.querySingleValue(JdbcConnection.java:1474)
    at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$22(SqlServerDefaultValueConverter.java:134)
    at io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:73)
    at io.debezium.relational.TableSchemaBuilder.lambda$addField$9(TableSchemaBuilder.java:378)
    at java.base/java.util.Optional.flatMap(Optional.java:294)
    at io.debezium.relational.TableSchemaBuilder.addField(TableSchemaBuilder.java:378)
    at io.debezium.relational.TableSchemaBuilder.lambda$create$2(TableSchemaBuilder.java:148)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
    at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
    at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
    at io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:146)
    at io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:122)
    at io.debezium.relational.HistorizedRelationalDatabaseSchema.recover(HistorizedRelationalDatabaseSchema.java:74)
    at io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:88)
    at io.debezium.connector.common.BaseSourceTask.startIfNeededAndPossible(BaseSourceTask.java:244)
    at io.debezium.connector.common.BaseSourceTask.poll(BaseSourceTask.java:153)
at
org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.poll(AbstractWorkerSourceTask.java:470)
at
org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:349)
at
org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:201)
at
org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:256)
at
org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:75)
at
java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at
java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
[TRACE] 2023-10-10 10:40:17,800
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,800
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1099/0x00000001018dd840@9784d6c
[TRACE] 2023-10-10 10:40:17,800
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[WARN] 2023-10-10 10:40:17,826
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Cannot parse column default value
'(CONVERT([datetime],'19000101',(112)))' to type 'datetime'. Expression
evaluation is not supported.
[DEBUG] 2023-10-10 10:40:17,826
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter
parseDefaultValue - Parsing failed due to error
com.microsoft.sqlserver.jdbc.SQLServerException: Error converting string
value 'ONVERT([datetime],'19000101',(112)' into data type datetime using
culture ''.
at
com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)
at
com.microsoft.sqlserver.jdbc.SQLServerResultSet$FetchBuffer.nextRow(SQLServerResultSet.java:5479)
at
com.microsoft.sqlserver.jdbc.SQLServerResultSet.fetchBufferNext(SQLServerResultSet.java:1798)
at
com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:1056)
at
io.debezium.jdbc.JdbcConnection.querySingleValue(JdbcConnection.java:1474)
at
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.lambda$createDefaultValueMappers$22(SqlServerDefaultValueConverter.java:134)
at
io.debezium.connector.sqlserver.SqlServerDefaultValueConverter.parseDefaultValue(SqlServerDefaultValueConverter.java:73)
at
io.debezium.relational.TableSchemaBuilder.lambda$addField$9(TableSchemaBuilder.java:378)
at java.base/java.util.Optional.flatMap(Optional.java:294)
at
io.debezium.relational.TableSchemaBuilder.addField(TableSchemaBuilder.java:378)
at
io.debezium.relational.TableSchemaBuilder.lambda$create$2(TableSchemaBuilder.java:148)
at
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
at
java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
at
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
at
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
at
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
at
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
at
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
at
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
at
io.debezium.relational.TableSchemaBuilder.create(TableSchemaBuilder.java:146)
at
io.debezium.relational.RelationalDatabaseSchema.buildAndRegisterSchema(RelationalDatabaseSchema.java:122)
at
io.debezium.relational.HistorizedRelationalDatabaseSchema.recover(HistorizedRelationalDatabaseSchema.java:74)
at
io.debezium.connector.sqlserver.SqlServerConnectorTask.start(SqlServerConnectorTask.java:88)
at
io.debezium.connector.common.BaseSourceTask.startIfNeededAndPossible(BaseSourceTask.java:244)
at
io.debezium.connector.common.BaseSourceTask.poll(BaseSourceTask.java:153)
at
org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.poll(AbstractWorkerSourceTask.java:470)
at
org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:349)
at
org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:201)
at
org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:256)
at
org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:75)
at
java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at
java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 0 ***
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@1e366e3a
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 0]
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** true ***
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1096/0x00000001018dcc40@8dda887
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = true]
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 19300 ***
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@438587dd
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 19300]
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from data object: *** 19300 ***
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Callback is:
io.debezium.jdbc.JdbcValueConverters$$Lambda$1093/0x00000001018d9040@6435edb4
[TRACE] 2023-10-10 10:40:17,827
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.sqlserver.SqlServerValueConverters convertValue -
Value from ResultReceiver: [received = true, object = 19300]
[INFO] 2023-10-10 10:40:17,847
[task-thread-cdna-sql-party-source-connector-0] io.debezium.util.Threads
threadFactory - Requested thread factory for connector SqlServerConnector,
id = stc-con-grp-1 named = change-event-source-coordinator
[INFO] 2023-10-10 10:40:17,854
[task-thread-cdna-sql-party-source-connector-0] io.debezium.util.Threads
newThread - Creating thread
debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator
[INFO] 2023-10-10 10:40:17,855
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask startIfNeededAndPossible -
Successfully started task
[DEBUG] 2023-10-10 10:40:17,855
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.common.BaseSourceTask setTaskState - Setting task
state to 'RUNNING', previous state was 'INITIAL'
[DEBUG] 2023-10-10 10:40:17,855
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.base.ChangeEventQueue poll - polling records...
[DEBUG] 2023-10-10 10:40:17,856
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.base.ChangeEventQueue poll - no records available or
batch size not reached yet, sleeping a bit...
[INFO] 2023-10-10 10:40:17,859
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.pipeline.ChangeEventSourceCoordinator lambda$start$0 - Metrics
registered
[INFO] 2023-10-10 10:40:17,859
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.pipeline.ChangeEventSourceCoordinator lambda$start$0 - Context
created
[INFO] 2023-10-10 10:40:17,866
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource
getSnapshottingTask - A previous offset indicating a completed snapshot has
been found. Neither schema nor data will be snapshotted.
[INFO] 2023-10-10 10:40:17,900
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.pipeline.ChangeEventSourceCoordinator doSnapshot - Snapshot
ended with SnapshotResult [status=SKIPPED, offset=SqlServerOffsetContext
[sourceInfoSchema=Schema{io.debezium.connector.sqlserver.Source:STRUCT},
sourceInfo=SourceInfo [serverName=stc-con-grp-1, changeLsn=NULL,
commitLsn=00006b12:000008b3:0002, eventSerialNo=null, snapshot=FALSE,
sourceTime=null], snapshotCompleted=true, eventSerialNo=0]]
[INFO] 2023-10-10 10:40:17,904
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.pipeline.ChangeEventSourceCoordinator streamingConnected -
Connected metrics set to 'true'
[INFO] 2023-10-10 10:40:17,905
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.connector.sqlserver.SqlServerChangeEventSourceCoordinator
executeChangeEventSources - Starting streaming
[INFO] 2023-10-10 10:40:17,907
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource
executeIteration - Last position recorded in offsets is
00006b12:000008b3:0002(NULL)[0]
[TRACE] 2023-10-10 10:40:18,104
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.connector.sqlserver.SqlServerConnection
lambda$getMaxTransactionLsn$6 - Max transaction lsn is
00006b12:0000080f:0005
[DEBUG] 2023-10-10 10:40:18,104
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource
executeIteration - No change in the database
[DEBUG] 2023-10-10 10:40:18,357
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.base.ChangeEventQueue poll - checking for more
records...
[DEBUG] 2023-10-10 10:40:18,357
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.base.ChangeEventQueue poll - polling records...
[DEBUG] 2023-10-10 10:40:18,358
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.base.ChangeEventQueue poll - no records available or
batch size not reached yet, sleeping a bit...
[TRACE] 2023-10-10 10:40:18,431
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.connector.sqlserver.SqlServerConnection
lambda$getMaxTransactionLsn$6 - Max transaction lsn is
00006b12:0000080f:0005
[DEBUG] 2023-10-10 10:40:18,431
[debezium-sqlserverconnector-stc-con-grp-1-change-event-source-coordinator]
io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource
executeIteration - No change in the database
[DEBUG] 2023-10-10 10:40:18,858
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.base.ChangeEventQueue poll - checking for more
records...
[DEBUG] 2023-10-10 10:40:18,858
[task-thread-cdna-sql-party-source-connector-0]
io.debezium.connector.base.ChangeEventQueue poll - no records available or
batch size not reached yet, sleeping a bit...


Aqhil Mohammad

unread,
Oct 11, 2023, 7:57:09 AM10/11/23
to debezium
Hi,

I think I found the problem: CDC is not enabled properly on the tables.

When I query the change table <schema>_<table_name>_CT, it doesn't return anything.

I also ran:

SELECT * FROM msdb.dbo.cdc_jobs;

database_id  job_type  job_id                                maxtrans  maxscans  continuous  pollinginterval  retention  threshold
6            cleanup   8BA6B4AA-C1E1-4427-B1A5-8A41EFA17FB2  0         0         0           0                4320       5000


Do we require the capture job to be enabled? We have jobs enabled for
transactional replication and CDC.
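For reference, CDC enablement can be confirmed with the standard SQL Server
catalog views and CDC procedures (a minimal sketch; it assumes the default
capture-instance naming, so the change tables live under the cdc schema):

```sql
-- Is CDC enabled at the database level?
SELECT name, is_cdc_enabled FROM sys.databases WHERE name = DB_NAME();

-- Which tables are actually tracked by CDC?
SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables t
JOIN sys.schemas s ON t.schema_id = s.schema_id
WHERE t.is_tracked_by_cdc = 1;

-- List the capture instances; each one backs a cdc.<capture_instance>_CT change table
EXEC sys.sp_cdc_help_change_data_capture;
```

If the table you expect is missing from the second query, it was never enabled
with sys.sp_cdc_enable_table, which would explain the empty _CT table.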
> size_t HeapBaseMinAddress = 2147483648
> size_t ZMarkStacksMax = 8589934592
> retries = 2147483647
> delivery.timeout.ms = 2147483647
> enable.idempotence = false
> interceptor.classes = []
> key.serializer = class
> org.apache.kafka.common.serialization.ByteArraySerializer
> linger.ms = 0
> max.block.ms = 60000
> max.in.flight.requests.per.connection = 1
> max.request.size = 1048576
> metadata.max.age.ms = 300000
> metadata.max.idle.ms = 300000
> metric.reporters = []
> metrics.num.samples = 2
> metrics.recording.level = INFO
> metrics.sample.window.ms = 30000
> partitioner.adaptive.partitioning.enable = true
> partitioner.availability.timeout.ms = 0
> partitioner.class = null
> partitioner.ignore.keys = false
> receive.buffer.bytes = 32768
> reconnect.backoff.max.ms = 1000
> reconnect.backoff.ms = 50
> request.timeout.ms = 20000
> retries = 2147483647
> delivery.timeout.ms = 2147483647
> enable.idempotence = false
> interceptor.classes = []
> key.serializer = class
> org.apache.kafka.common.serialization.StringSerializer
> linger.ms = 0
> max.block.ms = 60000
> max.in.flight.requests.per.connection = 1
> max.request.size = 1048576
> metadata.max.age.ms = 300000
> metadata.max.idle.ms = 300000
> metric.reporters = []
> metrics.num.samples = 2
> metrics.recording.level = INFO
> metrics.sample.window.ms = 30000
> partitioner.adaptive.partitioning.enable = true
> partitioner.availability.timeout.ms = 0
> partitioner.class = null
> partitioner.ignore.keys = false
> receive.buffer.bytes = 32768
> reconnect.backoff.max.ms = 1000
> reconnect.backoff.ms = 50
> request.timeout.ms = 20000
> retries = 2147483647
> 2147483645 rack:
> delivery.timeout.ms = 2147483647
> enable.idempotence = false
> interceptor.classes = []
> key.serializer = class
> org.apache.kafka.common.serialization.ByteArraySerializer
> linger.ms = 0
> max.block.ms = 9223372036854775807
> max.in.flight.requests.per.connection = 1
> max.request.size = 1048576
> metadata.max.age.ms = 300000
> metadata.max.idle.ms = 300000
> metric.reporters = []
> metrics.num.samples = 2
> metrics.recording.level = INFO
> metrics.sample.window.ms = 30000
> partitioner.adaptive.partitioning.enable = true
> partitioner.availability.timeout.ms = 0
> partitioner.class = null
> partitioner.ignore.keys = false
> receive.buffer.bytes = 32768
> reconnect.backoff.max.ms = 1000
> reconnect.backoff.ms = 50
> request.timeout.ms = 30000
> retries = 2147483647
> [INFO] 2023-10-10 10:40:13,424
> [task-thread-cdna-sql-party-source-connector-0]
> io.debezium.connector.common.BaseSourceTask lambda$start$0 -
> schema.history.internal.kafka.bootstrap.servers =
> 2147483647 rack:
> 2147483647 rack:

jiri.p...@gmail.com

unread,
Oct 11, 2023, 8:18:36 AM10/11/23
to debezium
Yes, the capture job must be running.

J.
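
For reference, on SQL Server the CDC capture job is a SQL Server Agent job, and Debezium only sees new changes while it is running. One way to check and start it is sketched below; the database name `MyDatabase` is illustrative, and the capture job follows the `cdc.<database>_capture` naming pattern:

```sql
USE MyDatabase;
GO
-- List the CDC capture and cleanup jobs configured for this database
EXEC sys.sp_cdc_help_jobs;
GO
-- Start the capture job if it is currently stopped
EXEC msdb.dbo.sp_start_job @job_name = N'cdc.MyDatabase_capture';
GO
```

If `sys.sp_cdc_help_jobs` returns no capture job at all, it can be (re)created with `EXEC sys.sp_cdc_add_job @job_type = N'capture';`.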
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > retries = 2147483647
> > delivery.timeout.ms = 2147483647
> > enable.idempotence = false
> > interceptor.classes = []
> > key.serializer = class
> > org.apache.kafka.common.serialization.ByteArraySerializer
> > linger.ms = 0
> > max.block.ms = 60000
> > max.in.flight.requests.per.connection = 1
> > max.request.size = 1048576
> > metadata.max.age.ms = 300000
> > metadata.max.idle.ms = 300000
> > metric.reporters = []
> > metrics.num.samples = 2
> > metrics.recording.level = INFO
> > metrics.sample.window.ms = 30000
> > partitioner.adaptive.partitioning.enable = true
> > partitioner.availability.timeout.ms = 0
> > partitioner.class = null
> > partitioner.ignore.keys = false
> > receive.buffer.bytes = 32768
> > reconnect.backoff.max.ms = 1000
> > reconnect.backoff.ms = 50
> > request.timeout.ms = 20000
> > retries = 2147483647
> > delivery.timeout.ms = 2147483647
> > enable.idempotence = false
> > interceptor.classes = []
> > key.serializer = class
> > org.apache.kafka.common.serialization.StringSerializer
> > linger.ms = 0
> > max.block.ms = 60000
> > max.in.flight.requests.per.connection = 1
> > max.request.size = 1048576
> > metadata.max.age.ms = 300000
> > metadata.max.idle.ms = 300000
> > metric.reporters = []
> > metrics.num.samples = 2
> > metrics.recording.level = INFO
> > metrics.sample.window.ms = 30000
> > partitioner.adaptive.partitioning.enable = true
> > partitioner.availability.timeout.ms = 0
> > partitioner.class = null
> > partitioner.ignore.keys = false
> > receive.buffer.bytes = 32768
> > reconnect.backoff.max.ms = 1000
> > reconnect.backoff.ms = 50
> > request.timeout.ms = 20000
> > retries = 2147483647
> > delivery.timeout.ms = 2147483647
> > enable.idempotence = false
> > interceptor.classes = []
> > key.serializer = class
> > org.apache.kafka.common.serialization.ByteArraySerializer
> > linger.ms = 0
> > max.block.ms = 9223372036854775807
> > max.in.flight.requests.per.connection = 1
> > max.request.size = 1048576
> > metadata.max.age.ms = 300000
> > metadata.max.idle.ms = 300000
> > metric.reporters = []
> > metrics.num.samples = 2
> > metrics.recording.level = INFO
> > metrics.sample.window.ms = 30000
> > partitioner.adaptive.partitioning.enable = true
> > partitioner.availability.timeout.ms = 0
> > partitioner.class = null
> > partitioner.ignore.keys = false
> > receive.buffer.bytes = 32768
> > reconnect.backoff.max.ms = 1000
> > reconnect.backoff.ms = 50
> > request.timeout.ms = 30000
> > retries = 2147483647

Aqhil Mohammad

unread,
Oct 13, 2023, 12:45:58 PM10/13/23
to debezium
Hello,

After enabling the capture job, it started capturing events.
> > > b2.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
> > > security.protocol=SSL, enable.auto.commit=false,
> > > ssl.truststore.location=/mnt/sslcerts/truststore.p12,
> > > ssl.keystore.password=********, ssl.key.password=********,
> > > fetch.min.bytes=1, ssl.truststore.password=********,
> > > session.timeout.ms=10000, auto.offset.reset=earliest,
> > > client.id=stc-con-grp-1-schemahistory}
> > > [INFO] 2023-10-10 10:40:13,587
> > > [task-thread-cdna-sql-party-source-connector-0]
> > > io.debezium.storage.kafka.history.KafkaSchemaHistory configure -
> > > KafkaSchemaHistory Producer config: {acks=1, batch.size=32768,
> > > ssl.keystore.location=/mnt/sslcerts/keystore.p12,
> > > bootstrap.servers=
> b0.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,
> > > b1.dev-k4a.recp-da-kafka-dev.shared.banksvcs.net:9093,