Switched to StringSerializer, but it's still failing with the same error.
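For context, this is roughly how the serializers are set (a sketch, not my literal config; the `debezium.sink.kafka.producer.*` properties are passed straight through to the Kafka producer, and broker/security settings are omitted):

```
debezium.sink.type=kafka
debezium.sink.kafka.producer.bootstrap.servers=<broker list omitted>
debezium.sink.kafka.producer.key.serializer=org.apache.kafka.common.serialization.StringSerializer
debezium.sink.kafka.producer.value.serializer=org.apache.kafka.common.serialization.StringSerializer
```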
Unfortunately my employer will not allow me to just post the entire log. Here are the snippets I think are most relevant.
This appears to show I'm logged into Kafka:
krb5loginmodule: principal is [principal]@[domain]
krb5loginmodule: Will use keytab
krb5loginmodule: Commit Succeeded
2024-11-07 13:39:30,941 INFO [org.apa.kaf.com.sec.aut.AbstractLogin] (main) Successfully logged in.
2024-11-07 13:39:30,943 INFO [org.apa.kaf.com.sec.ker.KerberosLogin] (kafka-kerberos-refresh-thread-[principal]@[domain]) [Principal=[principal]@[domain]]: TGT refresh thread started.
2024-11-07 13:39:30,949 INFO [org.apa.kaf.com.sec.ker.KerberosLogin] (kafka-kerberos-refresh-thread-[principal]@[domain]) [Principal=[principal]@[domain]]: TGT valid starting at: Thu Nov 07 13:39:30 CST 2024
2024-11-07 13:39:30,949 INFO [org.apa.kaf.com.sec.ker.KerberosLogin] (kafka-kerberos-refresh-thread-[principal]@[domain]) [Principal=[principal]@[domain]]: TGT expires: Thu Nov 07 23:39:30 CST 2024
2024-11-07 13:39:30,949 INFO [org.apa.kaf.com.sec.ker.KerberosLogin] (kafka-kerberos-refresh-thread-[principal]@[domain]) [Principal=[principal]@[domain]]: TGT refresh sleeping until: Thu Nov 07 21:49:31 CST 2024
2024-11-07 13:39:30,977 INFO [org.apa.kaf.com.uti.AppInfoParser] (main) Kafka version: 3.8.0
2024-11-07 13:39:30,978 INFO [org.apa.kaf.com.uti.AppInfoParser] (main) Kafka commitId: 771b9576b00ecf5b
2024-11-07 13:39:30,978 INFO [org.apa.kaf.com.uti.AppInfoParser] (main) Kafka startTimeMs: 1731008370976
2024-11-07 13:39:30,980 INFO [io.deb.ser.kaf.KafkaChangeConsumer] (main) consumer started...
2024-11-07 13:39:30,980 INFO [io.deb.ser.DebeziumServer] (main) Consumer 'io.debezium.server.kafka.KafkaChangeConsumer' instantiated
Then everything seems OK until this single error pops up right before the SQL Server connection; the schema sync process then proceeds anyway. This is the only indicator of an error until the end of Step 7 in the schema sync process later:
2024-11-07 13:39:32,352 INFO [org.apa.kaf.cli.pro.int.TransactionManager] (kafka-producer-network-thread | producer-1) [Producer clientId=producer-1] Transiting to fatal error state due to org.apache.kafka.common.KafkaException: Unexpected error in InitProducerIdResponse; The server experienced an unexpected error when processing the request.
2024-11-07 13:39:32,496 INFO [io.deb.jdb.JdbcConnection] (pool-12-thread-1) Connection gracefully closed
2024-11-07 13:39:32,508 INFO [io.deb.emb.asy.AsyncEmbeddedEngine] (pool-7-thread-1) Engine state has changed from 'CREATING_TASKS' to 'STARTING_TASKS'
2024-11-07 13:39:32,510 INFO [io.deb.emb.asy.AsyncEmbeddedEngine] (pool-7-thread-1) Waiting max. for 180000 ms for individual source tasks to start.
2024-11-07 13:39:32,511 INFO [io.deb.con.com.BaseSourceTask] (pool-8-thread-1) Starting SqlServerConnectorTask with configuration:
[many more lines showing successful extraction of the DDL for the single table I've configured in my table list]
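On the InitProducerId error above: my understanding (an assumption on my part) is that this request comes from the client's idempotence support, which is enabled by default in Kafka 3.x producers and can require extra broker-side permissions (e.g. IdempotentWrite on the cluster) depending on broker version and ACLs. As a diagnostic only, it should be possible to turn it off through the same pass-through producer config:

```
# Diagnostic sketch only - enable.idempotence is a standard Kafka producer
# property; the debezium.sink.kafka.producer.* prefix is the pass-through
debezium.sink.kafka.producer.enable.idempotence=false
```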
Here is where it fails after Step 7 of schema sync:
2024-11-07 13:39:34,059 INFO [io.deb.rel.RelationalSnapshotChangeEventSource] (pool-13-thread-1) Exporting data from table '[DB].[schema].[Table]' (1 of 1 tables)
2024-11-07 13:39:34,816 WARN [org.apa.kaf.cli.NetworkClient] (kafka-producer-network-thread | producer-1) [Producer clientId=producer-1] Error while fetching metadata with correlation id 6 : {{Topic Prefix Only}=TOPIC_AUTHORIZATION_FAILED}
2024-11-07 13:39:34,818 ERROR [org.apa.kaf.cli.Metadata] (kafka-producer-network-thread | producer-1) [Producer clientId=producer-1] Topic authorization failed for topics [{Topic Prefix Only}]
2024-11-07 13:39:34,819 ERROR [io.deb.emb.asy.AsyncEmbeddedEngine] (pool-7-thread-1) Engine has failed with : java.util.concurrent.ExecutionException: io.debezium.DebeziumException: java.lang.ClassCastException: class java.lang.String cannot be cast to class [B (java.lang.String and [B are in module java.base of loader 'bootstrap')
The odd part is that the topic name in the next-to-last entry is only the prefix. I substituted {Topic Prefix Only} for the actual prefix in the log entry above to anonymize it.
The name was not the full PREFIX-SCHEMA-TableName that I expected. Is it really trying to use a topic that does not exist?
Also curious: even though it fails with this error and later states the schema sync failed, it does update the topic for the table it is supposed to replicate with the latest schema:

PREFIX-DBName-SchemaName-TableName

In addition, there was a topic named solely with the prefix.
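For concreteness, my reading of the Debezium SQL Server connector docs (an assumption on my part; the placeholder names below are not my real ones, and the docs show dots as separators where I've written dashes above) is that names are composed like this:

```java
public class ExpectedTopics {
    // Per my reading of the Debezium SQL Server docs (assumption):
    //   data change topic:   <topic.prefix>.<database>.<schema>.<table>
    //   schema change topic: <topic.prefix> alone - which would match the
    //   prefix-only name in the TOPIC_AUTHORIZATION_FAILED entry above
    static String dataTopic(String prefix, String db, String schema, String table) {
        return String.join(".", prefix, db, schema, table);
    }

    public static void main(String[] args) {
        System.out.println(dataTopic("PREFIX", "DBName", "SchemaName", "TableName"));
        // PREFIX.DBName.SchemaName.TableName
        System.out.println("PREFIX"); // schema change topic: the prefix alone
    }
}
```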
So I had these exact same topics created in Kafka, even preserving the case in the names, since my DB and schema names contain lower-case letters. Debug logging also confirms that my user is authenticated and connected to the Kafka cluster:
2024-11-13 15:20:07,866 INFO [org.apa.kaf.com.sec.aut.AbstractLogin] (main) Successfully logged in.
Now I'm back to the conversion errors whether I use StringSerializer or ByteArraySerializer. I also tried all the other available serializer types, to no avail.
With StringSerializer, the final error occurs at the very end of the log, even after the engine appears to have shut down:
2024-11-13 15:11:32,758 ERROR [org.apa.kaf.cli.pro.int.ProducerBatch] (kafka-producer-network-thread | producer-1) Error executing user-provided callback on message for topic-partition 'MFG-14986-FLEXNET-DEV-0': java.lang.ClassCastException: class java.lang.String cannot be cast to class [B (java.lang.String and [B are in module java.base of loader 'bootstrap')
With ByteArraySerializer, the error happens immediately after everything needed to create the initial schema definition message has finished. The error itself is also slightly different:
2024-11-13 15:20:11,642 ERROR [io.deb.emb.asy.AsyncEmbeddedEngine] (pool-7-thread-1) Engine has failed with : java.util.concurrent.ExecutionException: io.debezium.DebeziumException: org.apache.kafka.common.errors.SerializationException: Can't convert key of class java.lang.String to class org.apache.kafka.common.serialization.ByteArraySerializer specified in key.serializer
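Both errors look like the same type conflict surfacing in different places (`[B` is the JVM's internal name for `byte[]`): somewhere a String key meets a code path typed for `byte[]`. A minimal sketch with no Kafka dependency, assuming the sink hands the producer a String key:

```java
public class SerializerMismatchDemo {
    public static void main(String[] args) {
        Object key = "my-record-key"; // a String key, as the errors suggest Debezium produces

        try {
            // Mimics any byte[]-typed path receiving the key, e.g. a callback
            // or serializer that expects raw bytes:
            byte[] raw = (byte[]) key;
            System.out.println(raw.length);
        } catch (ClassCastException e) {
            // Same shape as the log: class java.lang.String cannot be cast to class [B
            System.out.println(e.getMessage());
        }
    }
}
```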