Hello Debezium Team,
I am currently implementing CDC from Oracle to Kafka using the Debezium Oracle connector and encountered an issue related to XMLTYPE columns stored as CLOB.
Environment
Oracle Database: 12c
Connector Mode: LogMiner
Kafka: 3.9.1
Source Application: Temenos T24 Core Banking (TAFJ schema)
The table involved contains an XML payload column:
Schema: TAFJ
Table: FBNK_ACCOUNT
Column: XMLRECORD (XMLTYPE stored as CLOB)
The connector starts successfully.
The initial snapshot completes successfully, and records are captured correctly from the Oracle database.
However, when new records are inserted or existing records are updated, the connector task fails during streaming.
The failure occurs when Debezium processes redo logs containing updates to the XMLTYPE column.
Error
The connector fails with the following error:
org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer.
Caused by: io.debezium.text.ParsingException: Parsing failed for SQL: update "TAFJ"."FBNK_ACCOUNT" a set a."XMLRECORD" = XMLType(:1) where a."RECID" = '70000000001';
Caused by: java.lang.IllegalStateException: Failed to locate preamble: XML DOC BEGIN:
Important Observation
Snapshot ingestion works correctly.
Streaming CDC fails when updates occur on XMLTYPE columns.
This suggests the issue is specifically related to LogMiner parsing of XMLTYPE redo entries during streaming.
Connector Configuration (Key Parameters)
connector.class=io.debezium.connector.oracle.OracleConnector
database.connection.adapter=logminer
lob.enabled=true
table.include.list=TAFJ.FBNK_ACCOUNT
column.include.list=TAFJ.FBNK_ACCOUNT.RECID,TAFJ.FBNK_ACCOUNT.XMLRECORD
snapshot.mode=initial
snapshot.locking.mode=none
Versions Tested
To rule out version-specific issues, I tested multiple Debezium connector versions:
debezium-connector-oracle-2.5.4.Final-plugin
debezium-connector-oracle-2.6.2.Final-plugin
debezium-connector-oracle-2.7.4.Final-plugin
debezium-connector-oracle-3.4.0.Final-plugin
debezium-connector-oracle-3.4.1.Final-plugin
The same behavior occurs across all these versions.
Oracle Configuration
The database has been configured according to the Debezium documentation:
ARCHIVELOG mode enabled
Supplemental logging enabled (including table-level logging)
Required LogMiner privileges granted to the Debezium user
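For reference, the database preparation roughly corresponds to the standard commands from the Debezium Oracle setup documentation (a sketch; the exact grant list for the Debezium user is abbreviated here):

```sql
-- Enable ARCHIVELOG mode (requires restarting the instance into MOUNT state).
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE ARCHIVELOG;
ALTER DATABASE OPEN;

-- Minimal database-level supplemental logging, plus all-column
-- logging on the captured table.
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
ALTER TABLE TAFJ.FBNK_ACCOUNT ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
```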
According to Debezium documentation and previous discussions, support for LOB/XMLTYPE handling has been improved. However, the issue still occurs during streaming updates.
Since Temenos T24 stores core business records in XMLTYPE fields, this scenario is common in banking environments.
Could you please advise:
Whether this is a known limitation or bug with XMLTYPE handling in LogMiner.
If there are recommended configuration changes or workarounds.
Whether using XStream instead of LogMiner would handle this case more reliably.
A detailed file containing the connector status, configuration, and error trace is attached for reference.
Thank you for your support.
Best regards
Zelalem Solomon
Hi Chris,
Thank you for the follow-up. Below are the requested details to help reproduce the issue involving XMLTYPE handling during CDC streaming.
Debezium Connector Configuration
Connector name: oracle-debezium-account
{
  "name": "oracle-debezium-account",
  "connector.class": "io.debezium.connector.oracle.OracleConnector",
  "tasks.max": "1",
  "database.hostname": "",
  "database.port": "1521",
  "database.user": "",
  "database.password": "",
  "database.dbname": "",
  "database.connection.adapter": "logminer",
  "snapshot.mode": "initial",
  "snapshot.locking.mode": "none",
  "snapshot.fetch.size": "1000",
  "snapshot.max.threads": "1",
  "lob.enabled": "true",
  "table.include.list": "TAFJ.FBNK_ACCOUNT",
  "column.include.list": "TAFJ.FBNK_ACCOUNT.RECID,TAFJ.FBNK_ACCOUNT.XMLRECORD",
  "topic.prefix": "",
  "schema.history.internal.kafka.bootstrap.servers": "",
  "schema.history.internal.kafka.topic": "schema-changes.oracle-ac",
  "schema.history.internal.store.only.captured.tables.ddl": "true",
  "log.mining.buffer.size": "10485760",
  "errors.tolerance": "none"
}
Connector plugin version:
Debezium Oracle Connector 3.4.0.Final
Kafka Connect version:
Apache Kafka 3.9.1
Table DDL
Schema: TAFJ
Table: FBNK_ACCOUNT
CREATE TABLE "TAFJ"."FBNK_ACCOUNT"
(
    "RECID" VARCHAR2(255) NOT NULL,
    "XMLRECORD" SYS.XMLTYPE,
    PRIMARY KEY ("RECID")
)
XMLTYPE COLUMN "XMLRECORD" STORE AS SECUREFILE CLOB
(
    ENABLE STORAGE IN ROW
)
PARTITION BY HASH ("RECID");
Supplemental logging is enabled:
SUPPLEMENTAL LOG DATA (ALL) COLUMNS
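As a quick sanity check, this state can be verified through the standard data-dictionary views (a sketch):

```sql
-- Database-level supplemental logging flags.
SELECT SUPPLEMENTAL_LOG_DATA_MIN, SUPPLEMENTAL_LOG_DATA_ALL
  FROM V$DATABASE;

-- Table-level log groups for the captured table
-- (expecting LOG_GROUP_TYPE = 'ALL COLUMN LOGGING').
SELECT LOG_GROUP_NAME, LOG_GROUP_TYPE
  FROM DBA_LOG_GROUPS
 WHERE OWNER = 'TAFJ' AND TABLE_NAME = 'FBNK_ACCOUNT';
```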
SQL Statements Executed by the Source Application
The following SQL statements were captured from the Oracle shared SQL area during normal transactions.
Insert:
INSERT INTO FBNK_ACCOUNT(XMLRECORD, RECID)
VALUES (:1, :2)
Update:
UPDATE FBNK_ACCOUNT
SET XMLRECORD = :1
WHERE RECID = :2
Delete (during authorization workflow):
DELETE FROM FBNK_ACCOUNT#NAU
WHERE RECID IN (:1)
Example RECID observed:
7000001
The XML payload is passed as a bind variable (:1) and stored in the XMLRECORD XMLTYPE column.
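For a minimal reproduction outside the application (a sketch; the RECID and XML payload below are placeholder values), the failure can be triggered with a direct update to the captured table:

```sql
-- Any well-formed XML payload reproduces the failing redo pattern:
-- LogMiner later reconstructs this UPDATE with an explicit
-- XMLType(:1) constructor, which is what the parser rejects.
UPDATE TAFJ.FBNK_ACCOUNT
   SET XMLRECORD = XMLType('<row><name>Test</name></row>')
 WHERE RECID = '7000001';
COMMIT;
```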
Observed LogMiner Redo SQL (Failure Case)
During streaming, Debezium processes the following SQL reconstructed from redo logs:
update "TAFJ"."FBNK_ACCOUNT" a
set a."XMLRECORD" = XMLType(:1)
where a."RECID" = '7000001';
Connector failure:
io.debezium.text.ParsingException: Parsing failed for SQL
Failed to locate preamble: XML DOC BEGIN
The snapshot phase completes successfully and records are captured correctly.
The failure occurs only during streaming when updates to the XMLTYPE column are processed from LogMiner redo logs.
Please let me know if additional information is required.
Best regards
Zelalem Solomon
--
You received this message because you are subscribed to the Google Groups "debezium" group.
To unsubscribe from this group and stop receiving emails from it, send an email to debezium+u...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/debezium/8cb934a0-0b63-4188-8594-29519f53a174n%40googlegroups.com.
Hi Chris,
Thank you for the quick turnaround and for prioritizing this issue.
I have tested the fix using the latest Debezium 3.4.3-SNAPSHOT (Oracle Connector), which I built from source; unfortunately, I am still encountering a ParsingException during the streaming phase.
Environment Details:
Debezium Version: 3.4.3-SNAPSHOT (verified via /connector-plugins endpoint)
Kafka Connect: 3.9.1
Oracle Version: 12c (On-prem)
Storage: XMLTYPE stored as SECUREFILE CLOB
Observed Error: The connector fails as soon as an update is processed on the FBNK_ACCOUNT table. The stack trace indicates that the parser is still looking for a preamble that it cannot find.
SQL captured from V$SQL during the failure:
UPDATE FBNK_ACCOUNT SET XMLRECORD = :1 WHERE RECID = :2
However, LogMiner reconstructs it in the redo logs with an alias and explicit XMLType constructor:
update "R20MASTER"."FBNK_ACCOUNT" a set a."XMLRECORD" = XMLType(:1) where a."RECID" = '7000050346124';
It appears that AbstractSelectSingleColumnSqlRedoPreambleParser is still expecting the "Binary" style preamble (XML DOC BEGIN:) and isn't successfully pivoting to the new CLOB-based logic for this specific SQL string.
I have verified that all jars in my plugin directory are updated to the 3.4.3-SNAPSHOT version. Please let me know if there are specific log categories I should enable or if more details on the redo log contents are needed.
I have attached the errors and configuration details and Oracle database SQL executions.
Best regards,
Zelalem
Full stack trace from the failure:
Caused by: io.debezium.text.ParsingException: Parsing failed for SQL: 'update "R20MASTER"."FBNK_ACCOUNT" a set a."XMLRECORD" = XMLType(:1) where a."RECID" = '7000050346124';'
    at io.debezium.connector.oracle.logminer.parser.AbstractSingleColumnSqlRedoPreambleParser.parse(AbstractSingleColumnSqlRedoPreambleParser.java:70)
    ...
Caused by: java.lang.IllegalStateException: Failed to locate preamble: XML DOC BEGIN:
    at io.debezium.connector.oracle.logminer.parser.AbstractSelectSingleColumnSqlRedoPreambleParser.parseInternal(AbstractSelectSingleColumnSqlRedoPreambleParser.java:33)
To view this discussion visit https://groups.google.com/d/msgid/debezium/4a2a1ef5-939e-48c7-bc3d-be7dc0ceb52an%40googlegroups.com.
Hi Chris,
Thank you again for the quick turnaround and for pointing me to the correct PR.
I’ve now rebuilt the connector using the PR branch (pull/7189, locally checked out as pr-7189) and performed a preliminary test on our environment (Oracle 12c with XMLTYPE stored as SECUREFILE CLOB, LogMiner-based CDC).
I’m happy to report that the initial results are positive — the connector is now able to process the update events without encountering the previous ParsingException, and the XML data appears to be handled correctly.
This is very encouraging and directly addresses our use case.
I’ll continue with more thorough and extended testing (including different workloads and edge cases) and will share any further findings if I encounter issues.
Thanks again for your support and for the excellent work on this fix — much appreciated.
Best regards,
Zelalem