Oracle XMLTYPE (CLOB) streaming failure with LogMiner – snapshot works but CDC update fails


Zelalem Solomon

Mar 16, 2026, 3:28:31 AM
to debezium

Hello Debezium Team,

I am currently implementing CDC from Oracle to Kafka using the Debezium Oracle connector and encountered an issue related to XMLTYPE columns stored as CLOB.

Environment
  • Oracle Database: 12c

  • Connector Mode: LogMiner

  • Kafka: 3.9.1

  • Source Application: Temenos T24 Core Banking (TAFJ schema)

The table involved contains an XML payload column:

Schema: TAFJ
Table: FBNK_ACCOUNT
Column: XMLRECORD (XMLTYPE stored as CLOB)

Observed Behavior
  1. The connector starts successfully.

  2. The initial snapshot completes successfully, and records are captured correctly from the Oracle database.

  3. However, when new records are inserted or existing records are updated, the connector task fails during streaming.

The failure occurs when Debezium processes redo logs containing updates to the XMLTYPE column.

Error

The connector fails with the following error:

org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer.
Caused by: io.debezium.text.ParsingException: Parsing failed for SQL: update "TAFJ"."FBNK_ACCOUNT" a set a."XMLRECORD" = XMLType(:1) where a."RECID" = '70000000001';
Caused by: java.lang.IllegalStateException: Failed to locate preamble: XML DOC BEGIN:

Important Observation
  • Snapshot ingestion works correctly.

  • Streaming CDC fails when updates occur on XMLTYPE columns.

This suggests the issue is specifically related to LogMiner parsing of XMLTYPE redo entries during streaming.

Connector Configuration (Key Parameters)

connector.class=io.debezium.connector.oracle.OracleConnector
database.connection.adapter=logminer
lob.enabled=true
table.include.list=TAFJ.FBNK_ACCOUNT
column.include.list=TAFJ.FBNK_ACCOUNT.RECID,TAFJ.FBNK_ACCOUNT.XMLRECORD
snapshot.mode=initial
snapshot.locking.mode=none

Versions Tested

To rule out version-specific issues, I tested multiple Debezium connector versions:

  • debezium-connector-oracle-2.5.4.Final-plugin

  • debezium-connector-oracle-2.6.2.Final-plugin

  • debezium-connector-oracle-2.7.4.Final-plugin

  • debezium-connector-oracle-3.4.0.Final-plugin

  • debezium-connector-oracle-3.4.1.Final-plugin

The same behavior occurs across all these versions.

Oracle Configuration

The database has been configured according to Debezium documentation:

  • ARCHIVELOG mode enabled

  • Supplemental logging enabled (including table-level logging)

  • Required LogMiner privileges granted to the Debezium user
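For completeness, the setup above corresponds roughly to the following statements (a sketch based on the Debezium Oracle connector documentation; the user name c##dbzuser is a placeholder, and the exact grant list varies by Oracle version):

```sql
-- Database-level supplemental logging (minimal), plus table-level all-column logging
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
ALTER TABLE TAFJ.FBNK_ACCOUNT ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;

-- Typical LogMiner-related grants for the connector user (Oracle 12c+), among others
GRANT LOGMINING TO c##dbzuser;
GRANT EXECUTE ON DBMS_LOGMNR TO c##dbzuser;
GRANT SELECT ON V_$LOGMNR_CONTENTS TO c##dbzuser;
```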

Request

According to Debezium documentation and previous discussions, support for LOB/XMLTYPE handling has been improved. However, the issue still occurs during streaming updates.

Since Temenos T24 stores core business records in XMLTYPE fields, this scenario is common in banking environments.

Could you please advise:

  1. Whether this is a known limitation or bug with XMLTYPE handling in LogMiner.

  2. If there are recommended configuration changes or workarounds.

  3. Whether using XStream instead of LogMiner would handle this case more reliably.

A detailed file containing the connector status, configuration, and error trace is attached for reference.

Thank you for your support.

Best regards

Zelalem Solomon


Debezium Error.txt

Chris Cranford

Mar 16, 2026, 7:40:59 AM
to debezium
Hi -

Can you provide your full connector configuration, the full DDL for the table FBNK_ACCOUNT, and details on exactly how the XMLRECORD column is inserted or updated in the table? We need to be able to reproduce this in the tutorial, and that requires this specific information.

Thanks
-cc

Zelalem Solomon

Mar 16, 2026, 11:35:44 AM
to debezium

Hi Chris,

Thank you for the follow-up. Below are the requested details to help reproduce the issue involving XMLTYPE handling during CDC streaming.


  1. Debezium Connector Configuration


Connector name: oracle-debezium-account

{
"name": "oracle-debezium-account",
"connector.class": "io.debezium.connector.oracle.OracleConnector",
"tasks.max": "1",
"database.hostname": "",
"database.port": "1521",
"database.user": "",
"database.password": "",
"database.dbname": "",
"database.connection.adapter": "logminer",
"snapshot.mode": "initial",
"snapshot.locking.mode": "none",
"snapshot.fetch.size": "1000",
"snapshot.max.threads": "1",
"lob.enabled": "true",
"table.include.list": "TAFJ.FBNK_ACCOUNT",
"column.include.list": "TAFJ.FBNK_ACCOUNT.RECID,TAFJ.FBNK_ACCOUNT.XMLRECORD",
"topic.prefix": "",
"schema.history.internal.kafka.bootstrap.servers": "",
"schema.history.internal.kafka.topic": "schema-changes.oracle-ac",
"schema.history.internal.store.only.captured.tables.ddl": "true",
"log.mining.buffer.size": "10485760",
"errors.tolerance": "none"
}

Connector plugin version:
Debezium Oracle Connector 3.4.0.Final

Kafka Connect version:
Apache Kafka 3.9.1


  2. Table DDL


Schema: TAFJ
Table: FBNK_ACCOUNT

CREATE TABLE "TAFJ"."FBNK_ACCOUNT"
(
"RECID" VARCHAR2(255) NOT NULL,
"XMLRECORD" SYS.XMLTYPE,
PRIMARY KEY ("RECID")
)
XMLTYPE COLUMN "XMLRECORD" STORE AS SECUREFILE CLOB
(
ENABLE STORAGE IN ROW
)
PARTITION BY HASH ("RECID");

Supplemental logging is enabled:

SUPPLEMENTAL LOG DATA (ALL) COLUMNS


  3. SQL Statements Executed by the Source Application


The following SQL statements were captured from the Oracle shared SQL area during normal transactions.
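The capture itself can be done with a query along these lines (illustrative; requires SELECT access to V$SQL, and the LIKE filter is just one way to narrow the results):

```sql
-- Inspect the shared SQL area for statements touching the table of interest
SELECT sql_id, sql_text
  FROM v$sql
 WHERE sql_text LIKE '%FBNK_ACCOUNT%';
```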

Insert:

INSERT INTO FBNK_ACCOUNT(XMLRECORD, RECID)
VALUES (:1, :2)

Update:

UPDATE FBNK_ACCOUNT
SET XMLRECORD = :1
WHERE RECID = :2

Delete (during authorization workflow):

DELETE FROM FBNK_ACCOUNT#NAU
WHERE RECID IN (:1)

Example RECID observed:

7000001

The XML payload is passed as a bind variable (:1) and stored in the XMLRECORD XMLTYPE column.


  4. Observed LogMiner Redo SQL (Failure Case)


During streaming, Debezium processes the following SQL reconstructed from redo logs:

update "TAFJ"."FBNK_ACCOUNT" a
set a."XMLRECORD" = XMLType(:1)
where a."RECID" = '7000001';

Connector failure:

io.debezium.text.ParsingException: Parsing failed for SQL

Failed to locate preamble: XML DOC BEGIN


The snapshot phase completes successfully and records are captured correctly.

The failure occurs only during streaming when updates to the XMLTYPE column are processed from LogMiner redo logs.
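To make the mismatch concrete, the reconstructed redo SQL has a distinctive shape: LogMiner adds a table alias and wraps the bind in an explicit XMLType() constructor, unlike the SQL the application actually ran. The following is a small hypothetical Python sketch of that shape (this is not Debezium's parser; the regex and function names are purely illustrative):

```python
import re

# A rough sketch of the SQL shape LogMiner emits for an XMLTYPE update:
# the table gets an alias ("a") and the bind variable is wrapped in an
# explicit XMLType() constructor.
REDO_UPDATE = re.compile(
    r'update\s+"(?P<schema>\w+)"\."(?P<table>\w+)"\s+(?P<alias>\w+)\s+'
    r'set\s+(?P=alias)\."(?P<column>\w+)"\s*=\s*XMLType\(:\d+\)\s+'
    r'where\s+(?P=alias)\."RECID"\s*=\s*\'(?P<recid>\w+)\'',
    re.IGNORECASE,
)

def parse_redo_update(sql: str) -> dict:
    """Extract the pieces of a reconstructed XMLTYPE redo update, or raise."""
    m = REDO_UPDATE.search(sql)
    if m is None:
        raise ValueError("SQL does not match the XMLType redo-update shape")
    return m.groupdict()

sample = ('update "TAFJ"."FBNK_ACCOUNT" a set a."XMLRECORD" = XMLType(:1) '
          'where a."RECID" = \'7000001\';')
print(parse_redo_update(sample))
```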

Please let me know if additional information is required.

Best regards
Zelalem Solomon

Chris Cranford

Mar 17, 2026, 6:58:20 AM
to debe...@googlegroups.com
Hi -

Thanks for the steps, I've recorded them in the upstream GitHub issue [1]. We'll take a look and see what is needed to fix this. Right now the issue is you're storing the XML document as CLOB rather than Binary/BLOB, and that is why the ingestion fails, as the LogMiner event payloads are unexpected.
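For reference, binary XML storage for the same column would be declared roughly like this (illustrative only; in practice the schema is generated by the Temenos T24 application, so changing the storage model may not be an option for the reporter):

```sql
-- Hypothetical alternative storage model for the same column,
-- using binary XML rather than CLOB
CREATE TABLE "TAFJ"."FBNK_ACCOUNT"
(
  "RECID"     VARCHAR2(255) NOT NULL,
  "XMLRECORD" SYS.XMLTYPE,
  PRIMARY KEY ("RECID")
)
XMLTYPE COLUMN "XMLRECORD" STORE AS SECUREFILE BINARY XML;
```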

Thanks,
-cc

[1]: https://github.com/debezium/dbz/issues/1373

Zelalem Solomon

Mar 21, 2026, 6:48:30 AM
to debezium

Hi Chris,

Thank you for the quick response and for prioritizing this issue.

I have tested the fix using the latest Debezium 3.4.3-SNAPSHOT (Oracle connector), which I built from source, but unfortunately I am still encountering a ParsingException during the streaming phase.

Environment Details:

  • Debezium Version: 3.4.3-SNAPSHOT (verified via /connector-plugins endpoint)

  • Kafka Connect: 3.9.1

  • Oracle Version: 12c (On-prem)

  • Storage: XMLTYPE stored as SECUREFILE CLOB

Observed Error: The connector fails as soon as an update is processed on the FBNK_ACCOUNT table. The stack trace indicates that the parser is still looking for a preamble that it cannot find:

Caused by: io.debezium.text.ParsingException: Parsing failed for SQL: 'update "R20MASTER"."FBNK_ACCOUNT" a set a."XMLRECORD" = XMLType(:1) where a."RECID" = '7000050346124';'
    at io.debezium.connector.oracle.logminer.parser.AbstractSingleColumnSqlRedoPreambleParser.parse(AbstractSingleColumnSqlRedoPreambleParser.java:70)
    ...
Caused by: java.lang.IllegalStateException: Failed to locate preamble: XML DOC BEGIN:
    at io.debezium.connector.oracle.logminer.parser.AbstractSelectSingleColumnSqlRedoPreambleParser.parseInternal(AbstractSelectSingleColumnSqlRedoPreambleParser.java:33)

SQL captured from V$SQL during the failure. The application is executing the following:

UPDATE FBNK_ACCOUNT SET XMLRECORD = :1 WHERE RECID = :2

However, LogMiner reconstructs it in the redo logs with an alias and an explicit XMLType constructor:

update "R20MASTER"."FBNK_ACCOUNT" a set a."XMLRECORD" = XMLType(:1) where a."RECID" = '7000050346124';

It appears that AbstractSelectSingleColumnSqlRedoPreambleParser still expects the binary-style preamble (XML DOC BEGIN:) and does not fall back to the new CLOB-based logic for this particular SQL string.

I have verified that all JARs in my plugin directory are updated to the 3.4.3-SNAPSHOT version. Please let me know if there are specific log categories I should enable, or if you need more detail on the redo log contents.

I have attached the errors, the configuration details, and the SQL executed on the Oracle database.

Best regards,

 Zelalem

Debezium Errors and configs - 3.4.3-SNAPSHOT.txt

Chris Cranford

Mar 23, 2026, 6:38:13 AM
to debe...@googlegroups.com
Hi Zelalem, can you please provide the commit SHA you built the connector from?

It would also be useful if you supplied the full stack trace of the exception, not just the first few lines. You can obtain this from the connector log.

Thanks,
-cc


Chris Cranford

Mar 23, 2026, 6:42:21 AM
to debe...@googlegroups.com
Hi Zelalem, my apologies, I missed the attachment on the email.

I checked the stack trace in the attachment: you built from the existing upstream 3.4 branch, but the change you want to test is still in an open PR, which should be merged today.
The PR: https://github.com/debezium/debezium/pull/7189

So if you wanted to test this locally, you'd need to build from that PR.

Hope that helps.
-cc

Zelalem Solomon

Mar 23, 2026, 9:46:42 AM
to debezium

Hi Chris,

Thank you again for the quick turnaround and for pointing me to the correct PR.

I’ve now rebuilt the connector using the PR branch (pull/7189, locally checked out as pr-7189) and performed a preliminary test on our environment (Oracle 12c with XMLTYPE stored as SECUREFILE CLOB, LogMiner-based CDC).

I’m happy to report that the initial results are positive — the connector is now able to process the update events without encountering the previous ParsingException, and the XML data appears to be handled correctly.

This is very encouraging and directly addresses our use case.

I’ll continue with more thorough and extended testing (including different workloads and edge cases) and will share any further findings if I encounter issues.

Thanks again for your support and for the excellent work on this fix — much appreciated.

Best regards,
Zelalem
