[ORACLE] Adding column leads to "DML statement couldn't be parsed"


Johann Brandl

Jul 29, 2025, 6:43:34 AM
to debezium
Hi everyone,

I've got an issue with a table to which an additional column was added this morning.
After that, the following error occurred:
---------------------------------------------------------------------------------------------------------------------------------------------
org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.
	at io.debezium.pipeline.ErrorHandler.setProducerThrowable(ErrorHandler.java:67)
	at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:264)
	at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:62)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:324)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:203)
	at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:143)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: io.debezium.DebeziumException: Oracle failed to re-construct redo SQL 'update "IFADMIN"."UNITS" set "COL 2" = HEXTORAW('c102'), "COL 3" = HEXTORAW('787d071d083503'), "COL 4" = HEXTORAW('c102'), "COL 5" = HEXTORAW('3e6466'), "COL 6" = HEXTORAW('3031323032353036313631323236303930303030303133383530323938'), "COL 7" = NULL, "COL 8" = NULL, "COL 9" = HEXTORAW('c504400b475b'), "COL 10" = NULL, "COL 11" = HEXTORAW('c6025c1d074002'), "COL 12" = HEXTORAW('c6025e4a404502'), "COL 13" = HEXTORAW('c104'), "COL 14" = HEXTORAW('787d0610063b1c'), "COL 15" = NULL, "COL 16" = NULL, "COL 17" = HEXTORAW('c504400b475b'), "COL 18" = HEXTORAW('37'), "COL 19" = NULL, "COL 20" = HEXTORAW('c6025c1d074002'), "COL 21" = HEXTORAW('37')
where "COL 1" = HEXTORAW('c6025c1d074702') and "COL 2" = HEXTORAW('c102') and "COL 3" = HEXTORAW('787d061007052b') and "COL 4" = HEXTORAW('c102') and "COL 5" = HEXTORAW('3e6466') and "COL 6" = HEXTORAW('3031323032353036313631323236303930303030303133383530323938') and "COL 7" IS NULL and "COL 8" IS NULL and "COL 9" = HEXTORAW('c504400b475b') and "COL 10" IS NULL and "COL 11" = HEXTORAW('c6025c1d074002') and "COL 12" IS NULL and "COL 13" = HEXTORAW('c104') and "COL 14" = HEXTORAW('787d0610063b1c') and "COL 15" IS NULL and "COL 16" IS NULL and "COL 17" = HEXTORAW('c504400b475b') and "COL 18" = HEXTORAW('37') and "COL 19" IS NULL and "COL 20" = HEXTORAW('c6025c1d074002') and "COL 21" = HEXTORAW('37');'
	at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.handleDataEvent(AbstractLogMinerEventProcessor.java:1312)
	at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processRow(AbstractLogMinerEventProcessor.java:536)
	at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processResults(AbstractLogMinerEventProcessor.java:442)
	at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.process(AbstractLogMinerEventProcessor.java:290)
	at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:243)
	... 9 more
---------------------------------------------------------------------------------------------------------------------------------------------

After some reading, I decided to change log.mining.strategy from online_catalog to hybrid and set schema.name.adjustment.mode to avro.
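For reference, the changed properties in the connector configuration looked roughly like this (a sketch with all other properties omitted; schema.name.adjustment.mode is the property name as documented for the Debezium Oracle connector):

```json
{
  "log.mining.strategy": "hybrid",
  "schema.name.adjustment.mode": "avro"
}
```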

After some trial and error, CDC for this table is still not working.
Now I get the following errors:

---------------------------------------------------------------------------------------------------------------------------------------------

2025-07-29 09:30:14,822 ERROR [zollnergroup-mes-cons|task-0] Mining session stopped due to error. (io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource) [debezium-oracleconnector-zollnergroup.mes-change-event-source-coordinator]
io.debezium.connector.oracle.logminer.parser.DmlParserException: DML statement couldn't be parsed. Please open a Jira issue with the statement 'insert into "IFADMIN"."UNITS"("IDENT","DB_ID","TSTAMP","STATES_ID","STATES_ID_CTR","BARCODE","NAME","DESCRIPTION","ORDERS_ID","JOINED_SPLIT","PREDECESSOR_ID","SUCCESSOR_ID","UNIT_STATUS_ID","START_TIME","END_TIME","VIEW_GRP_ID","EXT_ID","SEQUENCE","STATUSCODES_ID","PERMANENT_PREDECESSOR_ID","PERMANENT_SEQUENCE","UUID") values ('19375667701','1',TO_DATE('2025-07-29 11:29:48', 'YYYY-MM-DD HH24:MI:SS'),'1','-1','01202507291365370000013886364',NULL,NULL,'367000290',NULL,NULL,NULL,'3',TO_DATE('2025-07-29 11:29:42', 'YYYY-MM-DD HH24:MI:SS'),NULL,NULL,'367000290',NULL,NULL,NULL,NULL,NULL);'.
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.parseDmlStatement(AbstractLogMinerEventProcessor.java:1686)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.lambda$handleDataEvent$28(AbstractLogMinerEventProcessor.java:1353)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.addToTransaction(AbstractLogMinerEventProcessor.java:1564)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.handleDataEvent(AbstractLogMinerEventProcessor.java:1352)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processRow(AbstractLogMinerEventProcessor.java:536)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processResults(AbstractLogMinerEventProcessor.java:442)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.process(AbstractLogMinerEventProcessor.java:290)
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:243)
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:62)
at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:324)
at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:203)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:143)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: io.debezium.connector.oracle.logminer.parser.DmlParserException: Failed to parse insert DML: 'insert into "IFADMIN"."UNITS"("IDENT","DB_ID","TSTAMP","STATES_ID","STATES_ID_CTR","BARCODE","NAME","DESCRIPTION","ORDERS_ID","JOINED_SPLIT","PREDECESSOR_ID","SUCCESSOR_ID","UNIT_STATUS_ID","START_TIME","END_TIME","VIEW_GRP_ID","EXT_ID","SEQUENCE","STATUSCODES_ID","PERMANENT_PREDECESSOR_ID","PERMANENT_SEQUENCE","UUID") values ('19375667701','1',TO_DATE('2025-07-29 11:29:48', 'YYYY-MM-DD HH24:MI:SS'),'1','-1','01202507291365370000013886364',NULL,NULL,'367000290',NULL,NULL,NULL,'3',TO_DATE('2025-07-29 11:29:42', 'YYYY-MM-DD HH24:MI:SS'),NULL,NULL,'367000290',NULL,NULL,NULL,NULL,NULL);'
at io.debezium.connector.oracle.logminer.parser.LogMinerDmlParser.parseInsert(LogMinerDmlParser.java:127)
at io.debezium.connector.oracle.logminer.parser.LogMinerDmlParser.parse(LogMinerDmlParser.java:80)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.parseDmlStatement(AbstractLogMinerEventProcessor.java:1680)
... 16 more
Caused by: java.lang.ArrayIndexOutOfBoundsException
2025-07-29 09:30:14,823 ERROR [zollnergroup-mes-cons|task-0] Producer failure (io.debezium.pipeline.ErrorHandler) [debezium-oracleconnector-zollnergroup.mes-change-event-source-coordinator]
io.debezium.connector.oracle.logminer.parser.DmlParserException: DML statement couldn't be parsed. Please open a Jira issue with the statement 'insert into "IFADMIN"."UNITS"("IDENT","DB_ID","TSTAMP","STATES_ID","STATES_ID_CTR","BARCODE","NAME","DESCRIPTION","ORDERS_ID","JOINED_SPLIT","PREDECESSOR_ID","SUCCESSOR_ID","UNIT_STATUS_ID","START_TIME","END_TIME","VIEW_GRP_ID","EXT_ID","SEQUENCE","STATUSCODES_ID","PERMANENT_PREDECESSOR_ID","PERMANENT_SEQUENCE","UUID") values ('19375667701','1',TO_DATE('2025-07-29 11:29:48', 'YYYY-MM-DD HH24:MI:SS'),'1','-1','01202507291365370000013886364',NULL,NULL,'367000290',NULL,NULL,NULL,'3',TO_DATE('2025-07-29 11:29:42', 'YYYY-MM-DD HH24:MI:SS'),NULL,NULL,'367000290',NULL,NULL,NULL,NULL,NULL);'.
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.parseDmlStatement(AbstractLogMinerEventProcessor.java:1686)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.lambda$handleDataEvent$28(AbstractLogMinerEventProcessor.java:1353)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.addToTransaction(AbstractLogMinerEventProcessor.java:1564)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.handleDataEvent(AbstractLogMinerEventProcessor.java:1352)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processRow(AbstractLogMinerEventProcessor.java:536)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.processResults(AbstractLogMinerEventProcessor.java:442)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.process(AbstractLogMinerEventProcessor.java:290)
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:243)
at io.debezium.connector.oracle.logminer.LogMinerStreamingChangeEventSource.execute(LogMinerStreamingChangeEventSource.java:62)
at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:324)
at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:203)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:143)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: io.debezium.connector.oracle.logminer.parser.DmlParserException: Failed to parse insert DML: 'insert into "IFADMIN"."UNITS"("IDENT","DB_ID","TSTAMP","STATES_ID","STATES_ID_CTR","BARCODE","NAME","DESCRIPTION","ORDERS_ID","JOINED_SPLIT","PREDECESSOR_ID","SUCCESSOR_ID","UNIT_STATUS_ID","START_TIME","END_TIME","VIEW_GRP_ID","EXT_ID","SEQUENCE","STATUSCODES_ID","PERMANENT_PREDECESSOR_ID","PERMANENT_SEQUENCE","UUID") values ('19375667701','1',TO_DATE('2025-07-29 11:29:48', 'YYYY-MM-DD HH24:MI:SS'),'1','-1','01202507291365370000013886364',NULL,NULL,'367000290',NULL,NULL,NULL,'3',TO_DATE('2025-07-29 11:29:42', 'YYYY-MM-DD HH24:MI:SS'),NULL,NULL,'367000290',NULL,NULL,NULL,NULL,NULL);'
at io.debezium.connector.oracle.logminer.parser.LogMinerDmlParser.parseInsert(LogMinerDmlParser.java:127)
at io.debezium.connector.oracle.logminer.parser.LogMinerDmlParser.parse(LogMinerDmlParser.java:80)
at io.debezium.connector.oracle.logminer.processor.AbstractLogMinerEventProcessor.parseDmlStatement(AbstractLogMinerEventProcessor.java:1680)
... 16 more
Caused by: java.lang.ArrayIndexOutOfBoundsException
---------------------------------------------------------------------------------------------------------------------------------------------

I tried event.processing.failure.handling.mode: warn.
With this setting, I can see that updates are being sent to Kafka; inserts, on the other hand, cause the error.
Interestingly, updates contain 21 columns while inserts contain 22.
I tried to manually correct the schema in the schema registry, but that didn't work either.

I am grateful for any help

Best regards
jb



Chris Cranford

Jul 29, 2025, 7:53:49 AM
to debe...@googlegroups.com
Hi -

Can you please check the table "IFADMIN"."UNITS" and confirm the following:

    - The column_id / segment_column_id values are not desynchronized - see DBZ-7835 [1]
    - The table does not have any virtual, generated always columns

If neither of those applies, can you please provide the DDL for this table, and its previous DDL before the schema change that caused the reconstruction failure with the "COL x" columns when using `online_catalog`?
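Both checks can be run against the data dictionary in one pass; a sketch, using Oracle's ALL_TAB_COLS view:

```sql
-- Rows where COLUMN_ID and SEGMENT_COLUMN_ID differ indicate the desync
-- described in DBZ-7835; VIRTUAL_COLUMN = 'YES' flags virtual columns.
SELECT column_name, column_id, segment_column_id, virtual_column, hidden_column
  FROM all_tab_cols
 WHERE owner = 'IFADMIN'
   AND table_name = 'UNITS'
 ORDER BY internal_column_id;
```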

Thanks,
-cc

[1]: https://issues.redhat.com/browse/DBZ-7835

Johann Brandl

Jul 29, 2025, 10:04:30 AM
to debezium
I was able to fix the issue.

I don't know whether it's the best approach, but it worked:

- I put the table on the connector's exclude list.
- Then I copied the connector and configured the copy for just this one table with snapshot.mode: schema_only_recovery.

It took a while, then the topic updated.
Then I deleted the additional connector and added the table back to the original connector.
Processing then resumed.

I'm not sure if I've lost any data. Therefore, I'll synchronize the data for this table and the time period via Signal.
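The re-synchronization via signal can be requested through the signaling table; a hedged sketch (the table name IFADMIN.DEBEZIUM_SIGNAL is an assumption - use whatever your connector's signal.data.collection points to):

```sql
-- Request an ad-hoc incremental snapshot of just this table.
-- The id is an arbitrary unique string; the columns follow the
-- standard Debezium signaling-table layout (id, type, data).
INSERT INTO IFADMIN.DEBEZIUM_SIGNAL (id, type, data)
VALUES ('resync-units-2025-07-29', 'execute-snapshot',
        '{"data-collections": ["IFADMIN.UNITS"], "type": "incremental"}');
COMMIT;
```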

Best regards
jb

Johann Brandl

Jul 29, 2025, 10:38:35 AM
to debezium
Hi Chris,

column_id and segment_column_id are in sync (I checked according to DBZ-7835).
There's also no virtual column.

I just got the actual create statement for the table:

CREATE TABLE "IFADMIN"."UNITS"(
    "IDENT"                    NUMBER(19,0)
        NOT NULL ENABLE,
    "DB_ID"                    NUMBER(19,0) DEFAULT 90
        NOT NULL ENABLE,
    "TSTAMP"                   DATE,
    "STATES_ID"                NUMBER(19,0)
        NOT NULL ENABLE,
    "STATES_ID_CTR"            NUMBER(19,0) DEFAULT -1
        NOT NULL ENABLE,
    "BARCODE"                  VARCHAR2(50 BYTE)
        NOT NULL ENABLE,
    "NAME"                     VARCHAR2(50 BYTE),
    "DESCRIPTION"              VARCHAR2(255 BYTE),
    "ORDERS_ID"                NUMBER(19,0)
        NOT NULL ENABLE,
    "JOINED_SPLIT"             NUMBER(5,0),
    "PREDECESSOR_ID"           NUMBER(19,0),
    "SUCCESSOR_ID"             NUMBER(19,0),
    "UNIT_STATUS_ID"           NUMBER(19,0),
    "START_TIME"               DATE,
    "END_TIME"                 DATE,
    "VIEW_GRP_ID"              NUMBER(19,0),
    "EXT_ID"                   NUMBER(19,0),
    "SEQUENCE"                 VARCHAR2(50 BYTE),
    "STATUSCODES_ID"           NUMBER(19,0),
    "PERMANENT_PREDECESSOR_ID" NUMBER(19,0),
    "PERMANENT_SEQUENCE"       VARCHAR2(50 BYTE),
    "UUID"                     VARCHAR2(50 BYTE),
    CONSTRAINT "PK_UNITS" PRIMARY KEY("IDENT")
        USING INDEX ENABLE,
    SUPPLEMENTAL LOG DATA (ALL) COLUMNS,
    CONSTRAINT "UNITS_UNITS_FK1" FOREIGN KEY("PREDECESSOR_ID")
        REFERENCES "IFADMIN"."UNITS"("IDENT")
    DEFERRABLE INITIALLY DEFERRED ENABLE,
    CONSTRAINT "UNITS_UNITS_FK2" FOREIGN KEY("SUCCESSOR_ID")
        REFERENCES "IFADMIN"."UNITS"("IDENT")
    DEFERRABLE INITIALLY DEFERRED ENABLE,
    CONSTRAINT "ORDERS_UNITS_FK1" FOREIGN KEY("ORDERS_ID")
        REFERENCES "IFADMIN"."ORDERS"("IDENT")
    DEFERRABLE INITIALLY DEFERRED ENABLE,
    CONSTRAINT "STATES_UNITS_FK1" FOREIGN KEY("STATES_ID")
        REFERENCES "IFADMIN"."STATES"("IDENT")
    DEFERRABLE INITIALLY DEFERRED ENABLE,
    CONSTRAINT "UNIT_STATUS_UNITS_FK1" FOREIGN KEY("UNIT_STATUS_ID")
        REFERENCES "IFADMIN"."UNIT_STATUS"("IDENT")
    DEFERRABLE INITIALLY DEFERRED ENABLE
);

As I wrote in my last post, the transfer is now working. I suspect it was a mismatch between the number of columns in the connector's recorded schema and the actual table.
Best regards
jb

Chris Cranford

Jul 29, 2025, 11:27:16 AM
to debe...@googlegroups.com
Hi Jb,

It was definitely a mismatch. I just wanted to rule out whether a DDL change had simply been missed (perhaps the DBA executed it as the SYS user), or whether it was something more concerning.
I'm glad it's working now.

Thanks,
-cc