Oracle connector and NUMBER type challenge


Fasil Malik Hayat

unread,
Jan 31, 2026, 4:07:26 AM (3 days ago) Jan 31
to debezium

I will eventually face this problem with Debezium as well, but for now I hope someone here can help me decode the Oracle NUMBER type, which a JDBC connector delivers to Kafka as bytes. I have asked ChatGPT for help, but it keeps sending me in circles. I have tried decoding the value in both Python and C# without any success, so I am not sure what I am missing.


Here is my scenario.


Oracle column datatype:

MY_FACTOR NUMBER

  

When selecting the column value in an Oracle query:

SELECT MY_FACTOR FROM MY_SCHEMA.MY_CALC_RESULTS;

Result:

MY_FACTOR: 1

 

# My JDBC connector settings

$connectorJson = @"
{
  "name": "oracle-calc-connector",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:oracle:thin:@//oracle-db:1521/FREEPDB1",
    "connection.user": "MYUSER",
    "connection.password": "mypassword",
    "table.types": "TABLE",
    "table.include.list": "MY_SCHEMA.MY_CALC_RESULTS",
    "mode": "timestamp+incrementing",
    "timestamp.columns.mapping": "MY_SCHEMA.MY_CALC_RESULTS:[DATA_LOAD_TS]",
    "incrementing.column.mapping": "MY_SCHEMA.MY_CALC_RESULTS:PORTFOLIO_RESULTS_KEY",
    "db.timezone": "UTC",
    "poll.interval.ms": "5000",
    "table.poll.interval.ms": "5000",
    "numeric.mapping": "precision_only",
    "validate.non.null": "false",
    "topic.prefix": "oracle.calc.",
    "batch.max.rows": "10000",
    "producer.override.compression.type": "snappy"
  }
}
"@

For "numeric.mapping" I have also tried 'none' and 'best_fit', without any success.

 

 

Kafka topic schema in the topic payload:

{
  "type": "bytes",
  "optional": true,
  "name": "org.apache.kafka.connect.data.Decimal",
  "version": 1,
  "parameters": {
    "scale": "127",
    "connect.decimal.precision": "0"
  },
  "field": "MY_FACTOR"
},

Value of MY_FACTOR in the topic payload:

"MY_FACTOR": "OxdP6jOQnr/o+UfE4q0zr5p7lMvK0Fh9N+fohbLmyY9Nt9u5ZoAAAAAAAAAAAAAAAAAAAAA=",

 

How do I decode the encoded value of MY_FACTOR, 'OxdP6jOQnr/o+UfE4q0zr5p7lMvK0Fh9N+fohbLmyY9Nt9u5ZoAAAAAAAAAAAAAAAAAAAAA=', which comes out as bytes? Somewhere in there is my value 1.

I have tried with C# and Python, but I don't know how to get the value out.

I get NULL back, or something very far from 1.

Can someone help?

Fasil Malik Hayat

unread,
Jan 31, 2026, 4:21:31 AM (3 days ago) Jan 31
to debezium
Having done some more research (not yet tested), it seems I need to re-configure my JDBC connector with this:
numeric.mapping=best_fit
numeric.precision.mapping=true

Chris Cranford

unread,
Jan 31, 2026, 9:09:57 AM (3 days ago) Jan 31
to debe...@googlegroups.com
Hi -

The "NUMBER" data type is a semantic alias for "NUMERIC(38, 0)"; an Oracle numeric column that stores a value up to 38 digits in length. The reason why Debezium's Oracle connector serializes any numeric column that has a length of 19 or greater is because those values exceed Long.MAX_VALUE and Double.MAX_VALUE, which is the upper limit to storing a whole number (INT64) or real number (FLOAT64) in a Kafka event field. So for numeric columns that have such lengths, Debezium represents these as a VariableScaleDecimal logical type, which is serializes as a series of bytes to avoid precision loss.

Debezium does provide a global configuration option, `decimal.handling.mode` [1], that controls how numeric/decimal values are serialized. By default it uses `precise`, which avoids the precision loss that I described above; however, it can also be set to `string` or `double` to encode the value as a character string or a FLOAT64, respectively.
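For illustration, this is a single connector property; a hypothetical fragment of a Debezium Oracle connector configuration might look like this (only the last line matters here):

{
  "config": {
    "connector.class": "io.debezium.connector.oracle.OracleConnector",
    "decimal.handling.mode": "string"
  }
}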

When set to `string`, you won't have precision loss; however, the value can be difficult to manipulate when it exceeds the 64-bit boundary unless the target system is explicitly designed to handle numeric values wider than 64 bits. When set to `double`, there is the possibility of precision loss, as you're limited to the upper boundary of the type. This is often easier to manipulate, since the value can be represented in a Double or a Long, but care must be taken to avoid data loss if the value exceeds 18 digits.
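To make that 18-digit caveat concrete, here is a quick illustrative check in Python (the 20-digit value is made up):

# A 20-digit whole number no longer fits in a double's 53-bit mantissa,
# so converting through float silently rounds it.
value = 12345678901234567890
as_double = float(value)

print(int(as_double))          # 12345678901234567168 -- rounded
print(value - int(as_double))  # 722 -- the precision that was lost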

However, please be aware that `decimal.handling.mode` is global: it changes how all numeric columns are emitted for that connector. Should you need granular control to change only a subset of columns, a `CustomConverter` [2] is highly useful in this scenario, as you can supply via configuration a list of the tables/columns that should be converted differently. The `CustomConverter` replaces how Debezium generates the schema and value representation, giving you full control should any of the existing decimal handling modes be insufficient.

Let us know should you have further questions about the Oracle source.

Thanks,
-cc

[1]: https://debezium.io/documentation/reference/stable/connectors/oracle.html#oracle-property-decimal-handling-mode
[2]: https://debezium.io/documentation/reference/stable/development/converters.html

Fasil Malik Hayat

unread,
Jan 31, 2026, 1:34:02 PM (3 days ago) Jan 31
to debezium
Thanks Chris. Your response was very helpful for understanding the inner workings of Debezium with Kafka.
I actually made it work: it was all about decoding the bytes of the value into something I could map to a C# or Python variable with the precision intact.

Thanks for the swift reply.