Invalid data for MySQL BIGINT.


saty...@telmate.com

Oct 24, 2017, 8:14:47 PM
to debezium
Hi All,

Did anyone come across data being converted, especially MySQL BIGINT, into invalid-looking characters like the following?

AA== 
CvrTxQ==
CYLH1w== 

I found a similar situation here, https://github.com/confluentinc/kafka-connect-jdbc/issues/33, but have no clue how to fix it.
Any suggestion or help is appreciated.

Regards,
Satyajit.  

Gunnar Morling

Oct 25, 2017, 3:03:53 AM
to debe...@googlegroups.com
Hi,

Is this a BIGINT *UNSIGNED* column by any chance? If so, it will be transmitted using Kafka Connect's Decimal type, which uses a byte array as its internal representation.

In 0.6.1 (due in the next few days), there'll be a new setting, "bigint.unsigned.handling.mode", which optionally allows you to transmit them as long; that is more practical in most cases. That mode shouldn't be used, though, if your column contains values of 2^63 or larger, as they cannot be represented as a signed long.
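For reference, those strings are just the Decimal's big-endian byte array, Base64-encoded by the JSON converter, so a consumer can decode them back into plain unsigned integers. A minimal sketch in Python, assuming a scale of 0 and using the sample values posted above:

```python
import base64

def decode_unsigned(encoded: str) -> int:
    """Decode a Base64 string holding a big-endian unsigned integer,
    as produced for Kafka Connect's Decimal type with scale 0."""
    return int.from_bytes(base64.b64decode(encoded), byteorder="big", signed=False)

for sample in ["AA==", "CvrTxQ==", "CYLH1w=="]:
    print(sample, "->", decode_unsigned(sample))
```

Alternatively, once the new setting is available, adding `bigint.unsigned.handling.mode=long` to the connector configuration would avoid the byte-array representation altogether (subject to the 2^63 caveat above).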

--Gunnar



--
You received this message because you are subscribed to the Google Groups "debezium" group.
To unsubscribe from this group and stop receiving emails from it, send an email to debezium+unsubscribe@googlegroups.com.
To post to this group, send email to debe...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/debezium/ca4b23c4-c98e-4013-a2bd-1df8c729f03a%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

saty...@telmate.com

Oct 25, 2017, 2:08:18 PM
to debezium
Thank you for your quick response, Gunnar.
Yeah, it's an *UNSIGNED* column.
I built the upstream/0.6 branch for testing, and it seems to be working.
Regards,
Satyajit.


saty...@telmate.com

Nov 2, 2017, 7:22:33 PM
to debezium
Hi Gunnar,

I am having a similar problem with BLOB datatypes as well.
Is there something we could do to correct it?

Regards,
Satyajit.

Jiri Pechanec

Nov 3, 2017, 2:56:24 AM
to debezium
Hi,

this works as expected. BLOB is converted into the generic BYTES datatype. I suppose from the posted example that it is a Base64-encoded string, so you should process it as such.
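For example, when the JSON converter serializes a BYTES field, a consumer can Base64-decode it to recover the original BLOB content. A sketch; the record layout and the column name `file_data` are hypothetical:

```python
import base64
import json

# Hypothetical JSON payload as written by the JSON converter;
# the column name "file_data" is made up for illustration.
record = json.loads('{"id": 1, "file_data": "SGVsbG8sIEJMT0Ih"}')

# Base64-decode the BYTES field to get the original BLOB bytes back.
blob = base64.b64decode(record["file_data"])
print(blob)
```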

J.