MySql decimal type converts to Avro byte array


Leon

Apr 24, 2016, 10:26:09 PM
to Confluent Platform
Hi,

MySQL's decimal type is converted to the Avro bytes type. How is the decimal encoded in the byte array? I want to convert the byte array back to the original decimal value in JavaScript, so I need to know how it is encoded.
The Avro type I mentioned is the following:

{"type":"bytes","connect.version":1,"connect.parameters":{"scale":"0"},"connect.name":"org.apache.kafka.connect.data.Decimal"}

Leon

Liquan Pei

Apr 24, 2016, 11:39:58 PM
to confluent...@googlegroups.com
Hi Leon,

JDBC's decimal and numeric types are converted to a byte array because no primitive type can preserve their precision. The scale parameter in 'connect.parameters' encodes the scale of the decimal type, and the byte array value encodes the unscaled part.

Basically, you can retrieve the scale from the Avro schema and combine it with the byte array value to recover the original decimal.
In Java, you can use
new BigDecimal(new BigInteger(value), scale)
to convert back to the original value.
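For instance, a minimal sketch of that conversion (the byte values and scale below are made up for illustration; in practice the scale comes from the schema's connect.parameters):

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class DecimalDecode {
    // The bytes hold the unscaled value as a two's-complement,
    // big-endian integer; the scale comes from the Avro schema.
    static BigDecimal decode(byte[] value, int scale) {
        return new BigDecimal(new BigInteger(value), scale);
    }

    public static void main(String[] args) {
        // 0x04D2 = 1234 unscaled; with scale 2 this is 12.34
        byte[] bytes = {0x04, (byte) 0xD2};
        System.out.println(decode(bytes, 2)); // prints 12.34
    }
}
```

The same logic ports to JavaScript: interpret the bytes as a big-endian two's-complement integer, then divide by 10^scale.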

Thanks,
Liquan

--
You received this message because you are subscribed to the Google Groups "Confluent Platform" group.
To unsubscribe from this group and stop receiving emails from it, send an email to confluent-platf...@googlegroups.com.
To post to this group, send email to confluent...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/confluent-platform/be66cc60-dcf8-46d3-9840-7a20045f5cce%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.



--
Liquan Pei | Software Engineer | Confluent | +1 413.230.6855
Download Apache Kafka and Confluent Platform: www.confluent.io/download

Remo Suurkivi

Jun 9, 2016, 8:49:45 AM
to Confluent Platform
Hi,

Are there any good examples of how to get a BigDecimal value out of a byte array, with both JSON and Avro payloads? Starting from the consumer side, how do I define the deserializers, formatters, and converters where needed? This information is spread all over the place, and it's quite complicated to figure out how it works. The example given above does not currently work when using byte[] as input.
It would be quite disappointing to give up on Kafka just because of some conversion issues.

An example of what is currently being delivered: JSON gives "AA==" where it should be 0.00, and Avro gives "\u0000". With timestamps it's easy to use existing functions to convert back to a human-readable format, but here it's just not clear. The easiest approach would of course be to enforce the conversion via a property, with no post-processing or custom code needed.

Any help appreciated!

Remo


gerard...@dizzit.com

Jun 9, 2016, 9:51:10 AM
to Confluent Platform
The reason there aren't any good examples is that there is no single way of doing it, since Avro has no primitive type that supports arbitrary precision.
If your scale is 0, you may be able to switch to using long in your schema; otherwise you have to write some code to transform the bytes back into something useful.
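For the scale-0 case, a small sketch of reading the unscaled bytes straight into a long (the byte values are made up; this assumes the value fits in 64 bits):

```java
import java.math.BigInteger;

public class ScaleZeroDecode {
    public static void main(String[] args) {
        // With scale 0 the decimal is a whole number, so the
        // two's-complement big-endian bytes can be read as a long.
        byte[] bytes = {0x01, 0x00}; // 256
        long v = new BigInteger(bytes).longValueExact();
        System.out.println(v); // prints 256
    }
}
```

longValueExact() throws if the value does not fit in a long, which is safer than silently truncating.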

Remo Suurkivi

Jun 9, 2016, 2:14:34 PM
to Confluent Platform
Still, that can't be the end of it. How do I even get to a Long in the first place? Is it possible to properly use existing libraries, e.g. org.apache.kafka.connect.data.*? Any working example would help a lot; I'm not a programmer, and this seemingly simple issue is quite a roadblock.

Ewen Cheslack-Postava

Jun 9, 2016, 10:03:04 PM
to Confluent Platform
The Decimal class in Kafka Connect does this conversion: https://github.com/apache/kafka/blob/trunk/connect/api/src/main/java/org/apache/kafka/connect/data/Decimal.java#L73 Note, however, that it works with data that has already been converted to a byte[]. You would still need to handle the step from JSON's base64 encoding or Avro's Unicode string encoding.
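For the JSON case, a minimal sketch of that extra step using Remo's "AA==" payload (assuming a scale of 2, taken from the schema, so the expected result is 0.00):

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Base64;

public class JsonDecimalDecode {
    public static void main(String[] args) {
        // Base64-decode the JSON string first, then rebuild
        // the BigDecimal from the unscaled bytes and the scale.
        byte[] unscaled = Base64.getDecoder().decode("AA==");
        BigDecimal value = new BigDecimal(new BigInteger(unscaled), 2);
        System.out.println(value); // prints 0.00
    }
}
```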

-Ewen




--
Thanks,
Ewen

Shelwin Wei

Aug 7, 2018, 10:45:26 AM
to Confluent Platform
Hi Ewen,

I'm also suffering from this encoding issue:

Below is the schema of the field "iQuotedPrice". I use a JSONObject to store record.value() after calling the poll() method, but I get garbage in the "iQuotedPrice" field.
Could you please let me know how to fix this, or how to use Avro's Unicode decoding? Many thanks in advance! I configured value.serializer with "io.confluent.kafka.serializers.KafkaAvroSerializer" when loading the JDBC connector.




Best Regards,
Shelwin Wei

