Portable serialization using Avro


fj.pi...@indizen.com

Nov 6, 2017, 10:58:51 AM
to Hazelcast
Hi, 

After a deep read of the Hazelcast documentation, I cannot find whether my scenario is possible in Hazelcast.

I am working with Avro using Kafka, the Schema Registry, and Kafka Streams, saving the data in Hazelcast. I want to support portable serialization, because the whole point of the Schema Registry is schema evolution: my roadmap includes adding new fields to my schema on a regular basis, and all the information already saved in Hazelcast must remain compatible.

Your documentation talks about portable serialization and implementing the Portable interface in the POJO, but an Avro POJO is generated from an Avro schema, so you cannot make it implement Portable.


Is there a way to work with portable serialization using Avro in Hazelcast? I could save objects written with more than one Avro schema in the same IMap, but when I need to read an older object, the SerializerConfig only knows the newest version of the Avro schema.

As a workaround, I tried to replace the serializer each time I query Hazelcast, but that is not possible:

db.getConfig().setSerializationConfig(serializationConfig);

Exception in thread "main" java.lang.UnsupportedOperationException: Client cannot access cluster config!
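
For context, the serializer is registered on the ClientConfig before the client starts, along these lines (a simplified sketch, not my exact code; the class name AvroGenericRecordSerializer, the type id, and the newestSchema variable are just illustrative). As far as I understand, a client cannot change the serialization config once it is running, which is what the exception above says.

import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

import com.hazelcast.nio.ObjectDataInput;
import com.hazelcast.nio.ObjectDataOutput;
import com.hazelcast.nio.serialization.StreamSerializer;

// Serializes Avro GenericRecords using one fixed schema (the newest one).
public class AvroGenericRecordSerializer implements StreamSerializer<GenericRecord> {

    private final Schema schema;

    public AvroGenericRecordSerializer(Schema schema) {
        this.schema = schema;
    }

    @Override
    public void write(ObjectDataOutput out, GenericRecord record) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(bos, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
        encoder.flush();
        out.writeByteArray(bos.toByteArray());
    }

    @Override
    public GenericRecord read(ObjectDataInput in) throws IOException {
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(in.readByteArray(), null);
        // Reader and writer schema are both the single fixed schema here, which is
        // why entries written with an older schema version cannot be read back.
        return new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
    }

    @Override
    public int getTypeId() {
        return 1000; // arbitrary positive id, must not clash with other custom serializers
    }

    @Override
    public void destroy() {
    }
}

// At client startup, before any map access (uses the Hazelcast client classes
// ClientConfig, SerializerConfig, HazelcastClient, HazelcastInstance):
ClientConfig clientConfig = new ClientConfig();
clientConfig.getSerializationConfig().addSerializerConfig(
        new SerializerConfig()
                .setTypeClass(GenericRecord.class)
                .setImplementation(new AvroGenericRecordSerializer(newestSchema)));
HazelcastInstance client = HazelcastClient.newHazelcastClient(clientConfig);

So everything in the map is tied to the single schema passed in at startup, and that is exactly what I want to get around.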

Any idea?
Thanks in advance
Regards


fj.pi...@indizen.com

Nov 7, 2017, 11:22:16 AM
to Hazelcast
Solved. Hazelcast and Avro do not seem to get along well together, so we store the Avro objects in the IMDG in JSON format, and when we need to read data back from Hazelcast we have implemented an interface that deserializes the JSON into an Avro GenericRecord object.

Now I have integrated Schema Registry schema evolution with Avro and Hazelcast.
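
Roughly, such a converter could look like this (a simplified sketch, not our exact code; the class and method names are just illustrative, and the writer schema would be whatever version the Schema Registry returns for the id stored with the entry):

import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;

// Converts between Avro GenericRecords and the JSON strings stored in the IMDG.
public final class AvroJsonConverter {

    private AvroJsonConverter() {
    }

    // On a put: encode the record as JSON with its own (writer) schema.
    public static String toJson(GenericRecord record) throws IOException {
        Schema writerSchema = record.getSchema();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Encoder encoder = EncoderFactory.get().jsonEncoder(writerSchema, out);
        new GenericDatumWriter<GenericRecord>(writerSchema).write(record, encoder);
        encoder.flush();
        return out.toString("UTF-8");
    }

    // On a get: parse the JSON with the writer schema and resolve it against the
    // current (reader) schema, so fields added later get their default values.
    public static GenericRecord fromJson(String json, Schema writerSchema, Schema readerSchema)
            throws IOException {
        Decoder decoder = DecoderFactory.get().jsonDecoder(writerSchema, json);
        return new GenericDatumReader<GenericRecord>(writerSchema, readerSchema).read(null, decoder);
    }
}

With something like this the map only holds JSON, Hazelcast never needs to know about Avro at all, and Avro's normal schema resolution takes care of the evolution when we read old entries back.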

Regards to Neil from Madrid :)