Hazelcast & compression

vadim....@gmail.com

Dec 10, 2015, 7:06:59 AM
to Hazelcast
Hello everybody,

I have read through the documentation and found that Hazelcast supports out-of-the-box compression for the default serialization (of Serializable & Externalizable). I understand that it is explicitly described as slow and CPU-intensive, but I wanted to try it and see whether it is worthwhile in my case in terms of network bandwidth.

I am using Hazelcast's integration with Spring and added the following within my <hz:config/>:

<hz:serialization enable-compression="true"/>
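
For reference, I believe this is equivalent to the following programmatic configuration (just a sketch; I have not verified the exact Spring namespace mapping):

import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class CompressionConfig {
    public static void main(String[] args) {
        Config config = new Config();
        // Same effect as <hz:serialization enable-compression="true"/>:
        // compress the default Java serialization (Serializable/Externalizable).
        config.getSerializationConfig().setEnableCompression(true);
        HazelcastInstance hz = Hazelcast.newHazelcastInstance(config);
    }
}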

However, after this my nodes can't talk to each other and flood the logs with the following exception (version 3.5.4):
SEVERE: [localhost]:5701 [dev] [3.5.4] Failed to process packet: Packet{header=17, isResponse=false, isOperation=true, isEvent=false, partitionId=-1, conn=Connection [/127.0.0.1:5701 -> /127.0.0.1:36714], endpoint=Address[localhost]:5702, live=true, type=MEMBER} on hz._hzInstance_1_dev.generic-operation.thread-1
com.hazelcast.nio.serialization.HazelcastSerializationException: java.io.EOFException: Cannot read 4 bytes!
    at com.hazelcast.nio.serialization.SerializationServiceImpl.handleException(SerializationServiceImpl.java:380)
    at com.hazelcast.nio.serialization.SerializationServiceImpl.toObject(SerializationServiceImpl.java:282)
    at com.hazelcast.spi.impl.NodeEngineImpl.toObject(NodeEngineImpl.java:200)
    at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.run(OperationRunnerImpl.java:300)
    at com.hazelcast.spi.impl.operationexecutor.classic.OperationThread.processPacket(OperationThread.java:142)
    at com.hazelcast.spi.impl.operationexecutor.classic.OperationThread.process(OperationThread.java:115)
    at com.hazelcast.spi.impl.operationexecutor.classic.OperationThread.doRun(OperationThread.java:101)
    at com.hazelcast.spi.impl.operationexecutor.classic.OperationThread.run(OperationThread.java:76)
Caused by: java.io.EOFException: Cannot read 4 bytes!
    at com.hazelcast.nio.serialization.ByteArrayObjectDataInput.checkAvailable(ByteArrayObjectDataInput.java:543)
    at com.hazelcast.nio.serialization.ByteArrayObjectDataInput.readInt(ByteArrayObjectDataInput.java:255)
    at com.hazelcast.nio.serialization.ByteArrayObjectDataInput.readInt(ByteArrayObjectDataInput.java:249)
    at com.hazelcast.cluster.impl.ConfigCheck.readData(ConfigCheck.java:217)
    at com.hazelcast.cluster.impl.JoinMessage.readData(JoinMessage.java:80)
    at com.hazelcast.cluster.impl.operations.MasterDiscoveryOperation.readInternal(MasterDiscoveryOperation.java:46)
    at com.hazelcast.spi.Operation.readData(Operation.java:491)
    at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:111)
    at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:39)
    at com.hazelcast.nio.serialization.StreamSerializerAdapter.read(StreamSerializerAdapter.java:41)
    at com.hazelcast.nio.serialization.SerializationServiceImpl.toObject(SerializationServiceImpl.java:276)
    ... 6 more


Moreover, an older version (3.4.2) also fails, but with a completely different exception:
SEVERE: [localhost]:5701 [dev] [3.4.2] java.util.zip.ZipException: invalid stored block lengths
com.hazelcast.nio.serialization.HazelcastSerializationException: java.util.zip.ZipException: invalid stored block lengths
    at com.hazelcast.nio.serialization.SerializationServiceImpl.handleException(SerializationServiceImpl.java:419)
    at com.hazelcast.nio.serialization.SerializationServiceImpl.readObject(SerializationServiceImpl.java:315)
    at com.hazelcast.nio.serialization.ByteArrayObjectDataInput.readObject(ByteArrayObjectDataInput.java:439)
    at com.hazelcast.cluster.impl.ConfigCheck.readData(ConfigCheck.java:215)
    at com.hazelcast.cluster.impl.JoinMessage.readData(JoinMessage.java:80)
    at com.hazelcast.cluster.impl.operations.MasterDiscoveryOperation.readInternal(MasterDiscoveryOperation.java:46)
    at com.hazelcast.spi.Operation.readData(Operation.java:299)
    at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:111)
    at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:39)
    at com.hazelcast.nio.serialization.StreamSerializerAdapter.toObject(StreamSerializerAdapter.java:65)
    at com.hazelcast.nio.serialization.SerializationServiceImpl.toObject(SerializationServiceImpl.java:260)
    at com.hazelcast.spi.impl.NodeEngineImpl.toObject(NodeEngineImpl.java:186)
    at com.hazelcast.spi.impl.BasicOperationService$OperationPacketHandler.loadOperation(BasicOperationService.java:654)
    at com.hazelcast.spi.impl.BasicOperationService$OperationPacketHandler.handle(BasicOperationService.java:637)
    at com.hazelcast.spi.impl.BasicOperationService$OperationPacketHandler.access$1500(BasicOperationService.java:630)
    at com.hazelcast.spi.impl.BasicOperationService$BasicDispatcherImpl.dispatch(BasicOperationService.java:582)
    at com.hazelcast.spi.impl.BasicOperationScheduler$OperationThread.process(BasicOperationScheduler.java:466)
    at com.hazelcast.spi.impl.BasicOperationScheduler$OperationThread.processPriorityMessages(BasicOperationScheduler.java:480)
    at com.hazelcast.spi.impl.BasicOperationScheduler$OperationThread.doRun(BasicOperationScheduler.java:457)
    at com.hazelcast.spi.impl.BasicOperationScheduler$OperationThread.run(BasicOperationScheduler.java:432)
Caused by: java.util.zip.ZipException: invalid stored block lengths
    at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:164)
    at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:116)
    at java.io.ObjectInputStream$PeekInputStream.read(ObjectInputStream.java:2310)
    at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2323)
    at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:2794)
    at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:801)
    at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
    at com.hazelcast.nio.IOUtil$1.<init>(IOUtil.java:111)
    at com.hazelcast.nio.IOUtil.newObjectInputStream(IOUtil.java:111)
    at com.hazelcast.nio.serialization.DefaultSerializers$ObjectSerializer.read(DefaultSerializers.java:188)
    at com.hazelcast.nio.serialization.StreamSerializerAdapter.read(StreamSerializerAdapter.java:44)
    at com.hazelcast.nio.serialization.SerializationServiceImpl.readObject(SerializationServiceImpl.java:309)
    ... 18 more

Could you please advise whether I am doing something wrong, or whether just enabling compression in the config is not enough to make it work?

P.S.: both examples are on Java 7; the configuration is a little beyond basic, with a single map with indexes and OBJECT in-memory format (I tried changing it to BINARY; the result is the same).
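
The map part looks roughly like this programmatically (map and field names are changed for the example):

import com.hazelcast.config.Config;
import com.hazelcast.config.InMemoryFormat;
import com.hazelcast.config.MapConfig;
import com.hazelcast.config.MapIndexConfig;

public class MapConfigSketch {
    static Config buildConfig() {
        Config config = new Config();
        // Single map with OBJECT in-memory format and one index;
        // "myMap" and "someField" are placeholders for the real names.
        MapConfig mapConfig = new MapConfig("myMap");
        mapConfig.setInMemoryFormat(InMemoryFormat.OBJECT); // also tried BINARY, same result
        mapConfig.addMapIndexConfig(new MapIndexConfig("someField", false));
        config.addMapConfig(mapConfig);
        return config;
    }
}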

Appreciate your help.

Regards,
Vadym

Enes Akar

Dec 13, 2015, 4:46:05 PM
to haze...@googlegroups.com
It may be a bug. Can you report the issue here:

Providing test code to reproduce the issue would be awesome.
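
For example, something as small as the sketch below (programmatic config instead of Spring) should show whether the cluster join itself breaks once compression is enabled:

import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class CompressionJoinRepro {
    public static void main(String[] args) {
        // Two members on the same machine, both with compression enabled.
        HazelcastInstance hz1 = Hazelcast.newHazelcastInstance(newConfig());
        HazelcastInstance hz2 = Hazelcast.newHazelcastInstance(newConfig());

        // With the reported problem, the members never form a 2-node cluster.
        System.out.println("Cluster size: " + hz1.getCluster().getMembers().size());

        Hazelcast.shutdownAll();
    }

    private static Config newConfig() {
        Config config = new Config();
        config.getSerializationConfig().setEnableCompression(true);
        return config;
    }
}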

