Hi.
Everything works great in an all-in-one Vert.x app, but when I switch to Hazelcast clustering I get the exception below on the event bus (the error does not flow through to my handlers). Some of the JSON being passed is LARGE, so I am assuming the data is getting truncated and that is what causes the decode failure. Since the same error happens over HTTP if you don't set the receive/send buffer sizes big enough, I thought that setting them in Hazelcast like this:
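(For reference, this is the kind of setting I mean — the standard Hazelcast socket buffer properties in cluster.xml; as I understand it the values are in kilobytes, with a default of 128:)

```xml
<hazelcast>
  <properties>
    <!-- Socket buffer sizes, in KB (default 128, i.e. 128 KB). -->
    <property name="hazelcast.socket.receive.buffer.size">4096</property>
    <property name="hazelcast.socket.send.buffer.size">4096</property>
  </properties>
</hazelcast>
```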
would fix things, but it does not. The error looks like:
SEVERE: Unhandled exception
org.vertx.java.core.json.DecodeException: Failed to decode:Unexpected character ('c' (code 99)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: java.io.StringReader@4d7f405e; line: 1, column: 18]
at org.vertx.java.core.json.impl.Json.decodeValue(Json.java:63)
at org.vertx.java.core.json.JsonObject.<init>(JsonObject.java:67)
at org.vertx.java.core.eventbus.impl.JsonObjectMessage.readBody(JsonObjectMessage.java:55)
at org.vertx.java.core.eventbus.impl.BaseMessage.<init>(BaseMessage.java:306)
at org.vertx.java.core.eventbus.impl.JsonObjectMessage.<init>(JsonObjectMessage.java:43)
at org.vertx.java.core.eventbus.impl.MessageFactory.read(MessageFactory.java:70)
at org.vertx.java.core.eventbus.impl.DefaultEventBus$2$1.handle(DefaultEventBus.java:587)
at org.vertx.java.core.eventbus.impl.DefaultEventBus$2$1.handle(DefaultEventBus.java:580)
at org.vertx.java.core.parsetools.RecordParser.parseFixed(RecordParser.java:200)
at org.vertx.java.core.parsetools.RecordParser.handleParsing(RecordParser.java:158)
at org.vertx.java.core.parsetools.RecordParser.handle(RecordParser.java:214)
at org.vertx.java.core.parsetools.RecordParser.handle(RecordParser.java:50)
at org.vertx.java.core.net.impl.DefaultNetSocket.handleDataReceived(DefaultNetSocket.java:267)
at org.vertx.java.core.net.impl.VertxNetHandler.channelRead(VertxNetHandler.java:48)
at org.vertx.java.core.net.impl.VertxNetHandler.channelRead(VertxNetHandler.java:32)
at org.vertx.java.core.net.impl.VertxHandler.channelRead(VertxHandler.java:156)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:332)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:318)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:125)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:507)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:464)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:378)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:350)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
at java.lang.Thread.run(Thread.java:745)
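To illustrate my truncation theory (a hypothetical sketch, not the actual Vert.x wire code): the trace shows RecordParser.parseFixed, i.e. the event bus reads a length prefix and then exactly that many body bytes. If the advertised length and the bytes actually delivered disagree, the parser splices bytes from the next frame into the body, and the JSON decoder blows up at some arbitrary column — which looks a lot like what I'm seeing:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class FramingDemo {

    // Frame a message the way a length-prefixed wire protocol does:
    // a 4-byte big-endian length, then the UTF-8 body.
    static byte[] frame(String body) {
        byte[] b = body.getBytes(StandardCharsets.UTF_8);
        return ByteBuffer.allocate(4 + b.length).putInt(b.length).put(b).array();
    }

    // Read one frame back: trust the length prefix and take that many bytes.
    static String readFirstFrameBody(byte[] stream) {
        ByteBuffer buf = ByteBuffer.wrap(stream);
        int len = buf.getInt();
        byte[] body = new byte[len];
        buf.get(body);
        return new String(body, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] f1 = frame("{\"hash\":\"abcdef\"}");
        byte[] f2 = frame("{\"x\":1}");

        // Simulate truncation: the last 4 bytes of frame 1 never arrive,
        // so frame 2 starts 4 bytes early in the stream.
        byte[] stream = new byte[f1.length - 4 + f2.length];
        System.arraycopy(f1, 0, stream, 0, f1.length - 4);
        System.arraycopy(f2, 0, stream, f1.length - 4, f2.length);

        // The parser still reads the advertised byte count, splicing in
        // frame 2's header -- the "body" is no longer valid JSON.
        System.out.println(readFirstFrameBody(stream));
    }
}
```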
Any ideas?? And yes, the JSON objects are LARGE (thank you, hashed values).