Protocol buffers limit the size of a parsed message to 64 MB by default,
and you have generated a very large message. You will need to either
raise the limit or split your data into multiple smaller messages. See:
http://code.google.com/apis/protocolbuffers/docs/techniques.html#large-data
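If you go the raise-the-limit route, the call is
CodedInputStream::SetTotalBytesLimit (two arguments; the second is the
warning threshold). A sketch, assuming your data comes from a
std::istream named `file` and `message` is your generated message type:

```cpp
#include <google/protobuf/io/coded_stream.h>
#include <google/protobuf/io/zero_copy_stream_impl.h>

// Sketch: raise the parse limit to 256 MB, warn past 128 MB.
// `file` (a std::istream) and `message` are assumed from your code.
google::protobuf::io::IstreamInputStream raw_in(&file);
google::protobuf::io::CodedInputStream in(&raw_in);
in.SetTotalBytesLimit(256 << 20, 128 << 20);
message.ParseFromCodedStream(&in);
```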
Hope this helps,
Evan
Ah, right. With the C++ API, the intention is that you will not reuse
the CodedInputStream, and instead it will be created and destroyed for
each message. It is very cheap to allocate / destroy if it is a local
variable.
In your case, you should do something like change your ::write method
to do:
CodedOutputStream out(_raw_out.get());
out.WriteVarint32(event.ByteSize());
event.SerializeWithCachedSizes(&out);
Using CodedOutputStream directly like this also saves the extra copy
that your code currently makes. Hope this helps,
Evan
Not true: Creating a CodedInputStream does not change the position in
the underlying stream. Your code can easily look like:
while (true) {  // still more messages to read
  CodedInputStream in(&input_stream);
  uint32 size;
  if (!in.ReadVarint32(&size)) break;  // end of stream
  CodedInputStream::Limit limit = in.PushLimit(size);
  msg.ParseFromCodedStream(&in);
  in.PopLimit(limit);
}
This creates and destroys the CodedInputStream for each message, which
is efficient.
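The framing itself is easy to check without protobuf at all. Here is a
self-contained sketch of the same length-prefixed layout, using
std::string payloads as stand-ins for serialized messages (the function
names are mine, not protobuf API):

```cpp
#include <cstdint>
#include <string>

// Append a base-128 varint length prefix followed by the payload:
// the same wire layout as WriteVarint32(size) + serialization.
void WriteDelimited(const std::string& payload, std::string* out) {
  uint32_t value = static_cast<uint32_t>(payload.size());
  while (value >= 0x80) {
    out->push_back(static_cast<char>(value | 0x80));
    value >>= 7;
  }
  out->push_back(static_cast<char>(value));
  out->append(payload);
}

// Read one length-prefixed record starting at *pos; false at end.
bool ReadDelimited(const std::string& in, size_t* pos,
                   std::string* payload) {
  if (*pos >= in.size()) return false;
  uint32_t size = 0;
  int shift = 0;
  while (in[*pos] & 0x80) {
    size |= static_cast<uint32_t>(in[*pos] & 0x7F) << shift;
    shift += 7;
    ++(*pos);
  }
  size |= static_cast<uint32_t>(static_cast<uint8_t>(in[*pos])) << shift;
  ++(*pos);
  payload->assign(in, *pos, size);
  *pos += size;
  return true;
}
```

Round-tripping a few records through WriteDelimited/ReadDelimited shows
the read loop never needs to know the record boundaries in advance.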
> Unfortunately, reading does not work out after 2^31 bytes are read.
> Is there a way around?
You will need to destroy and re-create the CodedInputStream object: it
counts the total bytes it has read, and re-creating it resets that
counter. If you don't want to do that for each message, you need to at
least do it occasionally.
Evan