Can we serialize 6 GB of data with protobuf?


Samuel Joshua

Aug 28, 2024, 7:05:20 AM
to Protocol Buffers
Hi,

I have a Python object which, when serialized to JSON using jsonpickle or the Python pickle library, comes out to 6 GB in size. I would like to try protobuf instead; would that be supported? I read in the overview (https://protobuf.dev/overview/#solve) that it supports only data up to a few megabytes per serialization. Please let me know.

Samuel Benzaquen

Aug 28, 2024, 2:40:57 PM
to Samuel Joshua, Protocol Buffers
On Wed, Aug 28, 2024 at 7:05 AM Samuel Joshua <isamue...@gmail.com> wrote:
Hi,

I have a Python object which, when serialized to JSON using jsonpickle or the Python pickle library, comes out to 6 GB in size. I would like to try protobuf instead; would that be supported? I read in the overview (https://protobuf.dev/overview/#solve) that it supports only data up to a few megabytes per serialization. Please let me know.

Protocol Buffers serialization is limited to 2 GiB payloads. This is due to how sizes are encoded in the wire format and in the runtime APIs.

There are ways to encode longer payloads, but doing so requires extensive care.
If your payload is large because you have a list of smaller objects, then you can serialize the individual objects on their own and use some other way to pack them together.
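A minimal sketch in Python of that approach, assuming a hypothetical Record message generated from your own .proto; the 4-byte little-endian length prefix is just one possible framing choice:

import struct
# from record_pb2 import Record  # hypothetical generated message type

def write_delimited(stream, msg):
    # Prefix each serialized message with its 4-byte little-endian length.
    data = msg.SerializeToString()
    stream.write(struct.pack("<I", len(data)))
    stream.write(data)

def read_delimited(stream, msg_cls):
    # Yield the messages back from a stream written by write_delimited.
    while True:
        header = stream.read(4)
        if not header:
            break
        (size,) = struct.unpack("<I", header)
        msg = msg_cls()
        msg.ParseFromString(stream.read(size))
        yield msg

Each individual message then stays well under the 2 GiB limit, while the file or stream as a whole can grow arbitrarily large.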
