Using google.cloud.ndb (in Python 3.6.6) getting RESOURCE_EXHAUSTED: gRPC message exceeds maximum size 4194304: 13208641 while trying to write data to Google Cloud Datastore


sudipt...@decagames.com

Feb 27, 2020, 9:11:52 AM2/27/20
to grpc.io
I am getting an error in Python 3.6.6 while trying to write data using ndb.put_multi(chunks), where ndb is google.cloud.ndb from google-cloud-ndb==1.0.1:

   from google.cloud import ndb

We are getting the following error while doing ndb.put_multi(chunks), where sys.getsizeof(chunks) is 200 bytes:

[datastore] Feb 27, 2020 6:53:21 PM io.grpc.netty.NettyServerStream$TransportState deframeFailed
[datastore] WARNING: Exception processing message
[datastore] io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: gRPC message exceeds maximum size 4194304: 13208641
[datastore] at io.grpc.Status.asRuntimeException(Status.java:521)
[datastore] at io.grpc.internal.MessageDeframer.processHeader(MessageDeframer.java:387)
[datastore] at io.grpc.internal.MessageDeframer.deliver(MessageDeframer.java:267)
[datastore] at io.grpc.internal.MessageDeframer.request(MessageDeframer.java:161)
[datastore] at io.grpc.internal.AbstractStream$TransportState.requestMessagesFromDeframer(AbstractStream.java:205)
[datastore] at io.grpc.netty.NettyServerStream$Sink$1.run(NettyServerStream.java:100)
[datastore] at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
[datastore] at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
[datastore] at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:474)
[datastore] at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
[datastore] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[datastore] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[datastore] at java.base/java.lang.Thread.run(Thread.java:830)


I am running the Datastore emulator locally using the gcloud beta emulators datastore start command.

Thanks in advance for your help.

Lidi Zheng

Feb 27, 2020, 1:32:38 PM2/27/20
to grpc.io
gRPC has a 4 GB hard limit on message size, and protobuf has a 2 GB hard limit.

I wonder what the real size of `chunks` is; sys.getsizeof does not account for the objects contained inside the container.
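To illustrate the point about sys.getsizeof: it reports only the shallow size of the list object (its header and pointer slots), not the data it references, which is why a list carrying ~13 MB of entity data can report a couple of hundred bytes. A minimal demonstration (the values here are illustrative, not from the original post):

```python
import sys

# A list holding ~10 MB of string data.
chunks = ["x" * 1_000_000 for _ in range(10)]

# Shallow size: just the list object itself (header + 10 pointers).
shallow = sys.getsizeof(chunks)

# A rough "deep" size: the list plus each string it references.
deep = shallow + sum(sys.getsizeof(c) for c in chunks)

print(shallow)  # a few hundred bytes at most
print(deep)     # well over 10 MB
```

This is why the RPC payload can blow past the 4 MB limit even though sys.getsizeof(chunks) looks tiny.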

If possible, can you break the data into multiple messages?
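One way to break the data up is to slice the entity list and issue several put_multi calls, so each RPC stays under the default limit. A hedged sketch; `put_in_batches`, `fake_put_multi`, and the batch size of 100 are illustrative stand-ins, and in real code the callable would be google.cloud.ndb.put_multi invoked inside an ndb client context:

```python
def put_in_batches(put_multi, entities, batch_size=100):
    """Write entities in slices so each RPC carries a smaller payload.

    `put_multi` is the write function (e.g. ndb.put_multi); returns the
    concatenated list of keys from every batch.
    """
    keys = []
    for i in range(0, len(entities), batch_size):
        keys.extend(put_multi(entities[i:i + batch_size]))
    return keys

# Stand-in for ndb.put_multi, purely for demonstration.
fake_put_multi = lambda batch: [f"key-{e}" for e in batch]

keys = put_in_batches(fake_put_multi, list(range(250)), batch_size=100)
print(len(keys))
```

The right batch size depends on entity size; the goal is simply to keep each serialized request comfortably below 4 MB.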

sudipt...@decagames.com

Feb 28, 2020, 6:15:45 AM2/28/20
to grpc.io
Hi Lidi, thanks for your response. 
But from the message `gRPC message exceeds maximum size 4194304: 13208641`, it seems the maximum allowed size is 4 MB while the entire chunk is about 13 MB.

Lidi Zheng

Feb 28, 2020, 1:01:09 PM2/28/20
to sudipt...@decagames.com, grpc.io
My bad. If you want to increase the maximum size of outbound messages, you can set the channel argument "grpc.max_send_message_length".
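For reference, channel arguments are passed as (name, value) pairs when the channel is created. A hedged sketch using standard gRPC channel-argument names; the 32 MB value and the emulator address are illustrative, and whether google.cloud.ndb lets you inject custom channel options may depend on the library version, so this shows the raw grpc-level configuration:

```python
# 32 MB: comfortably above the ~13 MB payload from the error message.
MAX_MESSAGE_BYTES = 32 * 1024 * 1024

channel_options = [
    ("grpc.max_send_message_length", MAX_MESSAGE_BYTES),
    ("grpc.max_receive_message_length", MAX_MESSAGE_BYTES),
]

# With the plain grpc package this would be passed as:
#   channel = grpc.insecure_channel("localhost:8081", options=channel_options)
print(channel_options)
```

Note this only raises the client-side limit; the server (or emulator) enforces its own receive limit, so batching the writes is usually the more robust fix.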
