Hey everyone,
I have a problem that I am unsure how to solve. I am writing a client for a server that takes in a stream of file chunks; when the client stream completes, the server sends back a single Empty response. The Protobuf definition looks like this:
rpc PutFile(stream PutFileRequest) returns (google.protobuf.Empty) {}
Each PutFileRequest contains a chunk of the file. See the full definition here:
My issue is this: if I have a very large file, say 100 GB, and only 4 GB of memory, and I want to send the file across the network, I have no way to apply backpressure to avoid out-of-memory issues. Here is why:
The client uses an instance of StreamObserver<PutFileRequest> to send chunks of the file. I have a process that reads chunks of the file from disk as fast as it can and sends them over the StreamObserver<PutFileRequest> object. I have no way to know when a given request has actually been sent over the network and the memory for its chunk has been released. Because of this, I can easily hit an out-of-memory error, since I am reading from disk at a much faster rate than I can send over the network.
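To make the problem concrete, here is a minimal sketch of the kind of loop I mean. It is simplified and self-contained: ChunkObserver is a stand-in for io.grpc.stub.StreamObserver<PutFileRequest> (same three methods), chunks are raw byte[] instead of PutFileRequest messages, the chunk size is an assumption, and the "network" is simulated by a queue that never drains, standing in for gRPC's outbound buffer on a slow link:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayDeque;
import java.util.Queue;

// Stand-in for io.grpc.stub.StreamObserver<PutFileRequest>; the real
// interface has these same three methods.
interface ChunkObserver {
    void onNext(byte[] chunk);
    void onError(Throwable t);
    void onCompleted();
}

public class NaiveUploader {
    static final int CHUNK_SIZE = 64 * 1024; // 64 KiB per chunk (assumed size)

    // Reads the file as fast as the disk allows and hands every chunk to
    // the observer immediately. Nothing here waits on the network, so when
    // the outbound side is slower than the disk, pending chunks accumulate
    // in memory; returns the total number of bytes handed off.
    static long upload(InputStream file, ChunkObserver out) throws IOException {
        long total = 0;
        byte[] buf = new byte[CHUNK_SIZE];
        int n;
        while ((n = file.read(buf)) != -1) {
            byte[] chunk = new byte[n];
            System.arraycopy(buf, 0, chunk, 0, n);
            out.onNext(chunk); // returns immediately; no backpressure signal
            total += n;
        }
        out.onCompleted();
        return total;
    }

    public static void main(String[] args) throws IOException {
        // A "slow network": every chunk just piles up in a queue.
        final Queue<byte[]> pending = new ArrayDeque<>();
        ChunkObserver slowNetwork = new ChunkObserver() {
            public void onNext(byte[] chunk) { pending.add(chunk); }
            public void onError(Throwable t) {}
            public void onCompleted() {}
        };

        // 300 KiB in-memory stand-in for the 100 GB file on disk.
        byte[] fakeFile = new byte[300 * 1024];
        long sent = upload(new ByteArrayInputStream(fakeFile), slowNetwork);
        System.out.println(sent + " bytes queued in " + pending.size() + " chunks");
    }
}
```

With a real 100 GB file and a real socket, that queue is effectively gRPC's internal buffering, and the loop above fills it without ever slowing down.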
Is there a way to be notified by gRPC when a specific message in a stream has finished sending? I could listen for that event and throttle the rate at which I read from disk accordingly.
Thank you,
Hollin Wilkins