How to analyze potential memory leak in Grpc-Java

Sammi Chen

Nov 24, 2021, 10:52:34 PM
to grpc.io
There are a lot of "io.grpc.StatusRuntimeException: CANCELLED: client cancelled" errors on the server side.
And occasionally, I also get an "io.netty.util.internal.OutOfDirectMemoryError: failed to allocate *** byte(s) of direct memory (used: ***, max: ***)" error on the server side.

Set "io.netty.leakDetection.level=paranoid",  there is no leak related LOGs.

The server-side function looks like this:

@Override
public StreamObserver<CommandRequestProto> send(
    StreamObserver<CommandResponseProto> responseObserver) {
  return new StreamObserver<CommandRequestProto>() {

    @Override
    public void onNext(CommandRequestProto request) {
      try {
        CommandResponseProto resp = function(request);
        responseObserver.onNext(resp);
      } catch (Throwable e) {
        LOG.error("Got exception when processing "
            + "CommandRequestProto {}", request, e);
        responseObserver.onError(e);
      }
    }

    @Override
    public void onError(Throwable t) {
      // for now we just log a msg
      LOG.error("Command send on error. Exception: ", t);
    }

    @Override
    public void onCompleted() {
      LOG.debug("Command send completed");
      responseObserver.onCompleted();
    }
  };
}

My question is: should I call responseObserver.onCompleted() in onError too?
I found inconsistent answers on Stack Overflow.

This one says that onError/onCompleted should not call requestObserver.

While this one says that if you don't call requestObserver.onCompleted, there will be a memory leak.

I'm not sure which one is correct. I'd like to hear a more authoritative answer.
Should the server-side request StreamObserver's onError/onCompleted call the response StreamObserver's onCompleted?

Thanks in advance.


sanjay...@google.com

Nov 30, 2021, 3:19:40 PM
to grpc.io
> My question is: should I call responseObserver.onCompleted() in onError too?

It would be a good idea to call responseObserver.onError() inside your server's request observer, e.g.:

@Override
public void onError(Throwable t) {
  responseObserver.onError(t);  // or alternatively pass a fixed throwable instead of the throwable you received

  LOG.error("Command send on error. Exception: ", t);
}

The reason it's a good idea is that even though the gRPC code itself doesn't need it, you could have interceptor code that leaks resources in such situations.
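
For the "fixed throwable" alternative mentioned in the comment above, here is a minimal sketch (assuming the responseObserver and LOG from your original snippet are in scope). It terminates the response stream with an explicit gRPC Status instead of forwarding whatever throwable the transport handed you:

import io.grpc.Status;

@Override
public void onError(Throwable t) {
  LOG.error("Command send on error. Exception: ", t);
  // Terminate the response stream with a well-defined status; a plain
  // throwable that is not a Status(Runtime)Exception would be reported
  // to the client as UNKNOWN.
  responseObserver.onError(
      Status.CANCELLED
          .withDescription("request stream failed")
          .withCause(t)
          .asRuntimeException());
}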

> I found inconsistent answers on Stack Overflow.

One of the links is about the client side, so that logic doesn't apply here.

> And occasionally, I also get an "io.netty.util.internal.OutOfDirectMemoryError: failed to allocate *** byte(s) of direct memory (used: ***, max: ***)" error on the server side.

This is unlikely to be caused by the above, but it could be due to a lack of flow control in your implementation.
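
If that's the case, one option is to take over inbound flow control via ServerCallStreamObserver. This is only a sketch along the lines of the grpc-java manual flow control example, reusing the send(), function(), LOG, and proto names from your snippet; a production version should also guard against duplicate request() calls:

import io.grpc.stub.ServerCallStreamObserver;
import io.grpc.stub.StreamObserver;

@Override
public StreamObserver<CommandRequestProto> send(
    StreamObserver<CommandResponseProto> responseObserver) {
  ServerCallStreamObserver<CommandResponseProto> serverObserver =
      (ServerCallStreamObserver<CommandResponseProto>) responseObserver;
  // Stop gRPC from automatically requesting the next inbound message.
  serverObserver.disableAutoRequest();
  // Request a message when the call becomes ready, and again whenever the
  // outbound transport buffer drains and the observer becomes ready again.
  serverObserver.setOnReadyHandler(() -> serverObserver.request(1));

  return new StreamObserver<CommandRequestProto>() {
    @Override
    public void onNext(CommandRequestProto request) {
      try {
        responseObserver.onNext(function(request));
        if (serverObserver.isReady()) {
          // The transport can absorb more output, so pull the next request now.
          serverObserver.request(1);
        }
        // If it is not ready, the onReadyHandler above pulls the next request
        // later, so inbound messages cannot pile up in direct memory.
      } catch (Throwable e) {
        responseObserver.onError(e);
      }
    }

    @Override
    public void onError(Throwable t) {
      LOG.error("Command send on error. Exception: ", t);
    }

    @Override
    public void onCompleted() {
      responseObserver.onCompleted();
    }
  };
}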