Thanks for answering the question! I have one concern about blocking the onNext() call: I assume a blocked onNext() call ties up a thread in the gRPC executor pool. Say I have high QPS for the streaming RPC, and one RPC sends a lot more messages than the others. Will this starve the gRPC executor pool and block the other RPCs as well?
Yes, I did see the manual flow control example. Can you comment on the pros and cons of using manual vs. automatic flow control?
Thanks again for helping out!

On Sunday, December 18, 2022 at 5:11:59 PM UTC-7 sanjay...@google.com wrote:

Take a look at https://github.com/grpc/grpc-java/tree/master/examples/src/main/java/io/grpc/examples/manualflowcontrol and disableAutoRequest() for manual flow control.

I think it is okay to block on the onNext() call for automatic flow control. Check https://grpc.github.io/grpc-java/javadoc/io/grpc/stub/CallStreamObserver.html#disableAutoInboundFlowControl-- which says "automatic flow control where a token is returned to the peer after a call to the 'inbound' StreamObserver.onNext(Object) has completed."

On Friday, December 16, 2022 at 4:23:58 PM UTC-8 liuya...@gmail.com wrote:

Hi,

I am trying to use a bidi streaming RPC to transfer multiple large chunks of data, and I was wondering how flow control is done for bidi streaming in Java. If I want to apply backpressure from the server side, can I just block in the onNext() call? My understanding is that onNext() runs as a gRPC executor task that I shouldn't block. Does this mean I have to use manual flow control to apply backpressure, and there is no default option?

Thanks!
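For concreteness, here is a minimal sketch of the server-side manual flow control pattern that the linked example demonstrates. The bidi-streaming method transfer(), the Chunk/Ack message types, and the process() helper are hypothetical stand-ins for your generated protobuf types and application code:

import io.grpc.stub.ServerCallStreamObserver;
import io.grpc.stub.StreamObserver;

// Hypothetical bidi-streaming handler; Chunk and Ack stand in for your
// generated protobuf message types.
public StreamObserver<Chunk> transfer(StreamObserver<Ack> responseObserver) {
  ServerCallStreamObserver<Ack> serverObserver =
      (ServerCallStreamObserver<Ack>) responseObserver;
  // Turn off automatic inbound flow control: messages are now delivered
  // only when we explicitly ask for them.
  serverObserver.disableAutoRequest();
  // Ask for the first inbound message.
  serverObserver.request(1);

  return new StreamObserver<Chunk>() {
    @Override
    public void onNext(Chunk chunk) {
      process(chunk);            // hypothetical application logic
      // Request the next message only when we are ready for it; holding
      // off on request(1) is what applies backpressure.
      serverObserver.request(1);
    }

    @Override
    public void onError(Throwable t) {
      // clean up application state
    }

    @Override
    public void onCompleted() {
      responseObserver.onCompleted();
    }
  };
}

With this pattern the callback returns quickly, so no executor thread is held while you wait for downstream capacity.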
Wondering if there is a limit on the number of tokens by default for automatic flow control?
If not, say I have 20 threads: will one RPC with 20 streams, each with a blocking onNext(), block all 20 threads?
> Wondering if there is a limit on the number of tokens by default for automatic flow control?

I guess there has to be a limit for backpressure to work. Are you asking if it's possible to change the token limit?
> If not, say I have 20 threads: will one RPC with 20 streams, each with a blocking onNext(), block all 20 threads?

One RPC is one stream. Unless you are saying there are 20 incoming RPC instances of the same method, in which case there will be 20 streams. You are right: it will take up 20 threads.
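To make the automatic flow control case concrete, here is a sketch of what "blocking in onNext()" looks like on the server, again with hypothetical Chunk/Ack types and a bounded hand-off queue. The next inbound message is only requested after onNext() returns, so the blocked call is the backpressure, but it also occupies one executor thread per active stream while it waits:

import io.grpc.Status;
import io.grpc.stub.StreamObserver;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical handler relying on the default (automatic) flow control.
public StreamObserver<Chunk> transfer(StreamObserver<Ack> responseObserver) {
  // Bounded queue drained by application worker threads (not shown).
  BlockingQueue<Chunk> boundedQueue = new ArrayBlockingQueue<>(16);
  return new StreamObserver<Chunk>() {
    @Override
    public void onNext(Chunk chunk) {
      try {
        // put() blocks while the queue is full. While it blocks, the next
        // inbound message is not requested (that is the backpressure), but
        // this call also holds one server executor thread for this stream.
        boundedQueue.put(chunk);
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        responseObserver.onError(
            Status.CANCELLED.withCause(e).asRuntimeException());
      }
    }

    @Override
    public void onError(Throwable t) {
      // clean up application state
    }

    @Override
    public void onCompleted() {
      responseObserver.onCompleted();
    }
  };
}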
> > Wondering if there is a limit on the number of tokens by default for automatic flow control?
>
> I guess there has to be a limit for backpressure to work. Are you asking if it's possible to change the token limit?

Yeah, I am asking what the limit is by default. If I want to change it, I have to use manual flow control, I suppose?
> > If not, say I have 20 threads: will one RPC with 20 streams, each with a blocking onNext(), block all 20 threads?
>
> One RPC is one stream. Unless you are saying there are 20 incoming RPC instances of the same method, in which case there will be 20 streams. You are right: it will take up 20 threads.

Sorry for the confusion. I meant for a streaming RPC: say one RPC client (one RPC method call) calls onNext() 10 times, and each onNext() on the server side blocks. How many threads will this RPC block?
If service A and service B are implemented on the same server, do they share the same thread pool?
On Thu, Dec 22, 2022 at 1:47 PM y <liuya...@gmail.com> wrote:

> If service A and service B are implemented on the same server, do they share the same thread pool?

Yes. But you can provide a thread pool/executor per server: ServerBuilder.executor()
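As a sketch of that option (the pool size, port, and the ServiceAImpl/ServiceBImpl classes here are hypothetical):

import io.grpc.Server;
import io.grpc.ServerBuilder;
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public final class MyServer {  // hypothetical wrapper class
  public static void main(String[] args) throws IOException, InterruptedException {
    // Application executor used for the callbacks on this Server instance;
    // both services registered below share it.
    ExecutorService appExecutor = Executors.newFixedThreadPool(32);

    Server server = ServerBuilder.forPort(8080)
        .executor(appExecutor)
        .addService(new ServiceAImpl())   // hypothetical service implementations
        .addService(new ServiceBImpl())
        .build()
        .start();
    server.awaitTermination();
  }
}

Note that the executor is configured per Server instance, so to give the two services genuinely separate pools you would have to run them on separate Server instances, each with its own executor().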