Python BiDi server side implementation questions


Akhilesh Raju

Aug 3, 2021, 6:11:07 PM
to grpc.io
Hi all,

At a very high level, I have a BiDi stream between a Python gRPC server and a Node gRPC client.
The client sends a request to the server to start streaming data back to the client. At any time, the client can decide it wants to stop the streaming and send a request to the server to do so.
For example purposes, the data streamed back by the server is a simple str representation of the current timestamp.
The proto file and server python code can be found here - GitHub Gist

Here is what I would ideally like to do:
  1. Send a stream start request from the client to the server
  2. The server receives it and starts sending back data as and when it's produced, without any intervention from the client
  3. After some time T, the client decides that it wants to stop the stream and sends a stop request to the server
  4. The server receives the request and stops sending any more data.
The TimeStream function is provided with two things: a request iterator and a context. When I try to fetch requests from the request_iterator via next(request_iterator) and there are no pending requests, the next() call blocks until a new request comes in.
To continuously stream data back from the server and also process incoming requests from the client, I had to create a new thread to fetch requests through the request_iterator, leaving the original handler thread free to "yield" values.
This can be seen in lines 80-135 in the server.py file
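For reference, here is a minimal sketch of that threaded pattern. The generated modules and field names (timestream_pb2, TimeReply, request.stop) are placeholders standing in for whatever the gist's proto actually defines:

```python
import threading
import time

import timestream_pb2          # placeholder for the gist's generated modules
import timestream_pb2_grpc


class TimeStreamServicer(timestream_pb2_grpc.TimeStreamServicer):
    def TimeStream(self, request_iterator, context):
        # Block until the initial "start" request arrives.
        next(request_iterator)

        stop_event = threading.Event()

        def watch_for_stop():
            # Drain further requests on a separate thread, since iterating
            # the request_iterator blocks whenever no request is pending.
            for request in request_iterator:
                if request.stop:          # assumed field name
                    stop_event.set()
                    return

        threading.Thread(target=watch_for_stop, daemon=True).start()

        # The handler thread stays free to yield responses.
        while not stop_event.is_set() and context.is_active():
            yield timestream_pb2.TimeReply(timestamp=str(time.time()))
            time.sleep(1)
```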

With this in mind, here are the questions I have
  • Is there a non-blocking way to check if any new requests came in, without having to create a separate thread?
  • Is there a way to attach an observer to the stream to process requests as they come in?
  • Is there a way to get hold of a stream "object" that I can use to send responses back from a different thread? From what I have seen, "yield" seems to be the only way to stream data back from server to client.


Lidi Zheng

Aug 4, 2021, 1:44:40 PM
to grpc.io
I would recommend trying the gRPC AsyncIO API; it solves all three problems you are asking about. Here is an example containing the bidi server-side code:


> Is there a non-blocking way to check if any new requests came in, without having to create a separate thread?

In the AsyncIO world, you can create a coroutine to consume the requests (see the sketch after the next answer).


> Is there a way to attach an observer to the stream to process requests as they come in?

The request-consuming coroutine can also invoke the logic you want for every incoming request.
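A minimal sketch covering both points, reusing the placeholder proto names from the first message: the handler is an async generator, and a separate task consumes the request stream and runs any per-request logic.

```python
import asyncio
import time

import timestream_pb2          # placeholder generated modules, as above
import timestream_pb2_grpc


class TimeStreamServicer(timestream_pb2_grpc.TimeStreamServicer):
    async def TimeStream(self, request_iterator, context):
        stop_event = asyncio.Event()

        async def consume_requests():
            # Runs concurrently with the response loop below; this is where
            # any per-request ("observer") logic goes.
            async for request in request_iterator:
                if request.stop:          # assumed field name
                    stop_event.set()

        consumer = asyncio.create_task(consume_requests())
        try:
            while not stop_event.is_set():
                yield timestream_pb2.TimeReply(timestamp=str(time.time()))
                await asyncio.sleep(1)
        finally:
            consumer.cancel()
```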

> Is there a way to get hold of a stream "object" that I can use to send responses back from a different thread? From what I have seen, "yield" seems to be the only way to stream data back from server to client.

The AsyncIO API's `context` object has `context.read()` and `context.write()` methods, so you can carry the logic of an RPC into another coroutine.
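A minimal sketch of that reader/writer style, under the same placeholder names. Note the handler is a plain coroutine here rather than an async generator, and any coroutine holding `context` can write responses:

```python
import asyncio
import time

import grpc

import timestream_pb2          # placeholder generated modules, as above
import timestream_pb2_grpc


class TimeStreamServicer(timestream_pb2_grpc.TimeStreamServicer):
    async def TimeStream(self, request_iterator, context):
        async def produce():
            # Responses are written explicitly instead of yielded.
            while True:
                await context.write(
                    timestream_pb2.TimeReply(timestamp=str(time.time())))
                await asyncio.sleep(1)

        producer = asyncio.create_task(produce())
        try:
            while True:
                request = await context.read()
                if request is grpc.aio.EOF or request.stop:  # assumed field
                    break
        finally:
            producer.cancel()
```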

---

For sync/normal API:

> Is there a non-blocking way to check if any new requests came in, without having to create a separate thread?

No. We don't have an API for checking.

> Is there a way to attach an observer to the stream to process requests as they come in?

Server interceptors might do the trick for you.
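If you go that route, one possible shape is a grpc.ServerInterceptor that re-wraps the bidi handler so every request the handler pulls from the iterator first passes through a hook. This is a sketch, not tested against your service:

```python
import grpc


class RequestObserverInterceptor(grpc.ServerInterceptor):
    """Wrap bidi handlers so each request passes through an observer hook."""

    def intercept_service(self, continuation, handler_call_details):
        handler = continuation(handler_call_details)
        if handler is None or not (handler.request_streaming
                                   and handler.response_streaming):
            return handler

        def observing_iterator(request_iterator):
            for request in request_iterator:
                print("observed request:", request)   # your per-request logic
                yield request

        def behavior(request_iterator, context):
            return handler.stream_stream(
                observing_iterator(request_iterator), context)

        return grpc.stream_stream_rpc_method_handler(
            behavior,
            request_deserializer=handler.request_deserializer,
            response_serializer=handler.response_serializer,
        )
```

Register it when building the server, e.g. grpc.server(..., interceptors=[RequestObserverInterceptor()]). The hook only fires as your handler iterates the request stream; it does not consume requests on its own.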

> Is there a way to get hold of a stream "object" that I can use to send responses back from a different thread? From what I have seen, "yield" seems to be the only way to stream data back from server to client.

You need to build your own pipe (Pipe example).
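The linked pattern boils down to a thread-safe queue that any thread can push responses into while the handler thread drains it with "yield". A rough sketch, reusing the placeholder proto names from the first message:

```python
import queue
import threading
import time

import timestream_pb2          # placeholder generated modules, as above
import timestream_pb2_grpc

_SENTINEL = object()


class TimeStreamServicer(timestream_pb2_grpc.TimeStreamServicer):
    def TimeStream(self, request_iterator, context):
        pipe = queue.Queue()    # the hand-rolled "stream object"

        def produce():
            # Any other thread can push responses into the pipe.
            for _ in range(10):
                pipe.put(timestream_pb2.TimeReply(timestamp=str(time.time())))
                time.sleep(1)
            pipe.put(_SENTINEL)  # signal end of stream

        threading.Thread(target=produce, daemon=True).start()

        # The handler thread just drains the pipe; "yield" stays here.
        while True:
            item = pipe.get()
            if item is _SENTINEL:
                return
            yield item
```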


