Hi,
I'm implementing unidirectional server-side streaming in Java with performance and low latency in mind, using Netty. I don't want to implement my own flow control per se; I'm more interested in writing thread-safe code within the existing framework. I have a few questions.
1) Is it important to check isReady() before calling onNext(..)?
If the answer to 1 is no, then we can stop here.
If the answer to 1 is yes, then let's consider Code A)
Code A)
ConcurrentLinkedQueue<Msg> q = new ConcurrentLinkedQueue<>();

void start() {
    obs.setOnReadyHandler(this::drainQueue);
}

void send(Msg msg) {
    if (obs.isReady()) {
        obs.onNext(msg);
    } else {
        q.offer(msg);
    }
}

// run by gRPC threads
void drainQueue() {
    while (obs.isReady() && !q.isEmpty()) {
        obs.onNext(q.poll());
    }
}
This pattern isn't correct because:
1) Consider the case where send() is called while isReady() is false. Before q.offer() executes, readiness flips from false to true and drainQueue() runs against an empty queue. Then q.offer() executes, and my message is stuck in the queue until the next onReady callback, which may never come.
2) Also, onNext() is called from two threads (my thread and a gRPC thread), so it's easy to see that my messages can be sent out of order.
What if my thread never calls onNext() and only enqueues? That won't work either: if obs stays ready, the onReady handler never fires again, so my messages stay stuck in the queue.
So is there a sample or typical pattern for implementing server-side streaming within the existing flow-control framework? Do I need some atomic operation?
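For reference, here is a minimal sketch of the "always enqueue, then drain under a lock" variant I'm considering: send() never calls onNext() directly, and the synchronized drain() is invoked both after every enqueue and from the onReady handler, so the lost-wakeup race and the ordering problem both go away. FakeObserver is just a stand-in I made up for ServerCallStreamObserver<Msg> so the logic can run without a real gRPC channel; the real observer would be wired in via setOnReadyHandler(this::drain). Is this the right direction?

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Stand-in for ServerCallStreamObserver<String>; only the two methods
// the pattern needs. (Assumption: not part of any real gRPC API.)
class FakeObserver {
    volatile boolean ready = true;
    final List<String> sent = new ArrayList<>();
    boolean isReady() { return ready; }
    void onNext(String msg) { sent.add(msg); }
}

class Sender {
    final FakeObserver obs;
    final Queue<String> q = new ConcurrentLinkedQueue<>();
    Sender(FakeObserver obs) { this.obs = obs; }

    // Application thread: always enqueue first, then try to drain.
    // Never calls onNext directly, so only drain() writes to the observer.
    void send(String msg) {
        q.offer(msg);
        drain();
    }

    // Called after every send() and from the onReady handler.
    // synchronized allows only one drainer at a time, which keeps
    // onNext single-threaded and preserves message order; a message
    // enqueued just after a drain finishes is picked up by the
    // drain() that its own send() triggers.
    synchronized void drain() {
        String msg;
        while (obs.isReady() && (msg = q.poll()) != null) {
            obs.onNext(msg);
        }
    }
}

public class Main {
    public static void main(String[] args) {
        FakeObserver obs = new FakeObserver();
        Sender s = new Sender(obs);
        s.send("a");      // ready: delivered immediately
        obs.ready = false;
        s.send("b");      // not ready: stays queued, in order
        obs.ready = true;
        s.drain();        // simulates the onReady callback firing
        System.out.println(obs.sent); // [a, b]
    }
}
```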
Thanks,
- Mag