I'm absolutely at my wits' end after 20+ hours spent trying to debug this one problem.
Please read carefully before replying, since the obvious solutions do not apply here.
First, some context: the application achieves very good latency by default. The problem appears after a simulated pause of the video stream (something that can legitimately happen in our application).
When I pause the sender for 4-8 seconds, the connection recovers on resume. However, from that point on there is a permanent 100-200 ms of added latency that never goes away. It's almost as if the browser has decided it's acceptable to run behind the live video and has no way to "skip to live" ever again.
The biggest indication that something is wrong on the browser side is the ratio of totalProcessingDelay to framesDecoded (from the inbound-rtp getStats() entry), which before the pause is < 10 ms per frame and after the pause jumps to 100-200 ms.
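For reference, this is roughly how I'm computing that per-frame delay between two stats snapshots (the helper name is mine; the fields are the standard `inbound-rtp` stats):

```javascript
// Per-frame processing delay (ms) between two `inbound-rtp` stats snapshots,
// i.e. the objects you get from RTCPeerConnection.getStats() for the video
// inbound-rtp entry. totalProcessingDelay is reported in seconds.
function perFrameProcessingDelayMs(before, after) {
  const frames = after.framesDecoded - before.framesDecoded;
  if (frames <= 0) return 0;
  const delaySeconds = after.totalProcessingDelay - before.totalProcessingDelay;
  return (delaySeconds / frames) * 1000;
}

// In the browser the snapshots come from something like:
//   const report = await pc.getStats();
//   const inbound = [...report.values()].find(
//     (s) => s.type === 'inbound-rtp' && s.kind === 'video');
```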
Here are all the things I've tried and the culprits I've ruled out. It is not:
- The socket data connection: latency from the browser to message receipt remains low
- The sample timestamps: latency from sample generation to WebRTC send is effectively 0
- The video encoder itself (there is no buffering of frames occurring server-side)
- playoutDelayHint: whether I set it to null, 0, or anything else, it never helps
- A "send rate" issue: I tried sending empty RTP "frames" during the pause, and it has no effect
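For completeness, this is a sketch of how I'm applying the hint. It sets both the legacy non-standard `playoutDelayHint` (seconds) and the newer standardized `jitterBufferTarget` (milliseconds) on each video receiver; the helper name is mine:

```javascript
// Apply a target playout delay to every video receiver on a connection.
// playoutDelayHint is a legacy Chrome-only property measured in seconds;
// jitterBufferTarget is the standardized equivalent measured in milliseconds.
function setPlayoutDelay(pc, seconds) {
  for (const receiver of pc.getReceivers()) {
    if (receiver.track && receiver.track.kind === 'video') {
      receiver.playoutDelayHint = seconds;          // legacy, seconds
      receiver.jitterBufferTarget = seconds * 1000; // standard, milliseconds
    }
  }
}
```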
Someone please help me. There has to be some way to "soft reset" the video when this happens so the browser recovers. I tried an ICE restart (restartIce()), but nothing really changes. And even if it did work, that feels like a drastic solution.
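Here's roughly what my ICE-restart attempt looks like (the `signalOffer` callback is a placeholder for our signaling path): the transport gets rebuilt, but the jitter buffer, and with it the extra 100-200 ms, apparently survives it.

```javascript
// Trigger renegotiation with fresh ICE credentials. restartIce() marks the
// next offer to carry new ICE ufrag/pwd; we then create and apply the offer
// and hand it to the application's signaling channel.
async function attemptIceRestart(pc, signalOffer) {
  pc.restartIce();
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  await signalOffer(offer); // placeholder for our actual signaling hook
}
```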
I could hack something together to make the video sender always send still frames during a pause, but that would drastically increase the complexity of the application I'm working on. There HAS to be a better way.