Hi, I am new to the WebRTC world as a web developer. I have recently been working on a cloud gaming project that seems to have a strange latency issue.
The scenario is: streaming looks normal at the beginning, but once the first sender report for the audio track is received, latency starts to increase and I can see the experimental minPlayoutDelay show up in webrtc-internals.
To investigate, I built Chromium and played with the synchronizer and the RTCP receiver. First, I found that the extra latency comes from the UpdateDelay call in rtp_streams_synchronizer2.cc. Second, I found that if I skip the RTCP packet handler in channel_receive.cc (specifically the sender report handler), the synchronizer never invokes UpdateDelay, and my issue seems resolved without any visible a/v sync problems.
After reading some documentation about a/v sync in WebRTC, I do think the sender report and the synchronizer are reasonable parts of this pipeline.
So my question is: why does the extra latency show up in this case, given that I can simply turn off the RTCP handler in audio processing without any weird behaviour? Could it be that the capture clocks for video and audio are not synchronized on the sender side?