Contact emails
Spec
Summary
getSynchronizationSources() allows inspecting when the last RTP packet (audio/video frame) was received and delivered for playout.
We've already shipped the closely related getContributingSources(), which covers the more specialized case where a single RTP stream represents multiple sources (e.g., mixed audio from a conference server).
Both APIs expose real-time information about active streams, for example driving an audio level meter that shows when remote participants are talking.
This intent also covers expanding both APIs (getSynchronizationSources() and getContributingSources()) to work for both audio and video receivers; the previous implementation was audio-only. The video case is still useful because of the playout timestamps.
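The audio meter use case above could be sketched roughly as follows. This is a minimal illustration, not shipped code: it assumes a page with an already connected RTCPeerConnection (here called `pc`), and `loudestSource` is a hypothetical helper name. Per the spec, `audioLevel` is a linear value in [0, 1].

```javascript
// Hypothetical helper: pick the loudest entry from a list of
// RTCRtpSynchronizationSource-like objects ({source, timestamp, audioLevel}).
function loudestSource(sources) {
  return sources.reduce(
      (best, s) =>
          best === null || (s.audioLevel ?? 0) > (best.audioLevel ?? 0) ? s : best,
      null);
}

// Poll the audio receivers of an assumed RTCPeerConnection `pc` and report
// the loudest synchronization source to a callback (e.g., to update a meter).
function pollAudioLevels(pc, onLevel) {
  for (const receiver of pc.getReceivers()) {
    if (receiver.track.kind !== 'audio') continue;
    const loudest = loudestSource(receiver.getSynchronizationSources());
    if (loudest) onLevel(loudest.source, loudest.audioLevel);
  }
}
```

A page would typically call `pollAudioLevels` on a timer (e.g., via setInterval), since getSynchronizationSources() returns a snapshot rather than firing events.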
Is this feature supported on all six Blink platforms (Windows, Mac, Linux, Chrome OS, Android and Android WebView)?
Yes.
Risks
None
Is this feature fully tested by web-platform-tests?
No, but I'm adding web-platform-tests as I implement the feature.
Entry on the feature dashboard