Contact emails

Specification
https://w3c.github.io/mediacapture-extensions/#the-mediastreamtrackaudiostats-interface
Summary
The `track.stats` API allows an application to measure quality related to the capture of a MediaStreamTrack (getUserMedia). This API has already shipped for video tracks (Chrome Status, Intent to Ship). This intent covers the audio version of the same API, which has similar metrics plus input latency metrics.

The spec is actively under development. It currently contains frame counters (like the video counterpart), which in the audio case allow calculating the ratio of dropped audio, a measure of capture glitches. There is also a PR in review that will add current input latency, and a follow-up issue to add min/max/average latency.
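As a sketch of how an app might consume these counters, consider the following. Since the spec is still in flux, the attribute names (`totalFrames`, `deliveredFrames`) are assumptions modeled on the video counterpart's shape, not confirmed spec names:

```javascript
// Hypothetical sketch: deriving a dropped-audio ratio (a measure of
// capture glitches) from the frame counters described above.
// `totalFrames` and `deliveredFrames` mirror the video stats' shape
// and are assumptions until the audio spec settles.
function droppedAudioRatio(stats) {
  if (!stats || !stats.totalFrames) return 0;
  return (stats.totalFrames - stats.deliveredFrames) / stats.totalFrames;
}

// In a capture app (browser only):
//   const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
//   const [track] = stream.getAudioTracks();
//   console.log(droppedAudioRatio(track.stats));
```

An app could sample this periodically and attach the ratio to quality dumps or A/B experiment logs.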
Blink component
Blink>GetUserMedia
Motivation
The motivation is similar to that of the video stats, but this time it is audio related. Quality measurements are important for understanding user reports that apps receive (e.g. in Google Meet, users may file bugs containing quality dumps) and for A/B experiments measuring how features impact quality (e.g. adding heavy video processing in an app may impact audio quality). Latency may be useful for audio processing. Together with WebRTC metrics, capture metrics help provide context as to which parts of the pipeline are contributing to quality in what way.
TAG review status
N/A; this is a small addition to an existing spec, and the `track.stats` API shape has already shipped for video.
Risks

Interoperability and Compatibility
The risk is relatively small since this is a stats API. The MediaStreamTrack functions whether or not you can measure quality-related properties of the track.
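Because the API is purely additive, applications can feature-detect it and fall back gracefully. A minimal sketch, assuming the `stats` attribute name that already shipped for video also applies to audio tracks:

```javascript
// Feature detection: a track keeps working whether or not stats exist.
// Accepts any object exposing `kind`; in a browser this would be a
// MediaStreamTrack obtained from getUserMedia.
function audioStatsSupported(track) {
  return !!track && track.kind === 'audio' && 'stats' in track;
}
```

An app that finds `stats` missing simply skips the quality telemetry; capture itself is unaffected.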
Gecko: No signal
WebKit: No signal
Web developers: Positive
Other signals:
WebView application risks
None

Debuggability
None
Is this feature fully tested by web-platform-tests?
Test coverage will be added as part of the implementation, including which metrics are supported by the browser. In addition to WPTs, the correctness of quality metrics may require browser tests, e.g. with fake devices.
Link to entry on the Chrome Platform Status
--
You received this message because you are subscribed to the Google Groups "blink-dev" group.
To unsubscribe from this group and stop receiving emails from it, send an email to blink-dev+...@chromium.org.
To view this discussion on the web visit https://groups.google.com/a/chromium.org/d/msgid/blink-dev/bb6c1af3-9eb3-4c6f-a136-dee709b7f906n%40chromium.org.
On Mon, Nov 13, 2023 at 4:04 PM Henrik Boström <hb...@chromium.org> wrote:
Gecko: No signal
WebKit: No signal
Have we reached out?
Web developers: Positive
Any links?
Web developers: Positive
Any links?

I don't have any links, but I've been told that web developers have been "asking for input latency metrics for years". On behalf of Google, Google Meet web developers are asking for both dropped-frames-ratio metrics and input latency metrics, so I am developing these PRs together with the WebRTC audio experts who requested them.
Zoom is interested in this as well - it is critical for us to understand where in our audio pipeline frames have been dropped.
Would this be supported for audio tracks in all MediaStream contexts, or just the stream obtained via `getUserMedia`? In particular, I would love if we could also get these statistics for audio tracks on a stream obtained via WebAudio's `createMediaStreamDestination` (https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamDestination).