The spec for the input latency constraint,
https://www.w3.org/TR/mediacapture-streams/#def-constraint-latency,
does seem to describe the full capture path, which is what we are after:
```
The latency or latency range, in seconds. The latency is the time between start of processing (for instance, when sound occurs in the real world) to the data being available to the next step in the process. Low latency is critical for some applications; high latency may be acceptable for other applications because it helps with power constraints. The number is expected to be the target latency of the configuration; the actual latency may show some variation from that.
```
However, testing this with a fiddle
(https://jsfiddle.net/2oafnkbh/1/), the reported latency numbers are close to what looks like an internal buffer size.
On macOS, in Chromium, we see
latency * sampleRate ~= 128
regardless of the sampleRate,
whereas we'd expect something like 128 + X, where X is the additional latency imposed by internal buffering, the system, the hardware, etc.
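The observation above can be reproduced with a short sketch along the lines of the fiddle. This is an assumption about what the fiddle does, not its exact code: it reads `latency` and `sampleRate` from `MediaTrackSettings` and converts the latency to audio frames, where we'd expect to see more than the 128-frame render quantum. The `latencyInFrames` and `reportInputLatency` names are hypothetical helpers, not part of any API.

```javascript
// Pure helper: convert a latency in seconds to audio frames
// at the given sample rate.
function latencyInFrames(latencySeconds, sampleRate) {
  return Math.round(latencySeconds * sampleRate);
}

// Browser-only part (hypothetical usage; requires microphone permission).
// On macOS/Chromium this logs a value near 128, per the observation above.
async function reportInputLatency() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const [track] = stream.getAudioTracks();
  const { latency, sampleRate } = track.getSettings();
  console.log(`latency * sampleRate = ${latencyInFrames(latency, sampleRate)}`);
}
```

Note that if the browser only reports its internal buffer size here, the frame count comes out the same at any sample rate, which is exactly the symptom observed.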
(Tangentially, this was mentioned at a W3C workshop on media production; see
https://github.com/w3c/media-production-workshop/issues/31.)