Hi WebRTC,
As many of you may know - the good people working on Chromium have started an early implementation of the MediaStreamRecording API:
One of the issues we are looking at is time-synchronising video, audio and data from multiple sources, then recording it all to disk for synchronised playback. At the moment we are writing a native WebRTC app to do this, but the prospect of having the MediaStreamRecorder in a version of Chrome or Canary soon (?) could really simplify things.
So I have a few questions:
1) Is there any way to sync the data sent via a DataChannel on a peer connection with the video stream attached to that same peer connection (I think not?), either by frame number or by timestamp? Is there any plan to allow this at some point in the future of WebRTC?
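For what it's worth, the workaround we've been sketching (purely application-level, not anything WebRTC provides) is to tag each DataChannel message with a capture timestamp on the sender, and on the receiver match each message to the nearest known video frame timestamp. All names below are illustrative, and this obviously can't account for clock skew between peers or differing network paths for media vs. data:

```javascript
// Sender side (sketch): tag each message with a local capture time, e.g.
//   channel.send(JSON.stringify({ t: performance.now(), payload }));
//
// Receiver side: given a list of video frame timestamps (however obtained)
// and the timestamped messages, pair each message with the closest frame.
function matchToNearestFrame(frameTimestamps, messages) {
  return messages.map((msg) => {
    let bestFrame = null;
    let bestDelta = Infinity;
    for (const ft of frameTimestamps) {
      const delta = Math.abs(ft - msg.t);
      if (delta < bestDelta) {
        bestDelta = delta;
        bestFrame = ft;
      }
    }
    // skewMs gives a rough idea of how far off the pairing is.
    return { message: msg, frameTimestamp: bestFrame, skewMs: bestDelta };
  });
}
```

This only works because both timestamps are taken against the same receiver-side notion of time, which is exactly why first-class frame timestamps from the browser would be so much nicer.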
2) Is anyone on the WebRTC side discussing the possibility of allowing timestamping of video frames all the way from source through to recording, especially given that the MediaStreamRecording standard is still being fleshed out?
Thanks,
Leighton.