Stream a local YUV file instead of the camera in native WebRTC (C++)


Rıza Arda Kırmızıoğlu

Apr 4, 2018, 8:55:17 AM
to discuss-webrtc
Dear all,
I want to stream a local YUV or Y4M file from my desktop, as if it were a virtual webcam, for my tests. Could you please give me some code/class pointers for simulating this setup?
I think I need to use something like FakeVideoCapturer*, but I want to point it at a specific video on my desktop with a given resolution and framerate. Again, I am swimming in the code, so could you please give me some hints with code examples?
Thank you very much for your time.

Niels Moller

Apr 5, 2018, 10:08:50 AM
to discuss...@googlegroups.com
You may want to have a look at the FakeVideoTrackSource and FakePeriodicVideoSource classes I added recently. They are part of the process of migrating WebRTC test code away from the deprecated cricket::VideoCapturer.

Rıza Arda Kırmızıoğlu

Apr 6, 2018, 4:59:20 AM
to discuss-webrtc
Hello Moller,
First of all, thank you for your answer. How should I call these classes in order to create a fake video track?

local_video_track_(VideoTrack::Create("LocalVideoTrack", FakeVideoTrackSource::Create(false)))

I do not want screencast. I have a video with specific content in it, and I want to stream my local.yuv file by pointing at it. Should I give the path of the "local.yuv" file in place of "LocalVideoTrack" in this call? It is this kind of detail I need.
Thank you for your time.

Chen Cong

Apr 8, 2018, 5:09:23 AM
to discuss-webrtc
https://github.com/webrtc/samples/issues/616

It may be helpful. AFAIK, a y4m file can be played as a stream on Android. If you want to implement it in native code, I guess you can reference the Java code.

On Wednesday, April 4, 2018 at 8:55:17 PM UTC+8, Rıza Arda Kırmızıoğlu wrote:

Niels Moller

Apr 9, 2018, 3:36:19 AM
to discuss...@googlegroups.com
On Fri, Apr 6, 2018 at 10:59 AM, Rıza Arda Kırmızıoğlu
<rkirmi...@ku.edu.tr> wrote:
> Hello Moller,
> First of all thank you for your answer. How should I call these classes in
> order to create a fake video_track?

These classes don't implement reading frames from a file. You'd need to do that yourself, interfacing to the video pipeline in the same way. I'd suggest starting with the FakePeriodicVideoSource and adding code to read your frames from the right file.

See this line, https://cs.chromium.org/chromium/src/third_party/webrtc/pc/test/fakeperiodicvideosource.h?q=FakePeriodic&sq=package:chromium&l=54

sink_->OnFrame(frame_source_.GetFrame());

Here, frame_source_ is a test class that just produces blank frames
with appropriate size and time stamps. You'd need to substitute code
to get your frame data.

Also, the TaskQueue used there is an internal WebRTC facility; for your code it might be better to spawn a normal thread to do this work.
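
For illustration, here is a rough sketch of such a file-reading source, assuming the file contains raw I420 data. The class name YuvFileVideoSource and all of its members are made up for this example, and the include paths may differ between checkouts:

#include <atomic>
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <string>
#include <thread>

#include "api/video/i420_buffer.h"
#include "api/video/video_frame.h"
#include "media/base/videosourceinterface.h"
#include "rtc_base/timeutils.h"

// Reads raw I420 frames from a file on its own std::thread and pushes them to
// the registered sink, mimicking the FakePeriodicVideoSource pattern above.
class YuvFileVideoSource : public rtc::VideoSourceInterface<webrtc::VideoFrame> {
 public:
  YuvFileVideoSource(const std::string& path, int width, int height, int fps)
      : width_(width), height_(height), fps_(fps),
        file_(fopen(path.c_str(), "rb")) {}

  ~YuvFileVideoSource() override {
    running_ = false;
    if (thread_.joinable()) thread_.join();
    if (file_) fclose(file_);
  }

  void AddOrUpdateSink(rtc::VideoSinkInterface<webrtc::VideoFrame>* sink,
                       const rtc::VideoSinkWants& wants) override {
    sink_ = sink;
    if (!running_.exchange(true))
      thread_ = std::thread([this] { CaptureLoop(); });
  }

  void RemoveSink(rtc::VideoSinkInterface<webrtc::VideoFrame>* sink) override {
    sink_ = nullptr;
  }

 private:
  void CaptureLoop() {
    while (running_ && file_) {
      rtc::scoped_refptr<webrtc::I420Buffer> buffer =
          webrtc::I420Buffer::Create(width_, height_);
      // A raw I420 frame is a full-size Y plane followed by half-size U and V.
      if (!ReadPlane(buffer->MutableDataY(), buffer->StrideY(),
                     width_, height_) ||
          !ReadPlane(buffer->MutableDataU(), buffer->StrideU(),
                     width_ / 2, height_ / 2) ||
          !ReadPlane(buffer->MutableDataV(), buffer->StrideV(),
                     width_ / 2, height_ / 2)) {
        rewind(file_);  // Hit end of file: loop the clip from the start.
        continue;
      }
      webrtc::VideoFrame frame(buffer, webrtc::kVideoRotation_0,
                               rtc::TimeMicros());
      if (sink_) sink_->OnFrame(frame);
      std::this_thread::sleep_for(std::chrono::milliseconds(1000 / fps_));
    }
  }

  // Copies one plane row by row, since the buffer stride may exceed the width.
  bool ReadPlane(uint8_t* dst, int stride, int width, int height) {
    for (int row = 0; row < height; ++row) {
      if (fread(dst + row * stride, 1, width, file_) !=
          static_cast<size_t>(width))
        return false;
    }
    return true;
  }

  const int width_;
  const int height_;
  const int fps_;
  FILE* file_;
  std::atomic<bool> running_{false};
  rtc::VideoSinkInterface<webrtc::VideoFrame>* sink_ = nullptr;
  std::thread thread_;
};

A real implementation would also want to honor the VideoSinkWants it is given, and to parse the header if you feed it a .y4m file instead of raw YUV.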

You can use the VideoTrackSource class to wrap any VideoSourceInterface, which is how FakePeriodicVideoSource is used; there's one example in
https://cs.chromium.org/chromium/src/third_party/webrtc/ortc/ortcfactory_integrationtest.cc?type=cs&q=FakePeriodicVideoSource&sq=package:chromium&l=227
(this uses OrtcFactory, but PeerConnectionFactory is the same in this respect). The VideoTrackSource is then passed to PeerConnectionFactory::CreateVideoTrack.
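
To make that concrete, here is a sketch of the glue, again with made-up names (YuvFileTrackSource reuses the YuvFileVideoSource sketch from the earlier message). The VideoTrackSource constructor has changed between revisions, so check pc/videotracksource.h in your checkout:

#include <memory>

#include "api/peerconnectioninterface.h"
#include "pc/videotracksource.h"
#include "rtc_base/refcountedobject.h"

// Wraps the file-reading source so it can be handed to CreateVideoTrack.
class YuvFileTrackSource : public webrtc::VideoTrackSource {
 public:
  explicit YuvFileTrackSource(std::unique_ptr<YuvFileVideoSource> source)
      : VideoTrackSource(/*remote=*/false), source_(std::move(source)) {}

 protected:
  // VideoTrackSource forwards AddOrUpdateSink()/RemoveSink() to this source.
  rtc::VideoSourceInterface<webrtc::VideoFrame>* source() override {
    return source_.get();
  }

 private:
  std::unique_ptr<YuvFileVideoSource> source_;
};

// Usage: wrap the file source and create the track from the factory you
// already have (the file name, resolution and frame rate are just examples).
rtc::scoped_refptr<webrtc::VideoTrackInterface> CreateYuvFileTrack(
    webrtc::PeerConnectionFactoryInterface* factory) {
  rtc::scoped_refptr<YuvFileTrackSource> source(
      new rtc::RefCountedObject<YuvFileTrackSource>(
          std::make_unique<YuvFileVideoSource>("local.yuv", 640, 480, 30)));
  return factory->CreateVideoTrack("LocalVideoTrack", source.get());
}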

That might be one more layer than you need; you can also choose to let
your code implement VideoTrackSourceInterface directly, rather than
the more generic VideoSourceInterface. Then you don't need the
VideoTrackSource class.

Regards,
/Niels