WebRTC VideoStream


Jan-Phillip Lutze

Aug 17, 2018, 5:27:57 PM
to discuss...@googlegroups.com

Hey all,


I have a question about video streaming. First, my idea: I'd like to write a frame generator that reads YUV files from my disk and turns them into frames (done). The generator produces a frame every 40 ms, and those frames should be streamed and sent to someone else. Now my question: what is the best way to do that? I'm using C++ with Qt, in case that matters. I found some examples on the web (mostly from 2016 or older, and they used deprecated APIs). My main program has a class that inherits from cricket::VideoCapturer, and that is the point where I'm struggling. If you have any examples or even just hints for me, I would be really thankful.
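Roughly what I have in mind, as a sketch only (written against the older, already deprecated cricket::VideoCapturer interface, so header paths and exact signatures may differ in your WebRTC revision; the class name and the frame loop below are just my placeholders):

#include <atomic>
#include <chrono>
#include <cstdint>
#include <thread>
#include <vector>

#include "api/video/i420_buffer.h"
#include "api/video/video_frame.h"
#include "media/base/videocapturer.h"
#include "rtc_base/timeutils.h"

// Hypothetical capturer that pushes one I420 frame every 40 ms (25 fps).
class YuvFileCapturer : public cricket::VideoCapturer {
 public:
  YuvFileCapturer(int width, int height) : width_(width), height_(height) {}
  ~YuvFileCapturer() override { Stop(); }

  cricket::CaptureState Start(const cricket::VideoFormat& /*format*/) override {
    running_ = true;
    // A plain std::thread stands in for whatever Qt timer drives the generator.
    thread_ = std::thread([this] {
      while (running_) {
        PushFrame();
        std::this_thread::sleep_for(std::chrono::milliseconds(40));
      }
    });
    return cricket::CS_RUNNING;
  }

  void Stop() override {
    running_ = false;
    if (thread_.joinable()) thread_.join();
  }

  bool IsRunning() override { return running_; }
  bool IsScreencast() const override { return false; }

 protected:
  bool GetPreferredFourccs(std::vector<uint32_t>* fourccs) override {
    fourccs->push_back(cricket::FOURCC_I420);
    return true;
  }

 private:
  void PushFrame() {
    rtc::scoped_refptr<webrtc::I420Buffer> buffer =
        webrtc::I420Buffer::Create(width_, height_);
    // Copy the Y, U and V planes of the next .yuv file into
    // buffer->MutableDataY()/MutableDataU()/MutableDataV(), honouring the strides.
    webrtc::I420Buffer::SetToBlack(buffer.get());  // placeholder content
    webrtc::VideoFrame frame(buffer, webrtc::kVideoRotation_0,
                             rtc::TimeMicros());
    // The base class forwards the frame to whatever sink/track is attached.
    OnFrame(frame, width_, height_);
  }

  const int width_;
  const int height_;
  std::atomic<bool> running_{false};
  std::thread thread_;
};

If I read the API correctly, the capturer would then be wrapped via PeerConnectionFactoryInterface::CreateVideoSource() and CreateVideoTrack() and the track added to the peer connection, but that is exactly the part where the 2016-era examples diverge from the current headers.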


Best regards,

Jan-Phillip Lutze

Jeremy Lainé

Aug 19, 2018, 5:39:59 AM
to discuss-webrtc
Hi Jan-Phillip,

I realize you are asking for a C++ solution, but if you're also comfortable with Python you could use aiortc, a Python implementation of WebRTC. The key selling point here is that it's trivial to produce or manipulate frames using OpenCV's extensive toolbox and send them over a video stream.

The "server" example for instance does frame-by-frame manipulations and sends them over a video stream:


If you want to use apprtc for signaling you can also look at the "apprtc" example, which sends an image with an animated rotation:


The internal format for VideoFrame is YUV, but some helpers are also provided to create frames from BGR or Grayscale data:


Cheers,
Jeremy

JPL

Aug 20, 2018, 9:38:14 AM
to discuss-webrtc
Hey Jeremy,

First of all, thanks for your answer, but I'd prefer to stay with C++. If you have a hint for that as well, it would be awesome.

Cheers,
Jan-Phillip