play the webrtc stream with gstreamer


Daniel Roviriego

Jul 21, 2015, 1:07:20 PM
to meetech...@googlegroups.com
Hi All.

Is it possible to grab one or more streams from an MCU session, for example, and play them with a GStreamer pipeline? @Computician and @lminiero helped me on GitHub and showed me the rtp_listen feature for the MCU, but I could not figure out how to use it. (Do I need to curl with the JSON to enable the forwarding?) I'm a newbie around WebRTC, so I'd appreciate more detailed info on that and, if possible, the pipelines as well. (My goal is forwarding a clean feed from the MCU to a video mixer, for TV production!)

Thanks again for such a great and lively project.

Benjamin Trent

Jul 21, 2015, 1:57:28 PM
to meetech...@googlegroups.com
I have a separate API wrapper for the calls that I make to either the WebSocket interface or the HTTP REST interface to request the streams.

The HTTP REST request looks like this:
{
  "body":
   {
      "request":"rtp_forward",
       "publisher_id":100,
       "room":5,
       "host":"192.168.0.17",
       "audio_port":5000,
       "video_port":5002
   },
   "janus":"message",
   "transaction":"o4vup0qoomd"
}
This requests that publisher 100 in room 5 forward video RTP packets to port 5002 on host 192.168.0.17, and audio RTP packets to port 5000.
This is what you will receive in response:
{
   "janus": "success",
   "session_id": 3855846475,
   "sender": 4003334502,
   "transaction": "o4vup0qoomd",
   "plugindata": {
      "plugin": "janus.plugin.videoroom",
      "data": {
         "publisher_id": 100,
         "rtp_stream": {
            "audio_stream_id": 123113458,
            "audio": 5000,
            "video_stream_id": 584326990,
            "video": 5002,
            "host": "192.168.0.17"
         },
         "room": 5,
         "videoroom": "rtp_forward"
      }
   }
}

If you want to use the websocket interface you will have to add the session_id and handle_id fields to the request and fill them out accordingly.
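To make the difference concrete, here is a sketch in Python of the same rtp_forward message adapted for the WebSocket interface, where session_id and handle_id go into the top-level message instead of the URL path. All IDs and the transaction string below are placeholders taken from the examples above, not values your server will produce.

```python
import json

def build_rtp_forward(session_id, handle_id, room, publisher_id,
                      host, audio_port, video_port, transaction):
    """Build an rtp_forward message for the Janus WebSocket API.

    Over WebSockets the session and handle ids are fields of the
    message itself, rather than path segments as in the REST API.
    """
    return {
        "janus": "message",
        "session_id": session_id,   # from the "create" response
        "handle_id": handle_id,     # from the "attach" response
        "transaction": transaction,
        "body": {
            "request": "rtp_forward",
            "room": room,
            "publisher_id": publisher_id,
            "host": host,
            "audio_port": audio_port,
            "video_port": video_port,
        },
    }

# Placeholder ids, mirroring the REST example above.
msg = build_rtp_forward(3855846475, 4003334502, 5, 100,
                        "192.168.0.17", 5000, 5002, "o4vup0qoomd")
print(json.dumps(msg, indent=2))
```

The body is byte-for-byte the same request as in the REST example; only the addressing changes.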

What programming language are you using? I already have a crude C# API wrapper (it is very crude... and ugly, but it works well enough for me). You could also simply use the JavaScript API wrapper.


As for GStreamer, you can do multiple things.

This is what I do to receive the stream:

udpsrc port=5000 caps="application/x-rtp, media=(string)audio, payload=(int)111, clock-rate=(int)48000, encoding-name=(string)X-GST-OPUS-DRAFT-SPITTKA-00" !
  rtpjitterbuffer do-lost=true latency=300 mode=0 ! rtpopusdepay ! queue ! opusparse ! opusdec !
  audioresample ! audioconvert !
  appsink sync=false name=audio_sink_1 caps="audio/x-raw, format=S16LE, layout=interleaved, rate=8000, channels=1"

udpsrc port=5002 caps="application/x-rtp, media=(string)video, payload=(int)100, clock-rate=(int)90000, encoding-name=(string)VP8-DRAFT-IETF-01" !
  rtpjitterbuffer do-lost=true latency=300 mode=0 ! rtpvp8depay ! queue ! vp8dec !
  videoconvert ! videoscale ! videorate !
  video/x-raw, format=(string)RGB, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)16/1 !
  appsink sync=false name=video_sink_1 caps="video/x-raw, format=(string)RGB, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive"

Make the caps anything you would like, and you obviously don't have to push to an appsink; you can do whatever you like.

I actually take the feeds and put them through a compositor and an audiomixer, encode them again, multiplex them, and write to a file.

Let me know if you have any specific questions about anything I have said (typed :))

Daniel Roviriego

Jul 21, 2015, 3:26:56 PM
to meetecho-janus
Hi Benjamin.

Many thanks for the quick and complete reply! You rock!

I'm still stuck with the REST command: I could not find out what the publisher_id is. I'm going the easy way with curl. What is wrong with this command?
 
curl -H "Content-Type: application/json" -X POST -d '{"body":{"request":"rtp_forward","publisher_id":100,"room":1234,"host":"10.1.110.22","audio_port":5000,"video_port":5002},"janus":"message","transaction":"o4vup0qoomd"}' http://localhost:8088/janus/

The room number is as defined in the conf file; the publisher_id is unknown to me...

The response is:

{
   "janus": "error",
   "transaction": "o4vup0qoomd",
   "error": {
      "code": 457,
      "reason": "Unhandled request 'message' at this path"
   }
}

What should I do?
Many thanks again!

Benjamin Trent

Jul 21, 2015, 4:10:15 PM
to meetecho-janus
If you have created a Janus API session and attached the plugin, you should be making a REST request against <janus-server>/janus/<session_id>/<plugin_handle_id>
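For reference, the full sequence against the REST API can be sketched in Python as below. The host/port and transaction strings are placeholders (this assumes the default REST port 8088 from the curl example above), and the live calls are left commented out since they need a running Janus instance; the point is the endpoint layout: create at /janus, attach at /janus/<session>, plugin messages at /janus/<session>/<handle>.

```python
import json
import urllib.request

JANUS = "http://localhost:8088/janus"  # placeholder base URL

def post(url, payload):
    """POST a JSON payload to Janus and return the decoded reply."""
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def message_url(base, session_id, handle_id):
    """Plugin messages go to <base>/<session_id>/<handle_id>."""
    return f"{base}/{session_id}/{handle_id}"

# The three-step sequence (requires a running Janus instance):
# 1) create a session:
# session = post(JANUS, {"janus": "create", "transaction": "t1"})["data"]["id"]
# 2) attach the videoroom plugin to that session:
# handle = post(f"{JANUS}/{session}",
#               {"janus": "attach", "plugin": "janus.plugin.videoroom",
#                "transaction": "t2"})["data"]["id"]
# 3) send the plugin message to the session/handle path:
# reply = post(message_url(JANUS, session, handle),
#              {"janus": "message", "transaction": "t3",
#               "body": {"request": "rtp_forward", "room": 1234,
#                        "publisher_id": 100, "host": "10.1.110.22",
#                        "audio_port": 5000, "video_port": 5002}})
```

Posting the message straight to /janus/, as in the curl command above, is exactly what produces the 457 "Unhandled request 'message' at this path" error.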

You need to have your publishers specify an id when they enter the room (this can be done on join by setting the id property on your request). That is your best option, as you can then accurately determine who is who when you are grabbing feeds.
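A minimal sketch of such a join body (room, id, and display values are placeholders; the request/ptype/id fields follow the videoroom plugin's join request):

```python
# Videoroom "join" body with an explicit publisher id.
# Placeholder values; "id" is what you will later pass as publisher_id
# in the rtp_forward request.
join_body = {
    "request": "join",
    "ptype": "publisher",   # join the room as a publisher
    "room": 1234,
    "id": 100,
    "display": "camera-1",  # optional display name
}
```

This body is sent wrapped in the usual {"janus": "message", "body": ...} envelope like any other plugin request.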

The other option for getting the publisher id is to query the room for the publishers that are currently there... I'm not sure off the top of my head how to do that, but it is possible.
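For what it's worth, the videoroom plugin has a synchronous listparticipants request that should do this (a sketch with placeholder values; verify the request name against the plugin documentation for your Janus version):

```python
# Body of a synchronous videoroom request listing current participants,
# including their ids. Room number is a placeholder.
list_body = {
    "request": "listparticipants",
    "room": 1234,
}
# Sent like any other plugin message:
# {"janus": "message", "transaction": "...", "body": list_body}
```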

Lorenzo Miniero

Jul 22, 2015, 1:58:37 AM
to meetecho-janus, danife...@gmail.com
Daniel,

I think you have a bit of studying to do first :-)

http://janus.conf.meetecho.com/docs/rest.html

L.

Daniel Roviriego

Jul 22, 2015, 3:59:45 PM
to meetecho-janus
Thanks Benjamin and Lorenzo.

I am already studying the REST API... extremely necessary.
I had success with rtp_forward (it's a bit difficult finding out the publisher_ids from the console, but it's doable).

Now I'm struggling with the GStreamer pipeline for a simple autovideosink (only video for the moment). The pipeline starts PLAYING, but I get no window with the video:

gst-launch-1.0 udpsrc port=5002 caps="application/x-rtp, media=video, payload=100, clock-rate=90000, encoding-name=VP8-DRAFT-IETF-01" ! \
  rtpjitterbuffer do-lost=true latency=300 mode=0 ! rtpvp8depay ! queue ! vp8dec ! \
  videoconvert ! videoscale ! videorate ! \
  video/x-raw, format=RGB, width=640, height=480, pixel-aspect-ratio=1/1, interlace-mode=progressive, framerate=16/1 ! \
  autovideosink

One other thing: I'm trying to guess the REST command for listing the publisher_ids from this file https://github.com/meetecho/janus-gateway/blob/master/plugins/janus_videoroom.c (from line 68), but it's hard to guess... you mention \c create, \c destroy... how does this work?

Anyway... many thanks again! Janus totally rocks; I've had 1000 crazy ideas out of it.

Lorenzo Miniero

Jul 22, 2015, 4:04:31 PM
to meetecho-janus, danife...@gmail.com
If you use a handle to join as a fake publisher (that is, one that joins the room but never publishes any media), you'll receive notifications about publishers coming and going.
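A sketch of that fake-publisher approach (placeholder values): join with ptype publisher but never send any publish/configure, then pull ids from the "publishers" array that the videoroom plugin includes in its joined response and subsequent events.

```python
# Join as a "fake" publisher: enter the room but never publish media.
# Janus will then push videoroom events whose "publishers" array lists
# the active publishers and their ids. Values below are placeholders.
fake_join = {
    "request": "join",
    "ptype": "publisher",
    "room": 1234,
    "display": "rtp-forward-bot",
}

def publisher_ids(event):
    """Extract publisher ids from a videoroom event payload."""
    return [p["id"] for p in event.get("publishers", [])]

# Example of the shape of a "joined" event payload:
sample_event = {"videoroom": "joined", "room": 1234,
                "publishers": [{"id": 100, "display": "alice"}]}
```

Those ids are exactly what rtp_forward expects as publisher_id.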

L.

Benjamin Trent

Jul 22, 2015, 5:25:51 PM
to meetecho-janus
Make sure your GStreamer version is the newest possible; with anything before 1.4 I had issues decoding VP8. Also, if you are just using an autovideosink, you may want to set its sync and async attributes to false. Again, mess with the raw video caps and make them anything you want (I am just using 1/16 for lower CPU utilization in other parts of my application, and 1/15 is a never-ending decimal that ends up getting rounded weirdly inside of C++).

You can always connect a fakesink to the udpsrc element to make sure you are actually receiving packets (launch in verbose mode and make sure the fakesink's silent attribute is false).

Lorenzo Miniero

Jul 23, 2015, 4:23:43 AM
to meetecho-janus, ben.w...@gmail.com
Just out of curiosity, how are you handling the dynamic resolution WebRTC clients make use of? IIRC, GStreamer couldn't cope well with VP8 streams of varying resolution: it basically chose the first resolution it received, and when the resolution changed the video was either cropped or padded. That was with GStreamer versions much earlier than 1.4, though.

L.

Benjamin Trent

Jul 23, 2015, 8:44:10 AM
to meetecho-janus, lmin...@gmail.com
With GStreamer 1.4+ (on Linux at least), if you are simply pushing to a display sink (xvimagesink), the caps auto-renegotiate without issues and there are no problems with frame-size changes. When you push the frames through a specific set of static caps, the caps renegotiation between vp8dec and the downstream elements is handled by GStreamer, and the stream is transformed to match your static caps.