RTP to WebRTC

simonv...@gmail.com

Jul 8, 2015, 5:58:14 AM
to kur...@googlegroups.com
Good morning,

I'm trying to implement a one-to-one communication between an RTP client and a WebRTC client with Kurento. I started from the one2many tutorial (in Java) to transfer the RTP stream, and by following the many questions posted by other users I succeeded in streaming the video with H.264.

I used the following procedure (it may help to answer my question, and it could help other people):

- the RTP client sends the following SDP offer to the server:
v=0
o=- 0 0 IN IP4 127.0.0.1
m=video 5000 RTP/AVP 96
c=IN IP4 127.0.0.1
a=sendonly
a=rtpmap:96 H264/90000

- the server creates an RTP endpoint and generates the SDP answer:
String sdpOffer = jsonMessage.getAsJsonPrimitive("sdpOffer").getAsString();

pipeline = kurento.createMediaPipeline();
rtpEndpoint = new RtpEndpoint.Builder(pipeline).build();

String sdpAnswer = rtpEndpoint.processOffer(sdpOffer);

- the RTP client parses the answer, extracts the host and port, and starts transmitting (I'm using Python for this client):
sdpAnswer = result['sdpAnswer']
m = re.search(r'm=video (\d+)', sdpAnswer)
videoPort = m.group(1)
m = re.search(r'IP4 ([\d.]+)', sdpAnswer)
host = m.group(1)

- the RTP client starts streaming video using the following GStreamer pipeline:
'gst-launch-1.0 v4l2src ! queue ! videorate ! videoconvert ! video/x-raw,width=1280,height=720,framerate=30/1 ! x264enc tune=zerolatency ! rtph264pay ! udpsink host='+host+' port='+videoPort

- Finally, when the WebRTC client connects to the server (its only role is viewer for now), I connect the two endpoints:
String sdpOffer = jsonMessage.getAsJsonPrimitive("sdpOffer").getAsString();

WebRtcEndpoint nextWebRtc = new WebRtcEndpoint.Builder(pipeline).build();
rtpEndpoint.connect(nextWebRtc);
String sdpAnswer = nextWebRtc.processOffer(sdpOffer);

The rest of the code is pretty much the same as in the tutorial.
Up to this point everything works fine. My problem is that I also want to transmit the audio source, but I'm stuck and I don't understand where the problem is.

I added the following lines to the SDP offer that the RTP client sends:
m=audio 58704 RTP/SAVPF 111
a=rtpmap:111 opus/48000/2
a=mid:audio
a=sendonly
c=IN IP4 127.0.0.1

and I added the following to the GStreamer pipeline:
 ' alsasrc device=hw:3,0 ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host='+host+' port='+audioPort
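For reference, the audio port can be extracted from the SDP answer the same way the video port was earlier. This is only a sketch: the sample answer text and variable names below are mine, not taken from the thread.

```python
import re

# Hypothetical SDP answer shaped like the offer above (ports are examples).
sdp_answer = (
    "v=0\n"
    "o=- 0 0 IN IP4 127.0.0.1\n"
    "c=IN IP4 127.0.0.1\n"
    "m=video 5000 RTP/AVP 96\n"
    "a=rtpmap:96 H264/90000\n"
    "m=audio 58704 RTP/SAVPF 111\n"
    "a=rtpmap:111 opus/48000/2\n"
)

# Same regex approach as for the video port, applied to the m=audio line.
host = re.search(r'IP4 ([\d.]+)', sdp_answer).group(1)
video_port = re.search(r'm=video (\d+)', sdp_answer).group(1)
audio_port = re.search(r'm=audio (\d+)', sdp_answer).group(1)
```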

If I understood correctly, this should start transmitting the audio source to KMS as it did for the video, but the WebRTC client doesn't receive anything. I also tried replacing the Opus codec with PCMU, but in that case, besides not receiving any sound, the video stream becomes very slow and does not play at a constant speed (i.e. sometimes it freezes for a long time, then plays for a few seconds, and so on). Am I doing something wrong? Is there an error in the SDP offer or the GStreamer pipeline? Or am I missing something else?

Other information: the audio could come from the same device or a different one. I tested the microphone locally with ALSA and GStreamer and it works, so it's not a driver/hardware problem.

Thank you for your time :)


simonv...@gmail.com

Jul 14, 2015, 9:30:47 AM
to kur...@googlegroups.com, simonv...@gmail.com
Hi, 

I finally managed to also send the audio source, using the PCMU encoder. I'm posting here the SDP offer and the GStreamer pipeline I used, in case someone needs them:
v=0
o=- 0 0 IN IP4 37.182.24.167
c=IN IP4 37.182.24.167
t=0 0
m=audio 5005 RTP/AVP 0
a=rtpmap:0 PCMU/8000
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000

gst-launch-1.0 v4l2src device=/dev/video0 ! queue ! videorate ! videoconvert ! video/x-raw,width=1280,height=720,framerate=30/1 ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=46.16.189.148 port=51978 ts-offset=0 \
  alsasrc device=hw:3,0 ! audioconvert ! audioresample ! mulawenc ! rtppcmupay ! udpsink host=46.16.189.148 port=55062 ts-offset=0
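Since the client is written in Python, the offer above could be assembled programmatically. A minimal sketch (the function name and parameters are mine, not from the attached script):

```python
def build_offer(host, audio_port, video_port):
    """Assemble the SDP offer shown above: audio (PCMU) then video (H.264)."""
    return "\n".join([
        "v=0",
        "o=- 0 0 IN IP4 %s" % host,
        "c=IN IP4 %s" % host,
        "t=0 0",
        "m=audio %d RTP/AVP 0" % audio_port,
        "a=rtpmap:0 PCMU/8000",
        "m=video %d RTP/AVP 96" % video_port,
        "a=rtpmap:96 H264/90000",
    ]) + "\n"
```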

I discovered an interesting thing that I think is a Kurento bug: if you swap the audio and video lines in the SDP offer, Kurento no longer sends the audio! The SDP answer received is the same, but for some reason the browser doesn't receive the audio if the "m=video 5000 RTP/AVP 96 / a=rtpmap:96 H264/90000" block comes before the "m=audio 5005 RTP/AVP 0 / a=rtpmap:0 PCMU/8000" block.
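Given that behaviour, one defensive workaround on the client side is to reorder the media sections so audio always precedes video before sending the offer. A sketch (function name mine; assumes '\n' line endings):

```python
def audio_first(sdp):
    """Reorder SDP media sections so m=audio blocks precede all others.

    Splits the SDP into the session part (everything before the first
    "m=" line) and a list of media sections, then emits the audio
    sections first, keeping every line intact.
    """
    session, sections = [], []
    for line in sdp.strip().split("\n"):
        if line.startswith("m="):
            sections.append([line])   # start a new media section
        elif sections:
            sections[-1].append(line) # attribute line of current section
        else:
            session.append(line)      # session-level line
    ordered = ([s for s in sections if s[0].startswith("m=audio")] +
               [s for s in sections if not s[0].startswith("m=audio")])
    return "\n".join(session + [l for s in ordered for l in s]) + "\n"
```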

I have two other problems now:
- I can't receive the stream sent from the browser to GStreamer.
- After the end of a call I have to refresh the browser page, otherwise it isn't able to start a new connection.

Does anyone have the same problems?

Ivan Gracia

Jul 16, 2015, 9:12:21 AM
to Kurento Public, simonv...@gmail.com

Hey! Thanks a lot for this post. I'm sure it will be really helpful, as many others are trying to do the same. Maybe your Python solution could help them!

Just one thing: did you try having the RtpEndpoint in KMS process the answer? I mean, something like rtp.processOffer(<your_mangled_sdp>). That should have the RTP endpoint listening on that port.


Ivan Gracia





sim...@gmail.com

Jul 21, 2015, 10:05:49 AM
to kur...@googlegroups.com, simonv...@gmail.com

Hi, I attached the basic Python script with the parts specific to my application removed; I hope it will help :)

And yes, I call rtp.processOffer(...) in my server code, and I also connected the RTP and WebRTC endpoints in both directions. If I understood correctly, I should receive the audio stream on port 5003 and the video on port 5000 (according to my SDP offer), but it doesn't work: I also tried a very simple GStreamer pipeline with fakesink, but I don't receive anything.

One question: is it a problem if I call rtp.processOffer(...) before the two endpoints are connected? In my application the RTP client is always the first to establish the communication; the connection between the two endpoints is made only when the WebRTC client establishes its connection too.

SV
kurento-demo.py

Ivan Gracia

Jul 22, 2015, 9:39:32 PM
to Kurento Public, simonv...@gmail.com
Sorry, where it says processOffer, I meant to write processAnswer. That should tell the RTP endpoint in KMS where it should go and get the video from, since when you invoke processOffer, what you're stating is where the RTP endpoint in KMS has to send the video to.
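To illustrate the direction Ivan describes, here is a hedged server-side sketch in which KMS generates the offer and processes the client's answer. This reverses the thread's original flow; the class, variable names, and the negotiateWithClient helper are all mine (hypothetical), and it assumes the kurento-client Java library with a KMS instance reachable at the given WebSocket URL.

```java
import org.kurento.client.KurentoClient;
import org.kurento.client.MediaPipeline;
import org.kurento.client.RtpEndpoint;

public class RtpOfferFromKms {

    public static void main(String[] args) {
        KurentoClient kurento = KurentoClient.create("ws://localhost:8888/kurento");
        MediaPipeline pipeline = kurento.createMediaPipeline();
        RtpEndpoint rtp = new RtpEndpoint.Builder(pipeline).build();

        // KMS generates the offer: it states where KMS expects to RECEIVE media.
        String sdpOffer = rtp.generateOffer();

        // The RTP client replies with an answer carrying its own IP/ports
        // (the "mangled" SDP mentioned above). How that exchange happens is
        // application-specific, hence the hypothetical helper below.
        String sdpAnswer = negotiateWithClient(sdpOffer);

        // processAnswer tells the endpoint where the remote peer listens,
        // i.e. where KMS has to SEND media to.
        rtp.processAnswer(sdpAnswer);
    }

    // Placeholder for the application's own signaling mechanism.
    static String negotiateWithClient(String offer) {
        throw new UnsupportedOperationException("application-specific signaling");
    }
}
```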

Ivan Gracia


