Bidirectional WebRTC RTP streams with external applications


Mikael Nousiainen

Nov 13, 2018, 11:35:08 AM
to meetecho-janus
I have a rather special use case for Janus Gateway. I need to transfer
bidirectional audio to and from an external application running on a server,
so I'm currently using the following setup:

First WebRTC connection (receive audio only) - Audio received by client browser:
A. Browser with Janus JS client receiving audio
B. Internet
C. Janus Streaming plugin receiving RTP UDP stream
D. GStreamer pipeline encoding Opus audio in RTP stream, captured from sound card
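
(For reference, something like the gst-launch one-liner below can do that
encoding leg; the capture element, host, port and payload type are
placeholders here and have to match your sound card setup and the
audioport/audiopt of the Streaming plugin mountpoint.)

    # Capture from the sound card, encode to Opus and push the RTP stream
    # to the Streaming plugin mountpoint.
    gst-launch-1.0 alsasrc ! audioconvert ! audioresample \
        ! opusenc ! rtpopuspay pt=111 \
        ! udpsink host=127.0.0.1 port=5002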

Second WebRTC connection (send audio only) - Audio sent by client browser:
A. Browser with Janus JS client sending audio
B. Internet
C. Janus Audiobridge plugin set to statically forward RTP UDP packets
D. GStreamer pipeline decoding Opus audio from RTP stream and playing it back on sound card
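
(The decoding leg can similarly be sketched as below; the port and payload
type are placeholders again and have to match what the Audiobridge
rtp_forward is set up to send, and alsasink is just an example output.)

    # Receive the RTP stream forwarded by the Audiobridge, decode the Opus
    # audio and play it back on the sound card.
    gst-launch-1.0 udpsrc port=5004 \
        caps="application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)OPUS, payload=(int)100" \
        ! rtpjitterbuffer ! rtpopusdepay ! opusdec \
        ! audioconvert ! audioresample ! alsasink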

So basically the web application running in the client browser plays back
what the remote sound card captures, and the audio captured in the client
browser is played back on the remote sound card.

This has been working very reliably on both Firefox and Chrome on all platforms,
including Android. However, using two WebRTC connections is not optimal,
and it seems to be a blocker for getting bidirectional audio working properly
on Safari 12 (at least on iOS 12, e.g. on iPhone/iPad). It looks like every time
a new WebRTC connection is established, Safari somehow mutes any existing
WebRTC streams, so that _either_ receiving _or_ sending of audio works, but not both.

I've also found similar bug reports about calls to getUserMedia()
muting existing streams, but I'm experiencing the same behavior even with
a single getUserMedia() call made before establishing any WebRTC connections.

Here's an interesting article describing Safari WebRTC issues:
https://webrtchacks.com/guide-to-safari-webrtc/

Finally, to the questions:

1. Have you tested Janus on Mobile Safari (preferably on the latest iOS 12),
   and is bidirectional audio supposed to work properly there?

2. It seems Janus does not currently ship with a plugin that would allow me
   to have a single WebRTC connection for both sending and receiving audio
   to/from an external application via an RTP stream, so:

   2a. Is it somehow possible to configure Janus to route audio from a
       single WebRTC connection to these two separate plugins:
       Streaming plugin and Audiobridge?
  
   2b. Would the new Lua plugin allow easy creation of a plugin that
       supports bidirectional RTP streams with an external application?
       Could you give some pointers on where to start with this?

Thanks!

Lorenzo Miniero

Nov 14, 2018, 6:52:27 AM
to meetecho-janus
You can't share a PeerConnection between different plugins, so if you want a single bidirectional PC that involves media coming and going from/to the outside, you'll need to write your own plugin, maybe by putting pieces from one plugin into another. As it is, the Lua plugin won't help with that, because it doesn't have the low-level media stuff to do RTP in/out: you'd have to add the C code for that part.

Lorenzo