I have a rather special use case for Janus Gateway. I need to transfer
bidirectional audio to and from an external application running on a server,
so I'm currently using the following setup:
First WebRTC connection (receive audio only) - Audio received by client browser:
A. Browser with Janus JS client receiving audio
B. Internet
C. Janus Streaming plugin receiving RTP UDP stream
D. GStreamer pipeline encoding Opus audio in RTP stream, captured from sound card
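The sender pipeline on the server looks roughly like this (a sketch: the ALSA device, payload type, and UDP port are placeholders that have to match the Streaming mountpoint configuration):

```shell
# Capture from the sound card, encode Opus, and send RTP to the Janus
# Streaming plugin (port 5002 is an assumption; use the mountpoint's audioport)
gst-launch-1.0 alsasrc device=hw:0 \
  ! audioconvert ! audioresample \
  ! opusenc \
  ! rtpopuspay pt=111 \
  ! udpsink host=127.0.0.1 port=5002
```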
Second WebRTC connection (send audio only) - Audio sent by client browser:
A. Browser with Janus JS client sending audio
B. Internet
C. Janus AudioBridge plugin set to statically forward RTP UDP packets
D. GStreamer pipeline decoding Opus audio from RTP stream and playing it back on sound card
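The playback side is roughly the mirror image (again a sketch: port 6002 is an assumed value that has to match the AudioBridge RTP forwarder's destination, and the caps assume 48 kHz Opus on payload type 111):

```shell
# Receive the RTP stream forwarded by the AudioBridge, decode Opus,
# and play it back on the sound card
gst-launch-1.0 udpsrc port=6002 \
    caps="application/x-rtp,media=audio,encoding-name=OPUS,clock-rate=48000,payload=111" \
  ! rtpopusdepay \
  ! opusdec \
  ! audioconvert ! audioresample \
  ! alsasink
```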
So basically, the web application running in the client browser plays back
whatever the remote sound card captures, and audio captured in the client
browser is played back on the remote sound card.
This has been working very reliably on both Firefox and Chrome on all platforms,
including Android. However, using two WebRTC connections is not optimal and
it seems this is a blocker for getting bidirectional audio working properly
on Safari 12 (at least on iOS 12, e.g. on iPhone/iPad). It looks like every time
a new WebRTC connection is established, Safari somehow mutes any existing
WebRTC streams, so that _either_ receiving or sending of audio works.
I've also found bug reports about calls to getUserMedia()
muting existing streams, but I'm experiencing this behavior even with
a single getUserMedia() call made before establishing any WebRTC connections.
Here's an interesting article describing Safari WebRTC issues:
https://webrtchacks.com/guide-to-safari-webrtc/

Finally, to the questions:
1. Have you tested Janus on Mobile Safari (preferably on the latest iOS 12),
and is bidirectional audio supposed to work properly there?
2. It seems Janus does not currently ship with a plugin that would allow me
to have a single WebRTC connection for both sending and receiving audio
to/from an external application via an RTP stream, so:
2a. Is it somehow possible to configure Janus to route audio from a
single WebRTC connection to these two separate plugins:
the Streaming plugin and the AudioBridge?
2b. Would the new Lua plugin allow easy creation of a plugin that
supports bidirectional RTP streams with an external application?
Could you give pointers on where to start with this?
Thanks!