unmute/mute events streaming plugin


María

Nov 8, 2021, 7:30:22 AM11/8/21
to meetecho-janus
Hello reader,

I'm using the Streaming plugin to display, in real time, the video captured by a pair of smart glasses. The glasses send the captured video to a webapp (1) and, from there, I send the video using rtp_forward. On the other hand, I have another webapp (2) that "connects" to the stream and displays it in a video tag. The problem is that the first time I open this second webapp (once the rtp_forward is already ongoing), Janus fires a "mute" event by default and the stream is not displayed. If I wait long enough, an "unmute" event fires and the stream appears. Is there a way to prevent this annoying event from firing at the beginning? Why does it happen?
If I force a reconnection from the webapp that publishes the stream (1), the stream appears instantly in webapp 2. In other words, if I open webapp 2 before the rtp_forward starts and wait for it to start, the default mute doesn't happen.

I hope I explained myself clearly. Please let me know if any clarifications are needed. Thanks!

Lorenzo Miniero

Nov 8, 2021, 8:08:32 AM11/8/21
to meetecho-janus
I think you just need to configure RTCP in both the RTP forwarder and the Streaming plugin mountpoint, since otherwise keyframe requests (needed for new viewers) are not exchanged. You can refer to the documentation of both plugins for more details.
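Concretely, "configuring RTCP on both sides" means agreeing on an RTCP port for the video stream in both requests, so the Streaming plugin's keyframe requests can flow back to the forwarder. A minimal sketch of the idea (port numbers and ids are illustrative, and the field names follow the ones used elsewhere in this thread, not a definitive reference):

```javascript
// Assumed, illustrative port numbers: RTP flows VideoRoom -> mountpoint,
// RTCP feedback (e.g. keyframe requests) flows back on the RTCP port.
const VIDEO_PORT = 5004;
const VIDEO_RTCP_PORT = 5005;

// Streaming plugin: create the mountpoint with an RTCP port for video
const createMountpoint = {
  request: "create",
  type: "rtp",
  video: true,
  videoport: VIDEO_PORT,
  videopt: 100,
  videortpmap: "VP8/90000",
  videortcpport: VIDEO_RTCP_PORT
};

// VideoRoom plugin: forward the publisher's video and point the RTCP
// channel at the same port the mountpoint uses
const rtpForward = {
  request: "rtp_forward",
  room: 1234,           // hypothetical room id
  publisher_id: 5678,   // hypothetical publisher id
  host: "127.0.0.1",
  video_port: VIDEO_PORT,
  videopt: 100,
  video_rtcp_port: VIDEO_RTCP_PORT  // must match videortcpport above
};
```

The key point is simply that `video_rtcp_port` on the forwarder side and `videortcpport` on the mountpoint side refer to the same port.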

L.

María

Nov 8, 2021, 10:14:23 AM11/8/21
to meetecho-janus
Hello Lorenzo,

One curious observation: the mountpoint sends audio and video, the audio is actually received instantly, and it's the video track that is being "muted". Is that possible?

Thanks for the quick response :)

PS: I tried your suggestion and it didn't solve the issue.
To create the streaming mountpoint I'm using this message structure:
var stream1 = {
  "request": "create",
  "type": "rtp",
  "id": parseInt(room),
  "audio": true,
  "video": true,
  "pin": pin,
  "audioport": audioport1,
  "audiopt": 111,
  "audiortpmap": "opus/48000/2",
  "videoport": videoport,
  "videopt": 100,
  "videortpmap": "VP8/90000",
  "videortcpport": 4004
};

And for the rtp_forward:
var requestmsg = {
  "request": "rtp_forward",
  "publisher_id": parseInt(myid),
  "room": parseInt(room),
  "audio_port": audioport1,
  "audiopt": 111,
  "video_port": videoport,
  "videopt": 100,
  "host": "",
  "local_rtcp_port": 4004,
  "remote_rtcp_port": 4005
};

I wrote it based on the documentation and the API doesn't complain, but I'm still not sure it's correct, because in the rtp_forward request I can't distinguish between audio and video RTCP ports (I only included videortcpport in the streaming mountpoint because video is the problematic track).

Lorenzo Miniero

Nov 8, 2021, 12:12:00 PM11/8/21
to meetecho-janus
You're using the RTP forwarding API wrong: since videortcpport is 4004 in the Streaming plugin mountpoint, you need to set video_rtcp_port to 4004 when doing rtp_forward.
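Applied to the rtp_forward message from the earlier post, the fix would look like this: the local_rtcp_port/remote_rtcp_port keys (which, per the thread, the API silently accepts) are replaced by video_rtcp_port set to the mountpoint's videortcpport. The variable values below are hypothetical stand-ins for the ones used in the thread:

```javascript
// Hypothetical values standing in for the variables used in the thread
var myid = "1111", room = "1234";
var audioport1 = 4000, videoport = 4002;

// Corrected rtp_forward request: video_rtcp_port must match the
// videortcpport (4004) configured on the Streaming plugin mountpoint,
// so the mountpoint's keyframe requests reach the publisher.
var requestmsg = {
  "request": "rtp_forward",
  "publisher_id": parseInt(myid),
  "room": parseInt(room),
  "audio_port": audioport1,
  "audiopt": 111,
  "video_port": videoport,
  "videopt": 100,
  "host": "",
  "video_rtcp_port": 4004  // replaces local_rtcp_port/remote_rtcp_port
};
```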

L.

María

Nov 9, 2021, 8:13:35 AM11/9/21
to meetecho-janus
It works now!! Thank you very much, Lorenzo.