Chrome 37: Detect Remote MediaStream has ended


Olivier Anguenot

Sep 12, 2014, 7:54:12 AM
to discuss...@googlegroups.com
Hello,

I am trying the following: Chrome "A" (camera) calls Chrome "B" (camera).

Everything works fine: both videos are rendered on each side.

Now I want to stop the call from the "A" side, so I call MediaStream.stop() there.

On the "A" side, I detect (using listeners) that:
- "A" Audio track has ended
- "A" Video track has ended
- "A" MediaStream has ended
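For illustration, the A-side detection described above could be wired up like this (a sketch; the helper and callback names are my own, and only per-track `addEventListener` is assumed):

```javascript
// Sketch: fire a callback once every track of a MediaStream has ended.
// Works for a local (A-side) stream after MediaStream.stop() is called,
// since each track then fires its own 'ended' event.
function watchTracks(stream, onAllEnded) {
  const tracks = stream.getTracks();
  let endedCount = 0;
  tracks.forEach((track) => {
    track.addEventListener('ended', () => {
      endedCount += 1;
      if (endedCount === tracks.length) {
        onAllEnded();
      }
    });
  });
}
```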

But on the "B" side, I see that no more bits are received for each track, yet the Track.onended and MediaStream.onended events do not seem to fire.

Is this the right way to detect that video from the remote peer has stopped (without going through the signaling layer)?

Or should I also remove the stream from the PeerConnection, so that other events (onremovestream) fire on the "B" side?

Looking at WebRTC internals, I don't find any information about the track "ended" state.

Thanks in advance,
Olivier

Philipp Hancke

Sep 12, 2014, 8:30:40 AM
to discuss...@googlegroups.com
2014-09-12 13:54 GMT+02:00 Olivier Anguenot <oang...@gmail.com>:
> Hello,
>
> I am trying the following: Chrome "A" (camera) calls Chrome "B" (camera).
>
> Everything works fine: both videos are rendered on each side.
>
> Now I want to stop the call from the "A" side, so I call MediaStream.stop() there.

If you want to stop the call, why don't you close the PeerConnection?
 
> On the "A" side, I detect (using listeners) that:
> - "A" Audio track has ended
> - "A" Video track has ended
> - "A" MediaStream has ended
>
> But on the "B" side, I see that no more bits are received for each track, yet the Track.onended and MediaStream.onended events do not seem to fire.

I think they will only be fired once the respective a=ssrc lines have been removed from the SDP.
 
> Is this the right way to detect that video from the remote peer has stopped (without going through the signaling layer)?
>
> Or should I also remove the stream from the PeerConnection, so that other events (onremovestream) fire on the "B" side?

Removing the stream on the A side should trigger the onnegotiationneeded callback.
Create a new offer there, send it to the peer, and do the setRemoteDescription/setLocalDescription dance. Wait for onremovestream at B.
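The offer/answer "dance" described above could be sketched roughly as follows. This uses the promise-based API for brevity; `sendOfferToPeer` is a hypothetical placeholder for your own signaling layer, assumed to deliver the offer to the peer and resolve with its answer:

```javascript
// Sketch: renegotiate after removing a stream on the A side.
// `pc` is the RTCPeerConnection; `sendOfferToPeer` (hypothetical) sends
// the offer over your signaling channel and resolves with the answer.
async function renegotiate(pc, sendOfferToPeer) {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const answer = await sendOfferToPeer(offer);
  await pc.setRemoteDescription(answer);
}

// A side: removing the stream should fire onnegotiationneeded, e.g.
//   pc.onnegotiationneeded = () => renegotiate(pc, sendOfferToPeer);
// B side: after the new remote description is set, expect onremovestream.
```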
 

Iñaki Baz Castillo

Sep 12, 2014, 8:51:07 AM
to discuss...@googlegroups.com
2014-09-12 14:30 GMT+02:00 Philipp Hancke <philipp...@googlemail.com>:
>> But on the "B" side, I see that no more bits are received for each
>> track, yet the Track.onended and MediaStream.onended events do not
>> seem to fire.
>
> I think they will only be fired once the respective a=ssrc lines have
> been removed from the SDP.


Please try in Chrome Canary. The Track.onended should fire.

--
Iñaki Baz Castillo
<i...@aliax.net>

Philipp Hancke

Sep 12, 2014, 8:54:06 AM
to discuss...@googlegroups.com
On the remote side?

Olivier Anguenot

Sep 12, 2014, 10:05:57 AM
to discuss...@googlegroups.com
Hi,

I tested with Chrome Canary (version 39) on both sides.

The behavior is different: I receive an event from the Track.onmute listener for the video track.

But:
- Audio track readyState = 'live'
- Video track readyState = 'muted'
- MediaStream ended: false

Olivier

Olivier Anguenot

Sep 12, 2014, 10:07:34 AM
to discuss...@googlegroups.com
I don't want to close the PeerConnection, because there is also a DataChannel in that PeerConnection.

Olivier

Harald Alvestrand

Sep 12, 2014, 10:09:49 AM
to discuss...@googlegroups.com
Did you get the negotiationneeded event on the A side?

Did you do the SDP exchange?

At the moment, the B-side ended event is defined to be triggered by the SDP exchange - just because the bits stopped arriving doesn't tell you that the stream is ended.
(There's a function in RTP called a BYE packet, but it's not a 100% reliable mechanism. I don't think we use it.)



--

---
You received this message because you are subscribed to the Google Groups "discuss-webrtc" group.
To unsubscribe from this group and stop receiving emails from it, send an email to discuss-webrt...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Olivier Anguenot

Sep 12, 2014, 10:28:30 AM
to discuss...@googlegroups.com
Hi Harald,

I don't get the negotiationneeded event on the A side.

My understanding is that this event is triggered when a stream is added or removed, or when the constraints on an already-added stream change.

But in this case, I just call MediaStream.stop().

I will try stopping each track instead.

Iñaki Baz Castillo

Sep 14, 2014, 4:36:29 PM
to discuss...@googlegroups.com
2014-09-12 14:49 GMT+02:00 Iñaki Baz Castillo <i...@aliax.net>:
> Please try in Chrome Canary. The Track.onended should fire.


Sorry, I meant the Track.onmute event. That fires in Chrome Canary
when the remote stops sending RTP for that track.
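As a heuristic, one could watch the mute/unmute pair on each remote track; a sketch (the helper and callback names are my own, and this relies on the Canary behavior described above, not on a guaranteed cross-browser signal):

```javascript
// Sketch: treat a remote track's 'mute' event as "the peer stopped
// sending RTP for this track", and 'unmute' as "it resumed".
function watchRemoteStream(stream, onStalled, onResumed) {
  stream.getTracks().forEach((track) => {
    track.addEventListener('mute', () => onStalled(track.kind));
    track.addEventListener('unmute', () => onResumed(track.kind));
  });
}
```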

pablo platt

Sep 14, 2014, 7:35:12 PM
to discuss...@googlegroups.com
In my app the onaddstream, onremovestream, onaddtrack and onremovetrack callbacks are not reliable.
I'm building an audio/video conference where each user creates a single peer connection and connects to an MCU.
The server adds and removes streams and tracks when participants start and stop their camera and microphone.

Let's say a user has an audio stream and starts and stops his camera several times.
Other participants should see the onaddtrack and onremovetrack callbacks being called.
In my tests, it works several times but then the callbacks just stop firing.

It would really help if there were tests and a demo for this scenario.
If you add the ability to stop and start the mic and camera in the apprtc demo, I'm sure you'll discover several bugs.

Harald Alvestrand

Sep 15, 2014, 3:15:05 AM
to discuss...@googlegroups.com
Pablo, it seems that this could be done with some simple mods of the present demos (which are far simpler to hack on than the apprtc application).

They're on GitHub at ssh:g...@github.com:GoogleChrome/webrtc.git - if you can demonstrate the error you're seeing, I'm sure a pull request won't be taken amiss.

Philipp Hancke

Sep 15, 2014, 4:51:44 AM
to discuss...@googlegroups.com
2014-09-15 1:35 GMT+02:00 pablo platt <pablo...@gmail.com>:
> In my app the onaddstream, onremovestream, onaddtrack and onremovetrack callbacks are not reliable.
> I'm building an audio/video conference where each user creates a single peer connection and connects to an MCU.
> The server adds and removes streams and tracks when participants start and stop their camera and microphone.
>
> Let's say a user has an audio stream and starts and stops his camera several times.

How are you stopping the camera? Removing the stream and renegotiating?
 
> Other participants should see the onaddtrack and onremovetrack callbacks being called.

You are adding/removing the respective a=ssrc:...msid lines for the tracks?
Are onaddstream / onremovestream called the way you would expect?
 
> In my tests, it works several times but then the callbacks just stop firing.

I suppose using multiple tracks in the same stream still has some issues (see e.g. issue 3587).
 
Do you really need multiple tracks, or would multiple streams be sufficient? That works pretty reliably in my experience.

pablo platt

Sep 15, 2014, 8:29:41 AM
to discuss...@googlegroups.com
On Mon, Sep 15, 2014 at 11:51 AM, Philipp Hancke <philipp...@googlemail.com> wrote:


> 2014-09-15 1:35 GMT+02:00 pablo platt <pablo...@gmail.com>:
>> In my app the onaddstream, onremovestream, onaddtrack and onremovetrack callbacks are not reliable.
>> I'm building an audio/video conference where each user creates a single peer connection and connects to an MCU.
>> The server adds and removes streams and tracks when participants start and stop their camera and microphone.
>>
>> Let's say a user has an audio stream and starts and stops his camera several times.
>
> How are you stopping the camera? Removing the stream and renegotiating?

If the user has an active mic, I'm removing the track with localStream.removeTrack(localStream.getVideoTracks()[0]) and renegotiating.
The onnegotiationneeded callback isn't called in this case. Bug?
If the user doesn't have an active mic, I'm removing the stream.
 
 
>> Other participants should see the onaddtrack and onremovetrack callbacks being called.
>
> You are adding/removing the respective a=ssrc:...msid lines for the tracks?
> Are onaddstream / onremovestream called the way you would expect?

On the MCU I'm removing the a=ssrc...msid lines.
On the client I expect Chrome to remove them automatically for me.

 
 
>> In my tests, it works several times but then the callbacks just stop firing.
>
> I suppose using multiple tracks in the same stream still has some issues (see e.g. issue 3587).

That's why I suggested adding tests and a real demo that would reveal them.
I'm surprised other users don't need this feature.

 
> Do you really need multiple tracks, or would multiple streams be sufficient? That works pretty reliably in my experience.

Do you mean one stream for the mic and one stream for the cam?

Iñaki Baz Castillo

Sep 15, 2014, 9:25:30 AM
to discuss...@googlegroups.com
2014-09-15 10:51 GMT+02:00 Philipp Hancke <philipp...@googlemail.com>:
> You are adding/removing the respective a=ssrc:...msid lines for the tracks?

That is not the way to go in Plan-Unified, in which each track MUST be
defined by its own "m" line.
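For illustration, here is roughly what the two approaches look like on the wire (hand-written SDP fragments with made-up stream and track ids, not captured from a real session). The multiple-a=ssrc style ("Plan B", which Chrome used at the time) puts both video tracks in one "m" section:

```
m=video 9 UDP/TLS/RTP/SAVPF 100
a=ssrc:1111 msid:myStream videoTrack1
a=ssrc:2222 msid:myStream videoTrack2
```

whereas in Plan-Unified each track gets its own "m" section:

```
m=video 9 UDP/TLS/RTP/SAVPF 100
a=msid:myStream videoTrack1
m=video 9 UDP/TLS/RTP/SAVPF 100
a=msid:myStream videoTrack2
```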

pablo platt

Sep 15, 2014, 11:03:13 AM
to discuss...@googlegroups.com
On Mon, Sep 15, 2014 at 4:25 PM, Iñaki Baz Castillo <i...@aliax.net> wrote:
> 2014-09-15 10:51 GMT+02:00 Philipp Hancke <philipp...@googlemail.com>:
>> You are adding/removing the respective a=ssrc:...msid lines for the tracks?
>
> That is not the way to go in Plan-Unified, in which each track MUST be
> defined by its own "m" line.

Is there a working example with multiple audio/video streams in a single peer connection?
Is Plan-Unified what Chrome and other implementations use? Is it a standard?
 



Iñaki Baz Castillo

Sep 15, 2014, 1:50:55 PM
to discuss...@googlegroups.com
2014-09-15 17:03 GMT+02:00 pablo platt <pablo...@gmail.com>:
> Is there a working example with multiple audio/video streams in a single
> peer connection?
> Plan-Unified is what Chrome and other implementations use? Is it a standard?

Plan-Unified is supposed to be the standard way to handle multiple tracks.
It is not clear that it is implemented anywhere yet (it is not in Firefox
or Chrome). In fact, current Hangouts-WebRTC (which only works on Chrome)
still uses multiple a=ssrc lines per "m" line. I've tried to use
Plan-Unified in Chrome without success:

https://code.google.com/p/chromium/issues/detail?id=414202

Olivier Anguenot

Sep 15, 2014, 2:21:03 PM
to discuss...@googlegroups.com
I'm not really in line with the proposals given in this thread.

As a Web developer, today, I would like to rely only on the APIs defined by the standardization group.
If I have to deal with SDP, building a WebRTC application will be too complicated.

So, my understanding is that the stream.onended event is not fired (on the local side) when the remote peer calls stream.stop().

But I don't know whether that is a bug or not.

I am trying to find a way to detect that the remote stream has ended without having to use my own signaling layer to send an "endcall" message.

Iñaki Baz Castillo

Sep 15, 2014, 2:36:12 PM
to discuss...@googlegroups.com
2014-09-15 20:21 GMT+02:00 Olivier Anguenot <oang...@gmail.com>:
> So, my understanding is that the stream.onended event is not fired (on the
> local side) when the remote peer calls stream.stop().
>
> But I don't know whether that is a bug or not.
>
> I am trying to find a way to detect that the remote stream has ended
> without having to use my own signaling layer to send an "endcall" message.

PC.onaddstream is fired when the remote SDP is set on the
PeerConnection, and the same goes for MediaStreamTrack.onended: it will
be fired when an SDP renegotiation takes place and the new remote SDP no
longer includes that track.

Harald Alvestrand

Sep 15, 2014, 4:02:35 PM
to discuss...@googlegroups.com
On Mon, Sep 15, 2014 at 11:21 AM, Olivier Anguenot <oang...@gmail.com> wrote:
> I'm not really in line with the proposals given in this thread.
>
> As a Web developer, today, I would like to rely only on the APIs defined by the standardization group.
> If I have to deal with SDP, building a WebRTC application will be too complicated.
>
> So, my understanding is that the stream.onended event is not fired (on the local side) when the remote peer calls stream.stop().
>
> But I don't know whether that is a bug or not.


It's a bug.

 
> I am trying to find a way to detect that the remote stream has ended without having to use my own signaling layer to send an "endcall" message.


From draft-ietf-mmusic-msid:

   o  When a description is updated to no longer list the msid attribute
      on a specific media description, the recipient can signal to its
      application that the corresponding MediaStreamTrack has ended.

   In addition to signaling that the track is closed when its msid
   attribute disappears from the SDP, the track will also be signaled as
   being closed when all associated SSRCs have disappeared by the rules
   of [RFC3550] section 6.3.4 (BYE packet received) and 6.3.5 (timeout),
   and when the corresponding media section is disabled by setting the
   port number to zero.  Changing the direction of the media section to
   "recvonly" will not close the MediaStreamTrack.

I think the first bug is that "negotiationneeded" is not fired on the sending side when a track ends.
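Under the rules quoted above, the B side can in principle tell from the renegotiated remote SDP whether a track is gone. A hypothetical helper sketch (the function name and matching are my own, and it only covers the plain a=msid form, not Chrome's a=ssrc:...msid variant):

```javascript
// Sketch: check whether an a=msid line for a given track id is still
// present in an SDP string. Per draft-ietf-mmusic-msid, if it disappears
// after renegotiation, the corresponding remote MediaStreamTrack should
// be signaled as ended.
function msidPresent(sdp, trackId) {
  return sdp.split(/\r?\n/).some((line) => {
    if (!line.startsWith('a=msid:')) return false;
    // a=msid:<stream id> <track id>
    const parts = line.trim().split(' ');
    return parts[1] === trackId;
  });
}
```

Comparing the result for the old and new remote descriptions would show which tracks were removed in the renegotiation.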

pablo platt

Sep 15, 2014, 6:52:32 PM
to discuss...@googlegroups.com
> Changing the direction of the media section to "recvonly" will not close the MediaStreamTrack.

If the remote peer removes the track with localStream.removeTrack(localStream.getVideoTracks()[0])
and the media line becomes "recvonly" instead of "sendrecv", won't the local peer get an onremovetrack event?