Jitter and Packet Loss


Matt Knowles

Aug 12, 2024, 4:58:43 PM
to discuss-webrtc
We have a game streaming platform where we run the application (Unity and Unreal) on an AWS EC2 machine and stream to client devices (PC / mobile). I'm posting here because our use case is different from a conference call.

We have noticed that on some streams we get runaway jitter after packet loss.
I have attached a graph from our monitoring tooling to demonstrate an example.

We were on M116 and have just updated to M121.

  1. Is this something we should address by dynamically adjusting settings, or are there settings we should apply when first creating our streams so that libwebrtc handles it itself?
  2. We have exposed playoutDelay min and max, but it is not clear what values make sense for these.
    1. What logic / calculations do you use to set playoutDelay (min/max) when first creating a stream?
    2. Should it be adjusted at runtime?
    3. Assuming runtime adjustments, what logic / calculations do you use to adjust playoutDelay (min/max)?
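For context on 2.3, here's the kind of runtime heuristic we've been sketching. To be clear, the function names, the 3x jitter multiplier, the 2% loss threshold, and the 100 ms headroom are all our own guesses, not anything from libwebrtc:

```javascript
// Hypothetical heuristic (our sketch, not a libwebrtc recommendation):
// derive a playout-delay target from the jitter and loss that getStats()
// reports, clamped to the min/max we already expose.
function computeTargetDelayMs(jitterMs, lossFraction, minMs = 0, maxMs = 500) {
  const headroom = lossFraction > 0.02 ? 100 : 0; // time for NACK/RTX to land
  const target = 3 * jitterMs + headroom;
  return Math.min(maxMs, Math.max(minMs, Math.round(target)));
}

// Wiring it to a live receiver via the non-standard, Chromium-only
// playoutDelayHint attribute (note: it takes seconds, not milliseconds).
async function adjustPlayoutDelay(receiver) {
  const stats = await receiver.getStats();
  for (const report of stats.values()) {
    if (report.type !== "inbound-rtp") continue;
    const jitterMs = (report.jitter || 0) * 1000; // stats report jitter in seconds
    const received = report.packetsReceived || 0;
    const lost = report.packetsLost || 0;
    const lossFraction = received + lost > 0 ? lost / (received + lost) : 0;
    receiver.playoutDelayHint =
      computeTargetDelayMs(jitterMs, lossFraction) / 1000;
  }
}
```

We'd call `adjustPlayoutDelay` on a timer per video receiver; if anyone has a better-grounded formula, we'd love to hear it.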

Thank you,
Matt





[Attachment: runaway_jitter_graph.jpg]

Tao Meme

Aug 12, 2024, 9:54:05 PM
to discuss...@googlegroups.com
Cloud gaming on WebRTC should set the playout delay to 0, disable the pacer on the sender side, and disable the frame buffer on the receiver side. By the way, the GCC bandwidth estimator performs very poorly.
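The pacer and frame-buffer parts need changes in the native libwebrtc build, but the zero playout delay can at least be requested from the browser side. A minimal sketch, assuming Chromium's non-standard playoutDelayHint attribute (the helper name is mine, and it is only a hint, so the jitter buffer may still grow under loss):

```javascript
// Ask Chromium to play out with as little buffering as possible via the
// non-standard playoutDelayHint attribute (value in seconds). Other
// browsers don't expose it, so feature-detect before assigning.
function requestZeroPlayoutDelay(receivers) {
  for (const receiver of receivers) {
    if ("playoutDelayHint" in receiver) {
      receiver.playoutDelayHint = 0; // hint only, not a hard guarantee
    }
  }
  return receivers;
}

// Usage: requestZeroPlayoutDelay(pc.getReceivers());
```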

Matt Knowles <matt.k...@level-ex.com> wrote on Tue, Aug 13, 2024 at 04:58:

Mikko Koivisto

Aug 16, 2024, 6:35:41 AM
to discuss-webrtc
I may have hit somewhat similar issues when streaming an Android desktop over WebRTC. See https://groups.google.com/g/discuss-webrtc/c/agps4DIcsxk/m/lOFq8AY5AAAJ

One thing that helped was setting the minimum bandwidth high enough, which did not require any changes to WebRTC. Tweaking the pacer and playout delays sounds promising; I would be interested in your experiments with those.
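In case it helps: one way to set that bandwidth floor without touching WebRTC itself is SDP munging with Chromium's unofficial `x-google-min-bitrate` fmtp parameter. A rough sketch under those assumptions (the helper name is mine, and you need the video payload type from the m=video line; error handling omitted):

```javascript
// Append Chromium's unofficial x-google-min-bitrate parameter to the
// fmtp line of the given video payload type in a munged SDP string.
// kbps is the minimum video bitrate to request, in kilobits per second.
function setMinBitrate(sdp, payloadType, kbps) {
  const prefix = `a=fmtp:${payloadType} `;
  return sdp
    .split("\r\n")
    .map((line) =>
      line.startsWith(prefix) ? `${line};x-google-min-bitrate=${kbps}` : line
    )
    .join("\r\n");
}
```

You'd run this on the offer SDP before `setLocalDescription`. Note these `x-google-*` parameters are Chromium-specific and not part of any standard, so they can change without notice.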

Matt Knowles

Aug 16, 2024, 6:35:47 AM
to discuss-webrtc
1. Yes, most of the time we go with 0 playout delay. Some of our sessions struggle due to network issues, so increasing the playout delay has helped stabilize the client-side frame rate.

2. Is there a specific way or setting to disable the pacer?

3. Our clients use standard browsers, so I assume we can't disable the frame buffer on the client side.

4. On the GCC side of things, any recommendations? Is there an alternative?

Appreciate the feedback.


lidedongsn

Aug 19, 2024, 10:27:18 PM
to discuss-webrtc
Have you tested this parameter: jitterBufferTarget? https://chromestatus.com/feature/5930772496384000
I'm not sure it helps.
I think it's similar to playout delay.

rtpReceiver.jitterBufferTarget = jitterBufferTarget; // milliseconds
// Older non-standard hint (in seconds) that it replaces:
// rtpReceiver.playoutDelayHint = playoutDelayHint;
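A slightly more defensive way to set it, assuming the 0-4000 ms clamp from the spec and feature detection for browsers that don't expose the attribute yet (the helper name is mine):

```javascript
// Set jitterBufferTarget (milliseconds) on a receiver, clamping to the
// spec's 0-4000 ms range; returns false if the attribute is unsupported.
function setJitterBufferTarget(receiver, targetMs) {
  if (!("jitterBufferTarget" in receiver)) return false;
  receiver.jitterBufferTarget = Math.min(4000, Math.max(0, targetMs));
  return true;
}
```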

Looking forward to your feedback.