Question regarding SimulcastRateAllocator logic: Allowing all layers to utilize maxBitrate


floret _

Apr 17, 2025, 4:51:46 AM
to discuss-webrtc

Hi WebRTC experts,

Our team is currently implementing simulcast streaming (typically with three layers: low, mid, high) on mobile (iOS, Android) clients.


We've observed that even under excellent network conditions, the lower layers (low, mid) appear capped at their configured targetBitrate (around 75-80% of their maxBitrate in our setup), while the highest active layer (high) uses the remaining bandwidth to reach its configured maxBitrate.


Upon investigating the libwebrtc code, we believe this behavior stems from the logic in SimulcastRateAllocator::DistributeAllocationToSimulcastLayers. Specifically, after allocating up to targetBitrate for each active layer, the remaining available bandwidth (left_in_total_allocation) is added only to the top_active_layer, which may then reach its maxBitrate. We are currently on the M115 build, but the code appears identical in the latest version.
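
For readers following along, here is a heavily simplified sketch of the flow we are describing (our paraphrase, not the upstream code; only the names DistributeAllocationToSimulcastLayers, left_in_total_allocation, and top_active_layer come from the real file):

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct LayerConfig {
      uint32_t target_bps;  // targetBitrate of the layer
      uint32_t max_bps;     // maxBitrate of the layer
    };

    // Paraphrase of SimulcastRateAllocator::DistributeAllocationToSimulcastLayers.
    void DistributeAllocationToSimulcastLayers(
        uint32_t left_in_total_allocation,       // estimated bandwidth, bps
        const std::vector<LayerConfig>& layers,  // active layers, low -> high
        std::vector<uint32_t>& allocated) {
      if (layers.empty()) return;
      allocated.assign(layers.size(), 0);
      // Pass 1: every active layer gets up to its targetBitrate.
      for (size_t i = 0; i < layers.size(); ++i) {
        uint32_t take = std::min(left_in_total_allocation, layers[i].target_bps);
        allocated[i] = take;
        left_in_total_allocation -= take;
      }
      // Pass 2: whatever remains goes only to the top active layer, which may
      // then climb from targetBitrate to maxBitrate. Lower layers stay at
      // their targetBitrate no matter how much bandwidth is available.
      size_t top = layers.size() - 1;
      allocated[top] = std::min(allocated[top] + left_in_total_allocation,
                                layers[top].max_bps);
    }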



We are considering modifying this logic to allow all active layers (low, mid, and high) to utilize the available excess bandwidth to potentially reach their respective maxBitrate limits, assuming the total estimated bandwidth allows for it. The idea would be to distribute the left_in_total_allocation more evenly or proportionally across layers, up to their individual maxBitrate caps.
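
Concretely, the change we have in mind would replace pass 2 above with something like the following (a sketch of one possible policy that fills layers lowest-first; a proportional split would be a variant of the same idea, and LayerConfig is as in the previous sketch):

    // Our proposed idea, not upstream behavior: spread the leftover across
    // all active layers, lowest first, each capped at its own maxBitrate.
    void DistributeLeftoverToAllLayers(uint32_t left_in_total_allocation,
                                       const std::vector<LayerConfig>& layers,
                                       std::vector<uint32_t>& allocated) {
      for (size_t i = 0; i < layers.size() && left_in_total_allocation > 0; ++i) {
        uint32_t headroom = layers[i].max_bps - allocated[i];
        uint32_t take = std::min(left_in_total_allocation, headroom);
        allocated[i] += take;
        left_in_total_allocation -= take;
      }
    }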


However, the TODO comment explicitly mentions potential "performance implications". This makes us hesitant to proceed without fully understanding the potential downsides.


TODO comment code ref: https://chromium.googlesource.com/external/webrtc/+/refs/heads/master/modules/video_coding/utility/simulcast_rate_allocator.cc#200


Could the community shed some light on:

  1. What specific "performance implications" might arise from allowing lower simulcast layers to also push towards their maxBitrate when bandwidth is plentiful?
    • Could it lead to encoder overload, especially on mobile devices (Android)?
    • Might it negatively impact the stability or accuracy of bandwidth estimation (BWE)?
    • Could it cause unforeseen issues with quality fluctuations or interactions between layers?
    • Are there potential receiver-side challenges associated with this change?
  2. What was the original rationale behind limiting the "push to maxBitrate" behavior only to the top active layer? Was it primarily CPU concerns, BWE stability, perceived quality benefits, or something else?
  3. Has anyone experimented with or implemented such a modification? If so, what were the results or lessons learned?

We want to ensure that enabling higher bitrates on lower layers doesn't inadvertently degrade the overall streaming quality or stability. Any insights, experiences, or advice on this matter would be greatly appreciated.


Thanks.

spr...@webrtc.org

Apr 17, 2025, 5:51:02 AM
to discuss-webrtc
The original design rationale has likely been lost to history, but the performance implications mentioned were mostly related to increased bandwidth usage, and the potential that a subset of bandwidth-constrained users would have to drop down to a lower resolution if the target bitrate went up higher than their available bandwidth.
I would advise against changing the default behavior. However, to work around your problem, you can (IIRC) manually configure the bitrates for the simulcast layers, e.g. via https://developer.mozilla.org/en-US/docs/Web/API/RTCRtpSender/setParameters#maxbitrate - and that bitrate should be applied even if higher layers are enabled.
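
On a native client the equivalent would look roughly like this (a sketch against the libwebrtc C++ API with example values; `sender` is the RtpSenderInterface for your video track):

    // Cap each simulcast layer via the sender's RtpParameters; the native
    // counterpart of the setParameters call in the MDN link above.
    webrtc::RtpParameters params = sender->GetParameters();
    // encodings are ordered low -> high for simulcast.
    params.encodings[0].max_bitrate_bps = 100000;   // low
    params.encodings[1].max_bitrate_bps = 1000000;  // mid
    params.encodings[2].max_bitrate_bps = 2000000;  // high
    webrtc::RTCError result = sender->SetParameters(params);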

Cheers,
Erik

floret _

Apr 21, 2025, 1:02:17 AM
to discuss-webrtc

We are already setting maxBitrate as you suggested. However, according to the simulcast bitrate allocation logic, only the highest layer is allocated up to maxBitrate, while the lower layers are allocated only up to targetBitrate (which is set to 3/4 of maxBitrate).

The targetBitrate being derived as 3/4 of maxBitrate appears to follow the logic in the code below. We couldn't set targetBitrate separately in the mobile SDK, so we only set the min/max bitrates.

https://chromium.googlesource.com/external/webrtc/+/refs/heads/master/video/config/encoder_stream_factory.cc#176
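
In simplified form, our reading of that code amounts to the following (a paraphrase of the effect we observe, not the verbatim upstream logic):

    // Assumption based on our reading: when only min/max bitrates are
    // configured, a stream's target bitrate is derived as a fixed
    // fraction of its max bitrate.
    int DeriveTargetBitrateBps(int max_bitrate_bps) {
      return max_bitrate_bps * 3 / 4;  // ~75% in the M115 build we inspected
    }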

For example, if we set maxBitrate to 100 kbps for low, 1000 kbps for mid, and 2000 kbps for high, and the currently available bitrate is sufficient, then the bitrates allocated to low/mid/high would be 75/750/2000 kbps.

In this case, if we want to keep the existing simulcast bitrate allocation logic and have the mid layer transmit at 1000 kbps while the high layer is also transmitting, we would need to set the mid layer's maxBitrate to something like 1333 kbps (1000 * 4/3). However, this creates a problem: in situations where we cannot transmit the high layer, the mid layer becomes the top active layer and could be allocated up to 1333 kbps.
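
To make the numbers explicit (runnable arithmetic for the example above; the ladder values are from our configuration, not libwebrtc defaults):

    #include <cstdio>

    int main() {
      const int low_max = 100, mid_max = 1000, high_max = 2000;  // kbps
      // Steady state with all three layers active and ample bandwidth:
      // lower layers sit at target (3/4 of max), the top layer reaches max.
      std::printf("low=%d mid=%d high=%d kbps\n",
                  low_max * 3 / 4, mid_max * 3 / 4, high_max);  // 75/750/2000
      // Workaround: the mid max needed for a 1000 kbps target.
      std::printf("mid max for 1000 kbps target: %d kbps\n", 1000 * 4 / 3);
      // -> 1333 kbps; but if the high layer stops, mid becomes the top
      // active layer and can itself be allocated up to that 1333 kbps cap.
      return 0;
    }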

That's why I was trying to change the simulcast bitrate allocation logic. 

Thank you for your quick response. We will discuss within the team how to resolve this issue. 

Thank you.


On Thursday, April 17, 2025 at 6:51:02 PM UTC+9, spr...@webrtc.org wrote: