Hi WebRTC experts,
Our team is currently implementing simulcast streaming (typically with 3 layers: low, mid, high) on mobile (iOS and Android) clients.
We've observed that even under excellent network conditions, the lower layers (low, mid) often seem capped at their configured targetBitrate (which appears to be around 75-80% of their maxBitrate in our setup), while the highest active layer (high) utilizes the remaining bandwidth to reach its configured maxBitrate.
Upon investigating the libwebrtc code, we believe this behavior stems from the logic within SimulcastRateAllocator::DistributeAllocationToSimulcastLayers. Specifically, after allocating up to targetBitrate for each active layer, the remaining available bandwidth (left_in_total_allocation) is added only to the top_active_layer, to potentially reach its maxBitrate. We are currently on the M115 build, but the code appears to be identical in the latest version.
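For context, here is our simplified reading of what that function does today. This is a paraphrase from memory, not the actual libwebrtc code; names are approximate and minBitrate handling is omitted for brevity:

#include <algorithm>
#include <cstdint>
#include <vector>

// Paraphrase of our understanding of
// SimulcastRateAllocator::DistributeAllocationToSimulcastLayers.
struct Layer {
  uint32_t target_bps;  // ends up as ~3/4 of maxBitrate in our setup
  uint32_t max_bps;
  bool active;
};

void DistributeSimplified(uint32_t total_bps,
                          const std::vector<Layer>& layers,
                          std::vector<uint32_t>& allocated_bps) {
  uint32_t left = total_bps;  // corresponds to left_in_total_allocation
  int top_active = -1;
  // Step 1: every active layer is allocated up to its targetBitrate.
  for (size_t i = 0; i < layers.size(); ++i) {
    if (!layers[i].active)
      continue;
    uint32_t give = std::min(left, layers[i].target_bps);
    allocated_bps[i] = give;
    left -= give;
    top_active = static_cast<int>(i);
  }
  // Step 2: whatever is left goes only to the top active layer,
  // up to that layer's maxBitrate. Lower layers stay at targetBitrate.
  if (top_active >= 0) {
    uint32_t headroom =
        layers[top_active].max_bps - allocated_bps[top_active];
    allocated_bps[top_active] += std::min(left, headroom);
  }
}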
We are considering modifying this logic to allow all active layers (low, mid, and high) to utilize the available excess bandwidth to potentially reach their respective maxBitrate limits, assuming the total estimated bandwidth allows for it. The idea would be to distribute the left_in_total_allocation more evenly or proportionally across layers, up to their individual maxBitrate caps.
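To make the idea concrete, the change we have in mind is roughly the following: spread the leftover bandwidth across all active layers in proportion to their remaining headroom (maxBitrate minus what they already have) instead of handing it all to the top active layer. This is only a sketch reusing the Layer struct above, not a tested patch:

// Sketch of the alternative "step 2" we are considering. Purely illustrative.
void DistributeLeftoverProportionally(uint32_t left,
                                      const std::vector<Layer>& layers,
                                      std::vector<uint32_t>& allocated_bps) {
  while (left > 0) {
    uint64_t total_headroom = 0;
    for (size_t i = 0; i < layers.size(); ++i) {
      if (layers[i].active)
        total_headroom += layers[i].max_bps - allocated_bps[i];
    }
    if (total_headroom == 0)
      break;  // every active layer already reached its maxBitrate
    uint32_t distributed = 0;
    for (size_t i = 0; i < layers.size(); ++i) {
      if (!layers[i].active)
        continue;
      uint64_t headroom = layers[i].max_bps - allocated_bps[i];
      uint32_t share = static_cast<uint32_t>(std::min<uint64_t>(
          headroom, uint64_t{left} * headroom / total_headroom));
      allocated_bps[i] += share;
      distributed += share;
    }
    if (distributed == 0)
      break;  // guard against integer rounding making no progress
    left -= distributed;
  }
}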
However, a TODO comment in that function (linked below) explicitly mentions potential "performance implications", which makes us hesitant to proceed without fully understanding the downsides.
TODO comment code ref: https://chromium.googlesource.com/external/webrtc/+/refs/heads/master/modules/video_coding/utility/simulcast_rate_allocator.cc#200
Could the community shed some light on what those performance implications might be? We want to make sure that letting the lower layers run at higher bitrates doesn't inadvertently degrade the overall streaming quality or stability. Any insights, experiences, or advice on this would be greatly appreciated.
Thanks.
We are setting the maxBitrate as you suggested. However, according to the simulcast bitrate allocation logic, only the highest layer is allocated up to maxBitrate, while the lower layers are allocated only up to targetBitrate (which is set to 3/4 of maxBitrate).
It appears that targetBitrate being set to 3/4 of maxBitrate comes from WebRTC's own stream configuration logic. We couldn't set targetBitrate separately in the mobile SDKs, so we are only setting the min/max bitrate per layer.
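For reference, this is roughly how we configure the layers today, shown against the native RtpSender API since the mobile SDK setters map to the same fields (the low-to-high encoding order below is an assumption about our own setup):

#include "api/rtp_parameters.h"
#include "api/rtp_sender_interface.h"

void ConfigureSimulcastBitrates(webrtc::RtpSenderInterface* sender) {
  webrtc::RtpParameters params = sender->GetParameters();
  if (params.encodings.size() == 3) {  // assumed order: low, mid, high
    params.encodings[0].max_bitrate_bps = 100'000;
    params.encodings[1].max_bitrate_bps = 1'000'000;
    params.encodings[2].max_bitrate_bps = 2'000'000;
    // min_bitrate_bps is set similarly; values omitted here.
    // There is no per-encoding target bitrate field, so targetBitrate
    // ends up derived internally (3/4 of maxBitrate in our observation).
  }
  sender->SetParameters(params);
}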
For example, if we set the maxBitrate of each layer to 100 kbps (low), 1000 kbps (mid), and 2000 kbps (high), and the currently available bitrate is sufficient, then the bitrates allocated to low/mid/high come out to 75/750/2000 kbps.
In this case, keeping the existing allocation logic, if we want the mid layer to be sent at 1000 kbps while the high layer is also being sent, we would have to set the mid layer's maxBitrate to roughly 1333 kbps (1000 * 4/3). However, that creates a new problem: whenever the high layer cannot be sent, the mid layer becomes the top active layer and can then be allocated up to 1333 kbps.
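Plugging those numbers into the simplified allocator sketch from earlier in the thread makes the two cases explicit (hypothetical values, just to illustrate):

// Hypothetical walk-through of the 1333 kbps workaround, reusing Layer and
// DistributeSimplified from the earlier sketch.
int main() {
  std::vector<Layer> layers = {
      {75'000, 100'000, true},        // low:  target 75,   max 100 kbps
      {1'000'000, 1'333'000, true},   // mid:  target 1000, max 1333 kbps
      {1'500'000, 2'000'000, true}};  // high: target 1500, max 2000 kbps
  std::vector<uint32_t> alloc(3, 0);

  // Case 1: plenty of bandwidth, high layer active.
  DistributeSimplified(5'000'000, layers, alloc);
  // -> about 75 / 1000 / 2000 kbps: mid stops at targetBitrate, as intended.

  // Case 2: high layer cannot be sent, so mid becomes the top active layer.
  layers[2].active = false;
  std::fill(alloc.begin(), alloc.end(), 0);
  DistributeSimplified(5'000'000, layers, alloc);
  // -> about 75 / 1333 kbps: mid is topped up to maxBitrate,
  //    which is more than the 1000 kbps we actually intend for it.
  return 0;
}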
That's why I was trying to change the simulcast bitrate allocation logic.
Thank you for your quick response. We will discuss within the team how to resolve this issue.
Thank you.