I mostly have concerns over the use of default values here. I'm not sure it makes a lot of sense for different codecs' encoder delegates to handle default values differently. If I ask for a 50Kbps video in h264 vs av1, we _should_ probably get the same bitrate out of our encoders, whether or not we decide there should be a minimum.
if (current_params_.framerate == 0) {
I'm not an expert on the encoder pipeline, but `current_params_` seems like something controlled by whatever JS call is attempting to do encoding. Does it not make sense to move default value handling to the VideoEncodeAccelerator::Config structure? Shouldn't all encoders have these same defaults, assuming they're good defaults to have? Maybe a user-supplied zero frame rate encode request _should_ fail?
// 100 kbps is a reasonable minimum for low-resolution video.
constexpr int kMinTargetBitrateKbps = 100;
As far as I am aware, webcodecs allows these encoders to be used via some JS calls (@eugene, is this correct?). This seems like a very webrtc-specific set of minimum values. It's conceivable that someone might truly want video under 100Kbps; what is the reasoning for deciding this is a "reasonable" minimum? Do other encoders have this same minimum?
// Log order_hint for AMD debugging
nit: Slightly unnecessary.
// Warm-up logging for AMD/Mesa debugging (crbug.com/471780477).
ditto.
// Log DPB state after update for AMD debugging
ditto.
// Log frame header parameters for AMD debugging
ditto.
I mostly have concerns over the use of default values here. I'm not sure it makes a lot of sense for different codecs' encoder delegates to handle default values differently. If I ask for a 50Kbps video in h264 vs av1, we _should_ probably get the same bitrate out of our encoders, whether or not we decide there should be a minimum.
You're right that silently adjusting the bitrate is surprising. The issue is that AV1 on AMD/Mesa produces unusable output at very low bitrates: frames that decoders outright reject, not just low-quality video.
WebRTC sometimes sends a zero or very low bitrate during startup, and failing would break the stream entirely. So the 100 is just a "safe" minimum, and the log at least gives a hint.
@tmath...@chromium.org - thanks a lot for your valid concerns. I am going to play around with it to see if we can get the defaults lower.
if (current_params_.framerate == 0) {
I'm not an expert on the encoder pipeline, but `current_params_` seems like something controlled by whatever JS call is attempting to do encoding. Does it not make sense to move default value handling to the VideoEncodeAccelerator::Config structure? Shouldn't all encoders have these same defaults, assuming they're good defaults to have? Maybe a user-supplied zero frame rate encode request _should_ fail?
VideoEncodeAccelerator already defines `kDefaultFramerate = 30` (https://source.chromium.org/chromium/chromium/src/+/main:media/video/video_encode_accelerator.h;l=231?q=kDefaultFramerate%20%3D%2030%20file:video_encode_accelerator.h&ss=chromium%2Fchromium%2Fsrc), so there's precedent for this default. I could change this to use that constant instead of a magic 30 to make it clearer this is an intentional fallback.
As for whether zero framerate should fail: in WebRTC scenarios, the framerate can legitimately be unknown at initialization time. Failing would break these use cases.
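(For concreteness, a minimal sketch of that fallback as a standalone helper; the function name is invented, and `kDefaultFramerate` mirrors the VideoEncodeAccelerator constant linked above:)

```cpp
#include <cstdint>

// Mirrors VideoEncodeAccelerator::kDefaultFramerate (30); sketch only.
constexpr uint32_t kDefaultFramerate = 30;

// Hypothetical helper: zero means "unset, fall back to the shared
// default"; any explicit nonzero framerate is respected as-is.
uint32_t EffectiveFramerate(uint32_t requested_framerate) {
  return requested_framerate == 0 ? kDefaultFramerate : requested_framerate;
}
```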
// 100 kbps is a reasonable minimum for low-resolution video.
constexpr int kMinTargetBitrateKbps = 100;
As far as I am aware, webcodecs allows these encoders to be used via some JS calls (@eugene, is this correct?). This seems like a very webrtc-specific set of minimum values. It's conceivable that someone might truly want video under 100Kbps; what is the reasoning for deciding this is a "reasonable" minimum? Do other encoders have this same minimum?
100Kbps was the value that made the pipeline stable in testing. The real issue is that WebRTC shows up with 0 bitrate during startup, which breaks the AV1 encoder on AMD/Mesa: it produces frames that decoders reject entirely.
I don't have strong evidence that 100Kbps is the "right" minimum vs something lower like 50Kbps. The goal was just to ensure a non-zero value gets to the encoder.
Happy to lower it to 50Kbps if that seems more reasonable for WebCodecs use cases.
Or alternatively, I could change the check to only kick in when bitrate is literally 0, rather than enforcing a minimum. WDYT?
Helmut Januschka
I mostly have concerns over the use of default values here. I'm not sure it makes a lot of sense for different codecs' encoder delegates to handle default values differently. If I ask for a 50Kbps video in h264 vs av1, we _should_ probably get the same bitrate out of our encoders, whether or not we decide there should be a minimum.
You're right that silently adjusting the bitrate is surprising. The issue is that AV1 on AMD/Mesa produces unusable output at very low bitrates: frames that decoders outright reject, not just low-quality video.
WebRTC sometimes sends a zero or very low bitrate during startup, and failing would break the stream entirely. So the 100 is just a "safe" minimum, and the log at least gives a hint.
It's my understanding that webrtc isn't the only user of these encoders. I took a cursory look through the bug, but wasn't able to find out what kind of very-low/non-zero framerates we're talking about, or where the threshold is for outputting crap frames.
Also, it might be good to update the commit message to mention that this CL is changing the meaning of 0FPS and 0Kbps in addition to the dpb frame fix.
if (current_params_.framerate == 0) {
Helmut Januschka
I'm not an expert on the encoder pipeline, but `current_params_` seems like something controlled by whatever JS call is attempting to do encoding. Does it not make sense to move default value handling to the VideoEncodeAccelerator::Config structure? Shouldn't all encoders have these same defaults, assuming they're good defaults to have? Maybe a user-supplied zero frame rate encode request _should_ fail?
VideoEncodeAccelerator already defines `kDefaultFramerate = 30` (https://source.chromium.org/chromium/chromium/src/+/main:media/video/video_encode_accelerator.h;l=231?q=kDefaultFramerate%20%3D%2030%20file:video_encode_accelerator.h&ss=chromium%2Fchromium%2Fsrc), so there's precedent for this default. I could change this to use that constant instead of a magic 30 to make it clearer this is an intentional fallback.
As for whether zero framerate should fail: in WebRTC scenarios, the framerate can legitimately be unknown at initialization time. Failing would break these use cases.
I suppose I am ok with treating 0 for framerate (and bitrate) as a "do whatever you think is best mr. encoder" message. Maybe it makes sense to ensure that they are both zero to enable this behavior? So if the caller requests 50Kbps and 0FPS, they get a legitimate error, since that makes no sense. If both are zero, then use defaults.
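(A hypothetical sketch of that both-or-neither rule, with invented names and the defaults discussed in this thread, just to make the proposal concrete:)

```cpp
#include <cstdint>
#include <optional>

struct EncodeRates {
  uint32_t bitrate_kbps;
  uint32_t framerate;
};

// Hypothetical: both zero means "encoder's choice", so the defaults from
// this thread apply; exactly one zero (e.g. 50Kbps at 0FPS) makes no
// sense and is reported as an error via nullopt.
std::optional<EncodeRates> ResolveRates(uint32_t bitrate_kbps,
                                        uint32_t framerate) {
  if (bitrate_kbps == 0 && framerate == 0)
    return EncodeRates{/*bitrate_kbps=*/100, /*framerate=*/30};
  if (bitrate_kbps == 0 || framerate == 0)
    return std::nullopt;
  return EncodeRates{bitrate_kbps, framerate};
}
```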
// 100 kbps is a reasonable minimum for low-resolution video.
constexpr int kMinTargetBitrateKbps = 100;
Helmut Januschka
As far as I am aware, webcodecs allows these encoders to be used via some JS calls (@eugene, is this correct?). This seems like a very webrtc-specific set of minimum values. It's conceivable that someone might truly want video under 100Kbps; what is the reasoning for deciding this is a "reasonable" minimum? Do other encoders have this same minimum?
100Kbps was the value that made the pipeline stable in testing. The real issue is that WebRTC shows up with 0 bitrate during startup, which breaks the AV1 encoder on AMD/Mesa: it produces frames that decoders reject entirely.
I don't have strong evidence that 100Kbps is the "right" minimum vs something lower like 50Kbps. The goal was just to ensure a non-zero value gets to the encoder.
Happy to lower it to 50Kbps if that seems more reasonable for WebCodecs use cases.
Or alternatively, I could change the check to only kick in when bitrate is literally 0, rather than enforcing a minimum. WDYT?
My gut feeling here is that "zero -> 100" is better than "max(value, 100)", since webrtc isn't the only thing using this file, and undocumented codec/platform-specific combinations are going to be a debugging nightmare later. In a perfect world, this would be an optional<int>, so webrtc could legitimately say "give me whatever you think it should be", while still allowing anyone to explicitly set a value to whatever they'd like. Treating 0 as that optional is probably ok though, since it does not actually make sense to have a 0Kbps video.
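(To make the contrast concrete, a sketch of the optional-based shape versus the max() policy it argues against; helper names are hypothetical:)

```cpp
#include <algorithm>
#include <cstdint>
#include <optional>

constexpr uint32_t kDefaultBitrateKbps = 100;

// Preferred shape: an empty optional means "give me whatever you think
// it should be"; an explicit value, however low, passes through untouched.
uint32_t ResolveBitrateKbps(std::optional<uint32_t> requested_kbps) {
  return requested_kbps.value_or(kDefaultBitrateKbps);
}

// The alternative argued against above: max() silently overrides an
// explicit low request like 50 kbps.
uint32_t ClampedBitrateKbps(uint32_t requested_kbps) {
  return std::max(requested_kbps, kDefaultBitrateKbps);
}
```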
@tmath...@chromium.org - changed to an "only use defaults if 0" approach. Could you take a look and see whether that is more feasible and less limited/magic?
Looks mostly good, just remove some superfluous comments and fix up the signed int types.
The GPU process crashes (exit_code=8) when initializing the AV1 VA-API
encoder on AMD GPUs with Mesa drivers. This occurs because AMD's
stateless AV1 encoder driver is sensitive to the DPB (Decoded Picture
Buffer) state and may access reference frame slots even when encoding
a keyframe.
This CL also changes the handling of 0 framerate and 0 bitrate values:
- 0 is now treated as "use default" rather than enforcing a minimum
- Framerate defaults to VideoEncodeAccelerator::kDefaultFramerate (30fps)
- Bitrate defaults to 100 kbps
- Explicitly requested low values (e.g., 50 kbps) are now respected
This allows callers like WebRTC to signal "encoder's choice" when the
actual values are not yet known, while still supporting explicit low
bitrate/framerate configurations for WebCodecs use cases.
I recommend just un-indenting this manually or the auto-formatter will do weird things to it.
constexpr int kDefaultBitrateKbps = 100;
unsigned type
constexpr int kAmdWarmupFrameCount = 15; // ~0.5 seconds at 30fps for QP limit
unsigned type
int target_bandwidth_kbps =
uint32
int layer_bitrate_kbps = bitrate_sum / 1000;
Change this (and bitrate_sum) to uint64_t. `bitrate_allocation.GetBitrateBps` already returns an unsigned 32 bit int, so there should never be a case where this is negative.
Then you can drop the " > 0" component in your expression below.
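(A sketch of the suggested types; the accumulation loop shape is assumed, since only the two declarations appear in this review:)

```cpp
// Assumed shape of the accumulation (loop bounds and names are not shown
// in this review). GetBitrateBps() returns an unsigned 32-bit value, so a
// uint64_t sum can never be negative and the "> 0" guard is unnecessary.
uint64_t bitrate_sum = 0;
for (size_t tid = 0; tid <= temporal_layer_id; ++tid)
  bitrate_sum += bitrate_allocation.GetBitrateBps(spatial_layer_id, tid);
uint64_t layer_bitrate_kbps = bitrate_sum / 1000;
```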
// Log order_hint for AMD debugging
remove
// Warm-up logging for AMD/Mesa debugging (crbug.com/471780477).
remove.
// Log DPB state after update for AMD debugging
remove
// Log frame header parameters for AMD debugging
remove
I mostly have concerns over the use of default values here. I'm not sure it makes a lot of sense for different codecs' encoder delegates to handle default values differently. If I ask for a 50Kbps video in h264 vs av1, we _should_ probably get the same bitrate out of our encoders, whether or not we decide there should be a minimum.
Ted (Chromium) Meyer
You're right that silently adjusting the bitrate is surprising. The issue is that AV1 on AMD/Mesa produces unusable output at very low bitrates: frames that decoders outright reject, not just low-quality video.
WebRTC sometimes sends a zero or very low bitrate during startup, and failing would break the stream entirely. So the 100 is just a "safe" minimum, and the log at least gives a hint.
It's my understanding that webrtc isn't the only user of these encoders. I took a cursory look through the bug, but wasn't able to find out what kind of very-low/non-zero framerates we're talking about, or where the threshold is for outputting crap frames.
Also, it might be good to update the commit message to mention that this CL is changing the meaning of 0FPS and 0Kbps in addition to the dpb frame fix.
Done
The GPU process crashes (exit_code=8) when initializing the AV1 VA-API
encoder on AMD GPUs with Mesa drivers. This occurs because AMD's
stateless AV1 encoder driver is sensitive to the DPB (Decoded Picture
Buffer) state and may access reference frame slots even when encoding
a keyframe.
This CL also changes the handling of 0 framerate and 0 bitrate values:
- 0 is now treated as "use default" rather than enforcing a minimum
- Framerate defaults to VideoEncodeAccelerator::kDefaultFramerate (30fps)
- Bitrate defaults to 100 kbps
- Explicitly requested low values (e.g., 50 kbps) are now respected
This allows callers like WebRTC to signal "encoder's choice" when the
actual values are not yet known, while still supporting explicit low
bitrate/framerate configurations for WebCodecs use cases.
I recommend just un-indenting this manually or the auto-formatter will do weird things to it.
Done
if (current_params_.framerate == 0) {
Helmut Januschka
I'm not an expert on the encoder pipeline, but `current_params_` seems like something controlled by whatever JS call is attempting to do encoding. Does it not make sense to move default value handling to the VideoEncodeAccelerator::Config structure? Shouldn't all encoders have these same defaults, assuming they're good defaults to have? Maybe a user-supplied zero frame rate encode request _should_ fail?
VideoEncodeAccelerator already defines `kDefaultFramerate = 30` (https://source.chromium.org/chromium/chromium/src/+/main:media/video/video_encode_accelerator.h;l=231?q=kDefaultFramerate%20%3D%2030%20file:video_encode_accelerator.h&ss=chromium%2Fchromium%2Fsrc), so there's precedent for this default. I could change this to use that constant instead of a magic 30 to make it clearer this is an intentional fallback.
Ted (Chromium) Meyer
As for whether zero framerate should fail: in WebRTC scenarios, the framerate can legitimately be unknown at initialization time. Failing would break these use cases.
Helmut Januschka
I suppose I am ok with treating 0 for framerate (and bitrate) as a "do whatever you think is best mr. encoder" message. Maybe it makes sense to ensure that they are both zero to enable this behavior? So if the caller requests 50Kbps and 0FPS, they get a legitimate error, since that makes no sense. If both are zero, then use defaults.
I considered requiring both to be zero, but WebRTC can legitimately have one known and one unknown at init time (e.g., a known target bitrate but an unknown framerate). Treating each independently seems more flexible for real-world scenarios.
// 100 kbps is a reasonable minimum for low-resolution video.
constexpr int kMinTargetBitrateKbps = 100;
Helmut Januschka
As far as I am aware, webcodecs allows these encoders to be used via some JS calls (@eugene, is this correct?). This seems like a very webrtc-specific set of minimum values. It's conceivable that someone might truly want video under 100Kbps; what is the reasoning for deciding this is a "reasonable" minimum? Do other encoders have this same minimum?
Ted (Chromium) Meyer
100Kbps was the value that made the pipeline stable in testing. The real issue is that WebRTC shows up with 0 bitrate during startup, which breaks the AV1 encoder on AMD/Mesa: it produces frames that decoders reject entirely.
I don't have strong evidence that 100Kbps is the "right" minimum vs something lower like 50Kbps. The goal was just to ensure a non-zero value gets to the encoder.
Happy to lower it to 50Kbps if that seems more reasonable for WebCodecs use cases.
Or alternatively, I could change the check to only kick in when bitrate is literally 0, rather than enforcing a minimum. WDYT?
My gut feeling here is that "zero -> 100" is better than "max(value, 100)", since webrtc isn't the only thing using this file, and undocumented codec/platform-specific combinations are going to be a debugging nightmare later. In a perfect world, this would be an optional<int>, so webrtc could legitimately say "give me whatever you think it should be", while still allowing anyone to explicitly set a value to whatever they'd like. Treating 0 as that optional is probably ok though, since it does not actually make sense to have a 0Kbps video.
Done
constexpr int kDefaultBitrateKbps = 100;
Helmut Januschka
unsigned type
Done
constexpr int kAmdWarmupFrameCount = 15; // ~0.5 seconds at 30fps for QP limit
Helmut Januschka
unsigned type
Done
int target_bandwidth_kbps =
Helmut Januschka
uint32
Done
Change this (and bitrate_sum) to uint64_t. `bitrate_allocation.GetBitrateBps` already returns an unsigned 32 bit int, so there should never be a case where this is negative.
Then you can drop the " > 0" component in your expression below.
Done
// Log order_hint for AMD debugging
Helmut Januschka
remove
Done
// Warm-up logging for AMD/Mesa debugging (crbug.com/471780477).
Helmut Januschka
remove.
Done
// Log DPB state after update for AMD debugging
Helmut Januschka
remove
Done
// Log frame header parameters for AMD debugging
Helmut Januschka
remove
Done
Code-Review +1
I'll also add Eugene as a reviewer since he does a lot with encoders.
// Log order_hint for AMD debugging
Helmut Januschka
remove
Ted (Chromium) Meyer
Done
AH! I had intended this to mean "remove the comment", not "remove the log". Comments should be "why" and not "what"; the raison d'être of a log is self-evident.
Anyway, you can put the logs back in (though making them DVLOGF(2) or DVLOGF(3) might be a bit better). Feel free to leave them out if you don't think it's important, though.
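(For example, a hypothetical verbose-only version of one of those logs; `DVLOGF` is the media/gpu logging macro, and the field access is assumed:)

```cpp
// Verbose-only: useful when debugging AMD/Mesa (crbug.com/471780477),
// but silent by default and compiled out of release builds.
DVLOGF(3) << "order_hint=" << static_cast<int>(frame_header.order_hint);
```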
// Log order_hint for AMD debugging
Helmut Januschka
remove
Ted (Chromium) Meyer
Done
AH! I had intended this to mean "remove the comment", not "remove the log". Comments should be "why" and not "what"; the raison d'être of a log is self-evident.
Anyway, you can put the logs back in (though making them DVLOGF(2) or DVLOGF(3) might be a bit better). Feel free to leave them out if you don't think it's important, though.
Oh, my fault. Yes, they will be useful for "me in 3 months" 😂
Hi, I am a bit swarmed these days but I will review this in a week if that's ok.
Hi, I am a bit swarmed these days but I will review this in a week if that's ok.
Supporting a resolution change in this class seems strange.
VaapiVEA doesn't support dynamic resolution changes.
If that actually happens, the webrtc encoder client (RTCVideoEncoder) doesn't work as expected.
Commit-Queue +1