Android simulcast resolution problem with TextureBufferImpl


Daniel Novak

Nov 6, 2023, 8:22:07 AM11/6/23
to discuss-webrtc
We are running into a strange but relatively easy-to-reproduce issue with basically any WebRTC version plus the standard Java wrapper.

It only happens if you use simulcast and stream, for example, three resolutions from your device instead of just one. Portrait works fine, and rotating the screen to the right works well, but rotating the screen to the left causes the HardwareVideoEncoder to essentially stop working (locally everything looks fine, but other participants will not see you).
Streaming just one resolution instead of three fixes the problem (this is configured with the standard RtpTransceiverInit). It may sound strange that only a left rotation triggers it, but that is because different buffers are used in that case due to some optimisations (TextureBufferImpl vs WrappedNativeI420Buffer).
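For reference, a minimal sketch of the kind of simulcast configuration involved (the rids, scale factors, and stream id are illustrative, not our exact values; the types are the standard org.webrtc ones):

```java
import org.webrtc.RtpParameters;
import org.webrtc.RtpTransceiver;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// Three simulcast layers via scaleResolutionDownBy (values illustrative).
List<RtpParameters.Encoding> encodings = Arrays.asList(
    new RtpParameters.Encoding("q", /* active= */ true, /* scaleResolutionDownBy= */ 4.0),
    new RtpParameters.Encoding("h", true, 2.0),
    new RtpParameters.Encoding("f", true, 1.0));

RtpTransceiver.RtpTransceiverInit init = new RtpTransceiver.RtpTransceiverInit(
    RtpTransceiver.RtpTransceiverDirection.SEND_ONLY,
    Collections.singletonList("stream0"),
    encodings);

// peerConnection.addTransceiver(videoTrack, init);
```

Dropping the encodings list down to a single entry is what makes the problem disappear.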

The Java WebRTC code switches between TextureBuffers and ByteBuffers, and it is normal behaviour that in left-landscape rotation the HardwareVideoEncoder starts using a TextureBuffer instead. In the HardwareVideoEncoder.encode(...) function you will find the following, and you can verify that useSurfaceMode is false in portrait but true in left-landscape (for whatever optimisation reasons):

if (useSurfaceMode) {
  returnValue = encodeTextureBuffer(videoFrame, presentationTimestampUs);
} else {
  returnValue = encodeByteBuffer(videoFrame, presentationTimestampUs);
}

What we are seeing is that the following function is called with VideoFrame objects that have just one resolution instead of all three requested:

public VideoCodecStatus encode(VideoFrame videoFrame, EncodeInfo encodeInfo) {
// VideoFrame resolution here is incorrect - it should be alternating between
// 3 resolutions (simulcast) but instead it's just one resolution.
// The difference is in the videoFrame.getBuffer() -
// in portrait it's WrappedNativeI420Buffer, in landscape it's
// TextureBufferImpl and this one seems to be bugged

The WrappedNativeI420Buffer seems to behave correctly: it has the correct resolution. But TextureBufferImpl always has just the highest resolution set, and it never changes.
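To make the expected behaviour concrete, here is a tiny self-contained sketch (our own model, not WebRTC code) of how the per-layer sizes follow from the capture size and the scaleResolutionDownBy factors. encode() should be alternating between these three sizes, whereas the TextureBufferImpl we observe always reports the full capture size:

```java
public class SimulcastLayerSizes {
    // Derive a layer's dimensions from the capture size, mimicking
    // scaleResolutionDownBy. Purely illustrative.
    static int[] layerSize(int captureWidth, int captureHeight, double scaleDownBy) {
        return new int[] {
            (int) Math.round(captureWidth / scaleDownBy),
            (int) Math.round(captureHeight / scaleDownBy)
        };
    }

    public static void main(String[] args) {
        double[] factors = {4.0, 2.0, 1.0};
        for (double f : factors) {
            int[] wh = layerSize(1280, 720, f);
            System.out.println(wh[0] + "x" + wh[1]);
        }
        // Prints 320x180, 640x360, 1280x720: three distinct resolutions,
        // one per simulcast layer, which is what encode() should see.
    }
}
```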

private TextureBufferImpl(int unscaledWidth, int unscaledHeight, int width, int height, Type type,
int id, Matrix transformMatrix, Handler toI420Handler, YuvConverter yuvConverter,
RefCountMonitor refCountMonitor) {
this.unscaledWidth = unscaledWidth;
this.unscaledHeight = unscaledHeight;
this.width = width;
this.height = height;

If you look at how this TextureBufferImpl is created, you will see that width/height and unscaledWidth/unscaledHeight are basically "constants": width and height are set to the texture size, of which there is only one (the actual camera resolution). This doesn't seem to cover the case where a simulcast setting requests multiple resolutions. WebRTC native then takes this TextureBufferImpl instance and does essentially nothing with it (since in left-landscape there is no I420 conversion, so the WrappedNativeI420Buffer constructor is never called), feeds it back to the encoder, and the encoder stops encoding here:
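As a possible (untested) workaround, one could force every frame through the byte-buffer path by converting texture buffers to I420 before they reach the encoder, for example with a VideoProcessor attached to the VideoSource. This is only a sketch against the org.webrtc API and it trades the surface-mode optimisation away:

```java
import org.webrtc.VideoFrame;
import org.webrtc.VideoProcessor;
import org.webrtc.VideoSink;

// Sketch: convert TextureBuffer frames to I420 so HardwareVideoEncoder
// never takes the surface-mode (useSurfaceMode) path. Untested idea.
class ForceI420Processor implements VideoProcessor {
    private VideoSink sink;

    @Override public void setSink(VideoSink sink) { this.sink = sink; }
    @Override public void onCapturerStarted(boolean success) {}
    @Override public void onCapturerStopped() {}

    @Override
    public void onFrameCaptured(VideoFrame frame) {
        if (sink == null) return;
        if (frame.getBuffer() instanceof VideoFrame.TextureBuffer) {
            VideoFrame.Buffer i420 = frame.getBuffer().toI420();
            VideoFrame converted =
                new VideoFrame(i420, frame.getRotation(), frame.getTimestampNs());
            sink.onFrame(converted);
            converted.release();
        } else {
            sink.onFrame(frame);
        }
    }
}

// Usage: videoSource.setVideoProcessor(new ForceI420Processor());
```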

        if (outputBuilders.size() > MAX_ENCODER_Q_SIZE) {
            // Too many frames in the encoder.  Drop this frame.
            Logging.e(TAG, "Dropped frame, encoder queue full");
            return VideoCodecStatus.NO_OUTPUT; // See webrtc bug 2887.
        }
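For context, this guard means that once MediaCodec stops producing output (as it apparently does with the bad TextureBuffer input), nothing ever drains outputBuilders, so after a handful of frames every subsequent frame is dropped. A tiny self-contained model (our own names; the MAX_ENCODER_Q_SIZE value of 2 is assumed from the WebRTC sources):

```java
import java.util.ArrayDeque;

public class EncoderQueueModel {
    // HardwareVideoEncoder keeps pending output builders in a small queue;
    // the limit here (2) is assumed to match the real MAX_ENCODER_Q_SIZE.
    static final int MAX_ENCODER_Q_SIZE = 2;
    final ArrayDeque<Long> outputBuilders = new ArrayDeque<>();

    /** Returns true if the frame was accepted, false if dropped. */
    boolean encode(long timestampUs) {
        if (outputBuilders.size() > MAX_ENCODER_Q_SIZE) {
            return false; // "Dropped frame, encoder queue full"
        }
        outputBuilders.add(timestampUs);
        return true;
    }

    public static void main(String[] args) {
        EncoderQueueModel m = new EncoderQueueModel();
        // With no output callbacks ever removing entries, the first few
        // frames are accepted and everything after that is dropped.
        for (int i = 0; i < 5; i++) {
            System.out.println("frame " + i + " accepted=" + m.encode(i * 33_000L));
        }
    }
}
```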

I am almost sure this is a WebRTC Android bug that is only visible when you stream multiple resolutions and only in certain device orientations (left-landscape). It should be easy to reproduce in a small sample (I can try to set one up).
