Hi,
I'm working with the Cuttlefish virtual device. Cuttlefish uses native WebRTC to encode Android screen frames (received over the crosvm Wayland socket) and stream the Android UI to a client. It's basically a one-way video call: one peer sends video (and sound), while the other peer sends input events back over a WebRTC data channel.
Now, the problem that hurts the user experience is jerky video after long pauses:
1. Android shows a static view, such as the home screen.
2. The user waits "long enough" on the home screen.
3. The user swipes the home screen.
4. The user sees a jerky screen-transition animation.
The experience becomes smooth again only after some time. I believe WebRTC adapts to the static view (no new frames -> no bandwidth needed for video), and it then takes a while for the bandwidth estimate to ramp back up when a 30-60 fps 720p screen-transition animation suddenly needs to be sent.
What would be the best way to improve the user experience? In my project I can freely patch WebRTC (i.e. /external/webrtc in the Android source tree) as well as the Cuttlefish code that uses it (i.e. /device/google/cuttlefish in the Android source tree).