I'm using x-google-min-bitrate and x-google-max-bitrate to control the bitrate for WebRTC connections. It works fine over UDP, but when streaming over TCP, I need to double the x-google-* values to actually reach the bitrates I specify. I'm wondering if it's because of some sort of TCP overhead, or if it's a bug. I've filed a report here, but I believe it's still waiting to be triaged:
https://bugs.chromium.org/p/chromium/issues/detail?id=717709&can=1&
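For context, this is roughly how I'm munging the SDP to set those attributes. It's a simplified sketch, not my exact code; the function name and the kbps values are just illustrative, and it assumes the video codecs already have a=fmtp: lines:

```typescript
// Append x-google-* bitrate parameters to the fmtp lines in the video m-section.
function setVideoBitrateKbps(sdp: string, minKbps: number, maxKbps: number): string {
  let inVideoSection = false;
  return sdp
    .split("\r\n")
    .map((line) => {
      // Track whether we are currently inside the video m-section.
      if (line.startsWith("m=")) {
        inVideoSection = line.startsWith("m=video");
      }
      // Tack the x-google-* parameters onto each video codec's fmtp line.
      if (inVideoSection && line.startsWith("a=fmtp:")) {
        return `${line};x-google-min-bitrate=${minKbps};x-google-max-bitrate=${maxKbps}`;
      }
      return line;
    })
    .join("\r\n");
}

// Usage (hypothetical peer connection setup):
// const answer = await pc.createAnswer();
// answer.sdp = setVideoBitrateKbps(answer.sdp!, 1000, 2500);
// await pc.setLocalDescription(answer);
```

Over UDP the sender settles at roughly the max I set this way; over TCP (e.g. via a TURN/TCP relay) it sits at about half unless I double the numbers.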
Anyone have any ideas? Thanks so much.