I'm not even using audio; I'm only using video.
All my jitter tests are conducted locally (both peers on the same machine), so packet loss doesn't even enter into the equation.
I've placed debug print statements into the jitter buffer code, and it appears that the jitter buffer increases I observe are simply due to sudden frame size variations (e.g. the gap between max_frame_size and avg_frame_size suddenly widening), which are a natural part of many videos. After a frame size spike like this, the jitter buffer can grow from ~30ms to ~100ms or more, and that kind of latency is very undesirable for certain kinds of apps.
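For reference, here's a simplified model of the behaviour I'm describing. This is not the actual WebRTC code; the function name, the bytes-to-ms slope, and the noise floor values are all assumptions for illustration. It just shows how a widening max/avg frame size gap alone, with zero network jitter, can push an estimate of this shape from ~30ms to ~100ms:

```cpp
// Simplified sketch of the frame-size term in a jitter estimate.
// (Names and constants are illustrative; the real logic in WebRTC's
// jitter_estimation.cc is more involved.)
#include <cstdio>

// Rough shape: a noise floor plus a term proportional to how far the
// largest recent frame exceeds the average frame size.
double JitterEstimateMs(double max_frame_size, double avg_frame_size,
                        double bytes_to_ms,  // assumed channel slope
                        double noise_floor_ms) {
  return bytes_to_ms * (max_frame_size - avg_frame_size) + noise_floor_ms;
}

int main() {
  const double slope = 0.01;     // assumed ms per byte
  const double noise_ms = 10.0;  // assumed noise floor

  // Steady video: max and avg frame sizes stay close together.
  std::printf("steady:      %.1f ms\n",
              JitterEstimateMs(12000, 10000, slope, noise_ms));  // ~30 ms

  // A burst of large frames (scene change, keyframe) widens the gap,
  // and the estimate jumps even though nothing happened on the network.
  std::printf("after spike: %.1f ms\n",
              JitterEstimateMs(19000, 10000, slope, noise_ms));  // ~100 ms
  return 0;
}
```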
Hardcoding the jitter buffer estimate to 30ms in jitter_estimation.cc removes this extra latency and strikes a decent smoothness/latency balance for our particular use case.
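The workaround is essentially just this (illustrative only, not a verbatim diff; the actual function name and signature in your jitter_estimation.cc revision will differ):

```cpp
// Local hack in jitter_estimation.cc: bypass the dynamic estimate and
// always report a fixed 30ms of jitter. (Function name is hypothetical;
// apply the same idea to whatever returns the estimate in your tree.)
double GetJitterEstimateMs() {
  return 30.0;
}
```

A less blunt variant would clamp the dynamic estimate (e.g. with std::min) rather than discarding it entirely, so the buffer could still shrink below 30ms when conditions allow.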
Anyway, long story short, I'd really like some control over the jitter buffer size from the JS side :)