Hello,
We know WebRTC was designed for p2p, and b-frames increase playback latency (a b-frame references a future frame, so the decoder has to buffer at least one extra frame before it can display it), so H264 in WebRTC should not contain b-frames.
However, modern WebRTC usage goes beyond p2p communication: server-based solutions allow streaming any H264-encoded content to WebRTC players in browsers.
Regular IPTV channels, whose H264 streams contain b-frames, play weirdly with WebRTC: the video jumps back and forth - obviously a b-frame issue.
We see that problem with our Unreal Media Server when we ingest an IPTV HLS stream and stream it to a WebRTC player without video transcoding (audio is transcoded from AAC to Opus using Unreal Live Server).
You might say that playing IPTV via WebRTC is a bad idea, and I don't necessarily disagree, but purely technically - why not? It should be possible.
The native C++ WebRTC code does not provide any way to supply both PTS and DTS with a coded frame; there is just one timestamp on the coded video frame (H264 NALU). I know that.
But the FFmpeg H264 decoder that WebRTC uses can perfectly well decode a stream with b-frames; in fact, any decent H264 decoder can.
The problem is rather on the video renderer side: it does not receive correct presentation timestamps from the decoder, because the pipeline assumes the timestamps passed in RTP are presentation timestamps, which is not true in the presence of b-frames.
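To illustrate, here is a small made-up example with one simple GOP at a 90 kHz clock and 30 fps (3000 ticks per frame); the exact numbers don't matter, the ordering is the essence of the problem:

display (presentation) order:  I0  B1  B2  P3    PTS: 0, 3000, 6000, 9000
transmission/decode order:     I0  P3  B1  B2    PTS: 0, 9000, 3000, 6000 (non-monotonic)

A player that renders each frame at the timestamp it arrived with, in arrival order, jumps forward to P3 and then back to B1 and B2 - exactly the back-and-forth jumping described above.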
So maybe IPTV playback in a WebRTC player is not that practical, but there are high-end live encoders that encode H264 High profile with b-frames. Yes, there is a small latency penalty, but b-frames bring big bandwidth savings, so that is a very practical use case. One readily available way to reproduce the issue: take the free FMLE live encoder, push the encoded video (H264 High profile) to Unreal Media Server, then play it with a WebRTC player.
So this is just a feature suggestion. It would be nice if you didn't force the server side to pass both DTS and PTS (in fact, that is not easy to do via RTP), but instead handled it in the player: let the decoder re-arrange the presentation timestamps and pass them to the video renderer.
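To make the suggestion concrete, here is a minimal sketch of the player-side idea (not actual webrtc code, just an illustration of the technique). It assumes frames are fed to the decoder in decode order, each carrying the single RTP timestamp representing its capture/presentation time, and that the decoder outputs decoded pictures in presentation order, as any conforming H264 decoder does. Attaching the smallest pending timestamp to each decoded picture then restores a monotonic presentation timeline; 32-bit RTP timestamp wraparound is ignored here for simplicity.

#include <cstdint>
#include <functional>
#include <queue>
#include <vector>

class PtsReorderBuffer {
 public:
  // Called when a coded frame (H264 access unit) is pushed into the decoder,
  // in decode order, with the only timestamp we have for it.
  void OnFrameSentToDecoder(uint32_t rtp_timestamp) {
    pending_.push(rtp_timestamp);
  }

  // Called when the decoder outputs a decoded picture (presentation order).
  // Returns the timestamp the renderer should use for this picture.
  uint32_t OnDecodedPicture() {
    uint32_t ts = pending_.top();  // smallest timestamp still in the decoder
    pending_.pop();
    return ts;
  }

 private:
  // Min-heap of timestamps of the frames currently inside the decoder.
  std::priority_queue<uint32_t, std::vector<uint32_t>, std::greater<uint32_t>>
      pending_;
};

Usage would be: call OnFrameSentToDecoder() right before handing a coded frame to the decoder, call OnDecodedPicture() in the decoded-frame callback, and put the returned timestamp on the frame given to the renderer instead of carrying the input timestamp through one-to-one.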