Custom H.264 stream

Julien Lehuraux

Dec 16, 2014, 11:37:38 AM
to discuss...@googlegroups.com
Hi,

I've been studying the WebRTC implementation for a few days now, but I can't figure out how to send my own H.264 stream (without re-transcoding it).

I've seen that I have two options: a video track or an RTCDataChannel, although I've read that the latter may not be optimal for this use case... any advice?

Using peerconnection_client, I've subclassed cricket::VideoCapturer so that I can send a custom stream. I've tried sending FOURCC_H264 frames through SignalFrameCaptured but, as expected, it complains that it cannot convert the frame to YUV I420. I've tried a few more things, like emitting SignalVideoFrame directly and stepping through the code to see what I could change, and I've seen FakeWebRtcVideoEncoder in a unit test... but I'm completely lost from here.

What are the correct steps to achieve what I'm trying to do?
Subclassing VideoFrame, VideoFrameFactory, Encoder?
Any hints?

Ivaylo Tsankov

Dec 20, 2014, 7:25:39 AM
to discuss...@googlegroups.com
There isn't an easy way to send H.264 with the current WebRTC stack. You can check the custom implementation of an H.264 stack embedded in WebRTC (there is a topic about it in the group). The stack is coupled with the vpx stack for encoding and decoding. One way you can do this is to stream raw frames from the WebRTC application to an ffmpeg streaming server, then encode the raw frames there and send them to the client-side application.

Jeremy Noring

Dec 22, 2014, 5:48:05 PM
to discuss...@googlegroups.com
On Saturday, December 20, 2014 5:25:39 AM UTC-7, Ivaylo Tsankov wrote:
There isn't an easy way to send H.264 with the current WebRTC stack. You can check the custom implementation of an H.264 stack embedded in WebRTC (there is a topic about it in the group). The stack is coupled with the vpx stack for encoding and decoding. One way you can do this is to stream raw frames from the WebRTC application to an ffmpeg streaming server, then encode the raw frames there and send them to the client-side application.

It would be good to have a way of doing this, particularly for people who want to use WebRTC in an embedded application where using a hardware encoder is the only reasonable option.  For example, Texas Instruments has a whole line of DSP/SoC solutions that rely heavily on H.264; it'd be awesome if there were some way to hand WebRTC an encoder interface and have it use that.  In my experience as a video engineer, this is also advantageous from a test perspective, because then you can hand your code a "faux" encoder that pukes out known data and make sure you get what you expect on the other end, which is nice for high-level integration tests.

This would also be immediately awesome for iOS, where someone could easily wrap Apple's H.264 hardware encoder.

Julien Lehuraux

Jan 5, 2015, 3:58:06 AM
to discuss...@googlegroups.com
Thank you guys, and happy new year. I managed to dig deeper a few weeks ago: the main problem is that the video engine (both versions 1 and 2) forces the use of YUV I420 frames, and every frame we feed it is converted to YUV I420. There's a big dependency on this YUV format, and IMO that's not a good idea: the color-space conversion takes time (unnecessarily, since whether it is needed depends on the capabilities of the encoder), is only done on the CPU, and is done sequentially in the current implementation (if I recall correctly). Shouldn't the color-space conversion be handled by the encoder?
So indeed, H.264 or any other custom stream cannot easily be supported through the "video track" API with this limitation.

What I did for now was to repeatedly send a dummy buffer to the video engine (2 bytes x width x height: a fake I420 frame) and then handle my real stream in my own encoder. That works OK for my use case (desktop streaming), although I still have some problems with timing / framerate / timestamps that I need to get back to. It's ugly, but it works for now with Firefox and is enough for my PoC. I may fork the whole thing and remove this I420 dependency later if it hasn't been done by then.
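
Roughly, the capture side of that hack looks like the sketch below. It is written against the cricket::VideoCapturer / CapturedFrame API as of this writing; the exact set of required overrides and struct field names vary between checkouts, so treat it as a shape rather than working code. The real H.264 data never goes through the capturer, the dummy frame only keeps the pipeline ticking so the custom encoder keeps getting called:

#include <cstdint>
#include <vector>
#include "talk/media/base/videocapturer.h"
#include "talk/media/base/videocommon.h"

// Sketch: a capturer that only emits oversized, zero-filled "I420" frames.
class DummyI420Capturer : public cricket::VideoCapturer {
 public:
  cricket::CaptureState Start(const cricket::VideoFormat& format) override {
    width_ = format.width;
    height_ = format.height;
    // Oversized on purpose (2 bytes/pixel); real I420 only needs w * h * 3 / 2.
    dummy_.assign(static_cast<size_t>(width_) * height_ * 2, 0);
    running_ = true;
    return cricket::CS_RUNNING;
  }
  void Stop() override { running_ = false; }
  bool IsRunning() override { return running_; }
  bool IsScreencast() const override { return true; }
  bool GetPreferredFourccs(std::vector<uint32>* fourccs) override {
    fourccs->push_back(cricket::FOURCC_I420);
    return true;
  }

  // Call this from a timer at the desired frame rate.
  void PushDummyFrame(int64 time_stamp_ns) {
    cricket::CapturedFrame frame;
    frame.width = width_;
    frame.height = height_;
    frame.fourcc = cricket::FOURCC_I420;
    frame.data = dummy_.data();
    frame.data_size = dummy_.size();
    frame.time_stamp = time_stamp_ns;
    SignalFrameCaptured(this, &frame);
  }

 private:
  bool running_ = false;
  int width_ = 0;
  int height_ = 0;
  std::vector<uint8_t> dummy_;
};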

Jeremy Noring

Jan 5, 2015, 12:08:00 PM
to discuss...@googlegroups.com
On Monday, January 5, 2015 1:58:06 AM UTC-7, Julien Lehuraux wrote:
Thank you guys, and happy new year. I managed to dig deeper a few weeks ago: the main problem is that the video engine (both versions 1 and 2) forces the use of YUV I420 frames, and every frame we feed it is converted to YUV I420. There's a big dependency on this YUV format, and IMO that's not a good idea: the color-space conversion takes time (unnecessarily, since whether it is needed depends on the capabilities of the encoder), is only done on the CPU, and is done sequentially in the current implementation (if I recall correctly). Shouldn't the color-space conversion be handled by the encoder?
So indeed, H.264 or any other custom stream cannot easily be supported through the "video track" API with this limitation.

I'm not sure I follow the issue, but nearly every H.264 encoder I know of prefers some version of YUV I420 (usually the planar version, with the Y plane immediately followed by the V and U planes, fourcc: YV12).  It's an exceptionally common format in the encoding world.  What H.264 encoder are you attempting to use?

Julien Lehuraux

Jan 6, 2015, 3:41:01 AM
to discuss...@googlegroups.com
The stream is already encoded in H.264 (coming from an NVIDIA GPU hardware encoder), and I just want to pass it from the video capturer through the video engine. The current implementation tries to convert the H.264 frames to I420.

Anyway, what if I want to encode an RGB or YUV 4:4:4 stream losslessly? Say, VP9 lossless (YUV 4:4:4), H.264 lossless (OpenH264 may support the High 4:4:4 profile later), or Lagarith (native app)...

Justin Uberti

Jan 6, 2015, 7:15:21 PM
to discuss-webrtc
Did you look into writing a custom webrtc::VideoEncoder plugin for this? That can output a compressed bitstream.
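
Something shaped like this; the sketch follows webrtc/video_encoder.h as of this writing, and the exact signatures move around between revisions, so treat it as the method set rather than a drop-in header. The encoder you register (e.g. through an external encoder factory) keeps the EncodedImageCallback from RegisterEncodeCompleteCallback and hands it compressed frames from Encode:

#include <vector>
#include "webrtc/video_encoder.h"

// Sketch only: a custom encoder that will pass pre-encoded H.264 through.
class PassThroughH264Encoder : public webrtc::VideoEncoder {
 public:
  int32_t InitEncode(const webrtc::VideoCodec* codec_settings,
                     int32_t number_of_cores,
                     size_t max_payload_size) override;
  int32_t Encode(const webrtc::I420VideoFrame& frame,
                 const webrtc::CodecSpecificInfo* codec_specific_info,
                 const std::vector<webrtc::VideoFrameType>* frame_types) override;
  int32_t RegisterEncodeCompleteCallback(
      webrtc::EncodedImageCallback* callback) override;  // stores callback_
  int32_t Release() override;
  int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override;
  int32_t SetRates(uint32_t new_bitrate_kbit, uint32_t frame_rate) override;

 private:
  webrtc::EncodedImageCallback* callback_ = nullptr;
};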

Julien Lehuraux

Jan 7, 2015, 3:21:37 AM
to discuss...@googlegroups.com
Sure, that's actually what I did (rough sketch of the encoder half below):

* A VideoCapturer that sends a dummy I420 frame (no time lost converting anything) to the video engine.
* A VideoEncoder that ignores this dummy frame, grabs the actual H.264 frame from the GPU, and passes it on to the RTP layer.
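
Continuing the VideoEncoder sketch above: GpuFrame, gpu_source_ and BuildFragmentationFromNalus() are placeholders for my NVIDIA-side code, and the webrtc::EncodedImage field names are as of this writing (they have moved around between releases), so this is a sketch of the idea rather than a definitive implementation:

int32_t PassThroughH264Encoder::Encode(
    const webrtc::I420VideoFrame& dummy_frame,
    const webrtc::CodecSpecificInfo* /*codec_specific_info*/,
    const std::vector<webrtc::VideoFrameType>* /*frame_types*/) {
  // Hypothetical source: next already-encoded Annex-B access unit from the GPU.
  GpuFrame gpu = gpu_source_->NextEncodedFrame();

  webrtc::EncodedImage encoded;
  encoded._buffer = gpu.data;   // H.264 bytes, not YUV
  encoded._length = gpu.size;
  encoded._size = gpu.size;
  encoded._encodedWidth = dummy_frame.width();
  encoded._encodedHeight = dummy_frame.height();
  encoded._timeStamp = dummy_frame.timestamp();  // reuse the dummy frame's RTP timestamp
  encoded._frameType = gpu.is_keyframe ? webrtc::kKeyFrame
                                       : webrtc::kDeltaFrame;
  encoded._completeFrame = true;

  // The H.264 RTP packetizer needs NALU boundaries, so describe each NAL unit
  // in a fragmentation header (offsets/lengths from your own bitstream parser).
  webrtc::RTPFragmentationHeader frag;
  BuildFragmentationFromNalus(gpu, &frag);  // hypothetical helper

  callback_->Encoded(encoded, nullptr, &frag);
  return WEBRTC_VIDEO_CODEC_OK;
}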

Kaiduan Xie

Jan 7, 2015, 10:46:17 AM
to discuss...@googlegroups.com
Another way is to add a special I420 frame type whose data is already an encoded H.264 stream, and to handle this special frame throughout the video engine pipeline; the no-op VideoEncoder then just sends the H.264 frame to RTP.

/Kaiduan

Julien Lehuraux

Jan 7, 2015, 12:05:48 PM
to discuss...@googlegroups.com
In the current implementation, the problem with smuggling arbitrary data in as I420 is ViECapturer::IncomingFrameI420 (webrtc/video_engine/vie_capturer, line 309): it memcpys the buffer into 3 separate buffers (the Y, U and V planes)... So extra memcpys (proportional to the sizes of the Y, U and V planes and of the H.264 frame) are then needed to get the data back into a single contiguous buffer, which has some cost. But that is indeed a solution if the stream is not too heavy.
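
To make that cost concrete, this is the kind of re-assembly the encoder would have to do if the bytes were smuggled through as fake planes; a generic sketch not tied to any particular WebRTC revision, one extra memcpy per plane:

#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical helper: rebuild the contiguous byte stream that got split
// across three fake Y/U/V plane buffers on its way through the I420 path.
std::vector<uint8_t> ReassembleFromPlanes(const uint8_t* y, size_t y_size,
                                          const uint8_t* u, size_t u_size,
                                          const uint8_t* v, size_t v_size) {
  std::vector<uint8_t> out(y_size + u_size + v_size);
  std::memcpy(out.data(), y, y_size);
  std::memcpy(out.data() + y_size, u, u_size);
  std::memcpy(out.data() + y_size + u_size, v, v_size);
  return out;
}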

Kaiduan Xie

Jan 7, 2015, 1:45:41 PM
to discuss...@googlegroups.com
The encoded H.264 stream is way smaller than the original YUV data, though :)

/Kaiduan

xfengtes...@gmail.com

Jan 13, 2015, 10:23:18 PM
to discuss...@googlegroups.com
I have the same use case and tried the solution described in this topic. It works, but it's ugly, and there is an auto-bitrate issue.
In this case we replace the I420 frame with an H.264 frame, so it seems no further video processing is needed on the talk/ side; we just need to pass the H.264 frame through the video engine in the webrtc/ part. Is there a way to do this, as in the question I reported in the link below? Or should we extract ICE for the P2P connection and work with the WebRTC video engine and audio engine directly, following vie_autotest?



On Thursday, January 8, 2015 at 2:45:41 AM UTC+8, kaiduan wrote:

RTC.Blacker

Jan 16, 2015, 10:29:17 PM
to discuss...@googlegroups.com
WebRTC will support H.264 later: https://bloggeek.me/winners-losers-no-mti-video-codec-webrtc/
