Using WebRTC to build a sort of basic client/server video surveillance software


Matteo Battaglio

May 25, 2012, 5:47:59 AM5/25/12
to discuss...@googlegroups.com
Hi everybody,
I was investigating whether it would be feasible to use WebRTC to build a web-based client/server application that does the following:
1. the server receives video streams from non-webrtc sources (e.g. IPCameras);
2. it transcodes them to vp8 if necessary;
3. it sends them to each connected client (browser) through WebRTC.
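Concretely, for steps 1 and 2 I imagine something along the lines of this GStreamer pipeline (just an untested sketch on my part; the camera URL is made up, and element names may vary between GStreamer versions):

```
# Sketch: ingest one RTSP camera, decode, and re-encode to VP8.
# rtsp://192.168.1.10/stream is a placeholder address; the filesink at
# the end is only a stand-in for whatever would hand the stream to step 3.
gst-launch rtspsrc location=rtsp://192.168.1.10/stream ! \
    decodebin ! ffmpegcolorspace ! vp8enc ! webmmux ! \
    filesink location=camera1.webm
```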

Is that achievable, in your opinion? Do you have any advice to help me get started?
I suppose I should use the Native APIs, right?

Thanks in advance!

David Narvaez

May 25, 2012, 1:54:37 PM5/25/12
to discuss...@googlegroups.com

I wouldn't mess with WebRTC for a task that is a straightforward application of icecast or fluendo.

David E. Narvaez

Matteo Battaglio

May 26, 2012, 5:28:59 AM5/26/12
to discuss...@googlegroups.com
The key point of my idea was to build a web client able to decode and render multiple live videos at the same time, without requiring any plugins, using only standard web/HTML5 technologies; as far as I know, this is only achievable through WebRTC.
That's why I was looking for a way to communicate with WebRTC from the server side.
In your opinion, is it possible to do so with GStreamer or Icecast?

David Narvaez

May 26, 2012, 7:06:16 AM5/26/12
to discuss...@googlegroups.com
On Sat, May 26, 2012 at 4:28 AM, Matteo Battaglio <matteo.b...@gmail.com> wrote:
> The key point of my idea was to build a web client able to decode and render multiple live videos at the same time, without requiring any plugins, using only standard web/HTML5 technologies; as far as I know, this is only achievable through WebRTC.
> That's why I was looking for a way to communicate with WebRTC from the server side.
> In your opinion, is it possible to do so with GStreamer or Icecast?

Sure. Take those cameras, stream their content to different mounts on an IceCast server, and setup a web server to display a number of nicely arranged <video> tags all pointing to the different mounts on the IceCast server. It shouldn't take you long to set up a barebones demo.
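To make that concrete, the page could be as simple as this (a rough sketch; the host and mount names below are made up):

```html
<!-- Each <video> points at a different mount on the same IceCast server. -->
<!-- streaming.example.com and the camera*.webm mounts are placeholders. -->
<video src="http://streaming.example.com:8000/camera1.webm" autoplay controls></video>
<video src="http://streaming.example.com:8000/camera2.webm" autoplay controls></video>
<video src="http://streaming.example.com:8000/camera3.webm" autoplay controls></video>
```

Arrange them with a bit of CSS and you have your monitoring grid.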

You should really read again about the purpose of WebRTC, and take a closer look at streaming projects: they have nothing to do with standard/HTML5 technologies, but everything to do with streaming, which is all you really need. Besides, WebRTC is far from a finished standard, and you won't find a mainstream browser today with stable support for it (not that you need it, though).

And I meant Flumotion, not Fluendo, in my first e-mail - sorry for that!

David E. Narváez

Matteo Battaglio

May 27, 2012, 4:57:50 PM5/27/12
to discuss...@googlegroups.com
How would I point the <video> tags at the live video streams without using the WebRTC protocol client-side?
Wasn't WebRTC created to bring real-time audio/video capabilities to HTML, which otherwise lacks them?

Lorenzo Miniero

May 28, 2012, 1:55:39 AM5/28/12
to discuss-webrtc
Live video in an HTML5 <video> tag can already be achieved, as David
explained. Pick one of the available live-streaming solutions, and use
the live video mountpoint (which will be an HTTP address) in the video
tag. What WebRTC adds to this is bidirectionality.

L.

Matteo Battaglio

May 28, 2012, 12:42:57 PM5/28/12
to discuss...@googlegroups.com
I apologize for my evident ignorance on the subject, but I'm a bit confused, and I'd like to get a better understanding of the current state of live video in HTML.
As far as I know, live video support in HTML is not there yet or, if you prefer, it's there by means of Apple's HTTP Live Streaming draft, which presents some issues according to (for example) these resources I found:
Furthermore, the following confuses me even more:
Comment #39 on the same issue states: "You are going to support RTP+SIP in the form of webrtc. [...]"
Comment #40 links to issue 109652: Support MPEG-DASH; comment #2 on that issue states: "We plan to support DASH and similar adaptive streaming solutions by using a JavaScript library and the MediaSource API [...]".

So, according to what I read on the internet, I thought that WebRTC was created (among other things, but not only) to become the standard way to provide live streams to HTML pages.
But it seems I'm the one missing something, so could you please clarify the situation a bit? Should I look at the MediaSource API? Or is it now possible to use RTP URLs with the video tag, without having to rely on WebRTC? Or do I ultimately have to stick with HTTP Live Streaming for my purposes?

Thanks, and excuse me again if I'm going off topic!

David Narvaez

May 27, 2012, 5:58:54 PM5/27/12
to discuss...@googlegroups.com
On Sun, May 27, 2012 at 3:57 PM, Matteo Battaglio <matteo.b...@gmail.com> wrote:
> How would I point the <video> tags at the live video streams without using the WebRTC protocol client-side?
> Wasn't WebRTC created to bring real-time audio/video capabilities to HTML, which otherwise lacks them?

It's very easy; I even compiled various sources of information for you at this link:


David E. Narváez