WebGL Canvas to WebRTC outgoing stream?


Thor Harald Johansen

Mar 7, 2014, 11:33:25 AM
to discuss...@googlegroups.com
I wrote a long post about this and it apparently disappeared, so this one will be shorter: can we get a way to render to a canvas and encode the result as a stream? It would enable vision mixing in the browser. There are good reasons why this is a needed feature.

Silvia Pfeiffer

Mar 8, 2014, 4:23:36 AM
to discuss...@googlegroups.com
Might be a good discussion to start on WHATWG or public-html at W3C.
Silvia.

Jesús Leganés Combarro

Mar 11, 2014, 7:19:41 PM
to discuss...@googlegroups.com
> Can we get a way to render to a canvas and encode this as a stream? It would enable vision mixing in the browser. There are good reasons as to why this is a needed feature.

You can render WebGL to a canvas, generate a stream from the frames, and send it over WebRTC. I have some similar experiments on my to-do list; the building blocks are already available or are on the way :-)
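A minimal sketch of this approach, assuming browser support for HTMLCanvasElement.captureStream() (which landed in browsers after this thread was written, via the Media Capture from DOM Elements spec); the `pc` parameter and the `streamCanvas` helper name are illustrative, not part of any existing API:

```javascript
// Sketch: pipe a canvas's rendered frames into a WebRTC peer connection.
// `canvas` is any HTMLCanvasElement (2D or WebGL context), `pc` an
// RTCPeerConnection you have already created.
function streamCanvas(canvas, pc, fps = 30) {
  // captureStream() returns a MediaStream whose video track reflects
  // whatever is drawn on the canvas, at up to `fps` frames per second.
  const stream = canvas.captureStream(fps);
  for (const track of stream.getVideoTracks()) {
    // Attach the track so it is negotiated as outgoing video.
    pc.addTrack(track, stream);
  }
  return stream;
}
```

After this, the usual offer/answer exchange on the peer connection sends the canvas contents to the remote side as an ordinary video track.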

pablo

May 1, 2014, 12:20:51 PM
to discuss...@googlegroups.com


On Wednesday, March 12, 2014 1:19:41 AM UTC+2, Jesús Leganés Combarro wrote:
> Can we get a way to render to a canvas and encode this as a stream? It would enable vision mixing in the browser. There are good reasons as to why this is a needed feature.

> You can render WebGL to a canvas, generate a stream from the frames, and send it over WebRTC. I have some similar experiments on my to-do list, the bricks are already available or they are in the way :-)

Can you explain how to send a stream from a canvas?