Aleks, did you get any further? I am interested in finding out whether it is possible to send audio input to a server, which would then broadcast it to multiple clients via WebSockets.
If you are familiar with GStreamer: at Kurento we have just released webrtcbin, a GStreamer component capable of receiving WebRTC flows into a GStreamer pipeline. You can download the source here: https://code.google.com/p/kurento/source/checkout?repo=gst-plugins-webrtc
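
For reference, below is a minimal sketch of how such an element might be dropped into a receiving pipeline. The GStreamer boilerplate (element creation, the pad-added callback, the main loop) is standard API; the element name "webrtcbin" comes from the announcement above, but its properties and the signalling/SDP exchange it needs are assumptions and are left out entirely.

/* Minimal sketch only: assumes the plugin registers an element named
 * "webrtcbin" that exposes dynamic src pads once a WebRTC session has
 * been negotiated.  Signalling (SDP/ICE exchange) is out of scope. */
#include <gst/gst.h>

static void
on_pad_added (GstElement *webrtc, GstPad *new_pad, gpointer user_data)
{
  GstElement *sink = GST_ELEMENT (user_data);
  GstPad *sinkpad = gst_element_get_static_pad (sink, "sink");

  /* Link whatever stream appears to a dummy consumer; a real receiver
   * would inspect the caps and branch audio/video separately. */
  if (!gst_pad_is_linked (sinkpad))
    gst_pad_link (new_pad, sinkpad);

  gst_object_unref (sinkpad);
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *pipeline = gst_pipeline_new ("webrtc-receiver");
  GstElement *webrtc   = gst_element_factory_make ("webrtcbin", "recv");
  GstElement *sink     = gst_element_factory_make ("fakesink", "sink");

  if (!pipeline || !webrtc || !sink) {
    g_printerr ("Failed to create elements; is the plugin installed?\n");
    return -1;
  }

  gst_bin_add_many (GST_BIN (pipeline), webrtc, sink, NULL);
  g_signal_connect (webrtc, "pad-added", G_CALLBACK (on_pad_added), sink);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run until interrupted; a real application would watch the bus
   * for errors and EOS. */
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}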
On Saturday, February 9, 2013 at 23:22:57 UTC+1, Aleks Clark wrote:

Greetings,

I would like to be able to instantiate a 'receiver' for WebRTC streams that receives A/V and passes it off to ffmpeg/vlc/whatever to be transcoded, mixed with other audio, etc. From what I've seen, my best bet seems to be to take the libjingle peerconnection_client example and hack on it to do what I want. Since I'm pretty new to multimedia programming, can anyone provide guidance on this? Should I just pass the raw VP8/Opus stream off to STDIO or a unix socket?

Secondly, it seems a major failing that, with all the effort put into peer-to-peer connections, the simpler peer-to-server use case is ignored and/or undocumented. But maybe I'm not looking in the right places?
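
Following up on the STDIO/unix-socket question in the quoted message: one way to prototype the hand-off is to pipe whatever encoded payload you manage to pull out of a hacked peerconnection_client into an external ffmpeg process over its stdin. The snippet below is only a sketch of that plumbing under those assumptions; start_transcoder()/write_frame() are hypothetical helpers, not libjingle API, and the ffmpeg options are placeholders that would need correct framing/container settings for raw VP8/Opus.

/* Sketch only, not libjingle API.  write_frame() is a hypothetical place
 * to hand off raw encoded payloads from the receive path; the ffmpeg
 * command is a placeholder. */
#include <stdio.h>
#include <stdlib.h>

static FILE *transcoder;

/* Spawn ffmpeg once; it reads the raw stream from its stdin ("pipe:0"). */
static void start_transcoder (void)
{
  transcoder = popen ("ffmpeg -i pipe:0 -c copy out.webm", "w");
  if (!transcoder) {
    perror ("popen");
    exit (1);
  }
}

/* Hypothetical hook: call this with each encoded VP8/Opus frame as it
 * becomes available in the receiver. */
static void write_frame (const void *data, size_t len)
{
  if (fwrite (data, 1, len, transcoder) != len)
    perror ("fwrite");
  fflush (transcoder);
}

/* Close the pipe so ffmpeg can finalize its output file. */
static void stop_transcoder (void)
{
  if (transcoder)
    pclose (transcoder);
}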