Hi everyone,
when I start a WebRTC conversation in Flutter, I open a MediaStream, which then activates the microphone, loudspeaker and camera. This works out of the box, but I only connect sources to destinations to drive the data connections; the actual audio/video data itself is not accessible from Flutter.
Is there a way to write my own MediaStream class in order to get access to the streaming data? My idea would be to use such a class to pull data from an FFI library (in a desktop environment) and forward it to the WebRTC connection; a rough sketch of what I have in mind follows below.
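This is only a sketch of the pulling side, not something that works today: the native library, the function name `pull_audio_frame`, and the frame size are all made up, and the part where the frames would be handed to a custom MediaStreamTrack does not exist in flutter_webrtc as far as I know.

```dart
import 'dart:ffi';
import 'dart:typed_data';

import 'package:ffi/ffi.dart';

// Hypothetical native function exported by my FFI library:
//   int pull_audio_frame(int16_t* buffer, int max_samples);
typedef _PullFrameNative = Int32 Function(Pointer<Int16> buffer, Int32 maxSamples);
typedef _PullFrame = int Function(Pointer<Int16> buffer, int maxSamples);

/// Pulls raw PCM frames synchronously from a native library via dart:ffi.
class FfiAudioSource {
  FfiAudioSource(String libraryPath)
      : _pullFrame = DynamicLibrary.open(libraryPath)
            .lookupFunction<_PullFrameNative, _PullFrame>('pull_audio_frame');

  final _PullFrame _pullFrame;

  // 480 samples = 10 ms of mono audio at 48 kHz (arbitrary choice here).
  final Pointer<Int16> _buffer = calloc<Int16>(480);

  /// Pulls one frame synchronously; this is the data I would like to
  /// feed into a custom MediaStream/MediaStreamTrack on the WebRTC side.
  Int16List pullFrame() {
    final n = _pullFrame(_buffer, 480);
    return _buffer.asTypedList(n);
  }

  void dispose() => calloc.free(_buffer);
}
```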
My feeling is that this would be difficult with the conventional approaches: Flutter objects typically expose member functions and callbacks in an asynchronous fashion, which is not a good basis for real-time data processing, and even the channels between native plugins and Flutter are always async.
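To make that concern concrete, this is roughly what a per-frame round trip over a platform channel would look like (channel and method names are made up); every call comes back as a Future scheduled on the event loop, which seems awkward at a 10 ms audio cadence, whereas the dart:ffi call above is an ordinary synchronous call.

```dart
import 'dart:typed_data';

import 'package:flutter/services.dart';

// Hypothetical plugin channel; one async round trip per frame.
const MethodChannel _channel = MethodChannel('my_audio_plugin');

Future<void> pollFrameViaChannel() async {
  final Uint8List? frame =
      await _channel.invokeMethod<Uint8List>('pullAudioFrame');
  // ... hand `frame` onward, one event-loop turn later ...
}
```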
In HTML, objects can have their member functions defined through a kind of WebIDL description and implemented in WebAssembly. Maybe a Flutter plugin could be driven in a similar way?
Thank you and best regards
Hauke