Hello,
I have an iOS app that broadcasts the user's screen over WebRTC using ReplayKit. I built the app on top of the AppRTC demo and got it working, but I don't see a way to pass audio buffers to WebRTC the same way video buffers are passed.
Just as we can feed video CMSampleBuffers to an RTCVideoCapturer through its delegate, is there a way to feed the audio CMSampleBuffers to some sort of RTCAudioCapturer? If not, is there another way to pass system audio into the WebRTC stream? Will there be support for this in the future?
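For context, here is a rough sketch of the video path I'm describing, inside a ReplayKit broadcast extension. The class names from WebRTC's iOS SDK (`RTCVideoCapturer`, `RTCCVPixelBuffer`, `RTCVideoFrame`) are real; the handler wiring is my own and simplified:

```swift
import ReplayKit
import WebRTC

class SampleHandler: RPBroadcastSampleHandler {
    // Created elsewhere with my RTCVideoSource as its delegate.
    var videoCapturer: RTCVideoCapturer!

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
            let timeStampNs = Int64(CMTimeGetSeconds(
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * Double(NSEC_PER_SEC))
            let frame = RTCVideoFrame(buffer: rtcPixelBuffer,
                                      rotation: ._0,
                                      timeStampNs: timeStampNs)
            // This delegate hand-off is how video reaches the RTCVideoSource...
            videoCapturer.delegate?.capturer(videoCapturer, didCapture: frame)
        case .audioApp, .audioMic:
            // ...but I can find no equivalent capturer to hand these audio
            // CMSampleBuffers to.
            break
        @unknown default:
            break
        }
    }
}
```

So the question is essentially: what, if anything, plays the role of `videoCapturer.delegate` for the `.audioApp` and `.audioMic` buffers?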