Hi,
I have a native iOS app that uses WebRTC for peer-to-peer communication. One of my use cases requires recording the screen, both video and audio. I am able to record the video using ReplayKit's RPScreenRecorder:
- (void)startCaptureWithHandler:(void (^)(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError *error))captureHandler
completionHandler:(void (^)(NSError *error))completionHandler;
I am able to append the sample buffers when they are of type video (RPSampleBufferTypeVideo), but I don't receive any audio buffers while a WebRTC session is in progress.
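For reference, this is roughly what my capture handler looks like (videoInput and audioInput are AVAssetWriterInputs from my own writer setup, so treat those names as placeholders):

```objc
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef sampleBuffer,
                                                             RPSampleBufferType bufferType,
                                                             NSError *error) {
    if (error != nil) { return; }

    switch (bufferType) {
        case RPSampleBufferTypeVideo:
            // Video buffers arrive fine and are appended successfully.
            if (videoInput.isReadyForMoreMediaData) {
                [videoInput appendSampleBuffer:sampleBuffer];
            }
            break;
        case RPSampleBufferTypeAudioApp:
        case RPSampleBufferTypeAudioMic:
            // These cases are never reached while the WebRTC session is active.
            if (audioInput.isReadyForMoreMediaData) {
                [audioInput appendSampleBuffer:sampleBuffer];
            }
            break;
    }
} completionHandler:^(NSError *error) {
    if (error != nil) {
        NSLog(@"startCapture failed: %@", error);
    }
}];
```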
I wrote a small sample app with WebRTC taken out, and there I am able to record both video and audio.
I have spent a lot of time on this over the past week and couldn't figure out a way to record the audio (both local and remote) of a WebRTC session.
Please let me know what I am missing. Is there any way to get the audio sample buffers for both the local and remote streams so that I can append them to my AVAssetWriterInput?
I am completely blocked on this now :-(
Thanks,
--Suman