I am working on an iOS application that sends audio and video tracks via WebRTC. The following are already working fine:
- I can successfully transfer video to the other peer using the CMSampleBuffer captured by a Broadcast Upload Extension.
- I have a method that uses RTCCameraVideoCapturer to capture video from the camera, create a track, and add it to the peer connection.
- I have another method that uses RTCAudioSource to create an RTCAudioTrack (audio from the mic), which is also added to the peer connection.
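For reference, this is roughly how the working camera and mic tracks are created today (a simplified sketch: the shared `RTCPeerConnectionFactory`, device/format selection, and `startCapture` call are trimmed, and the track IDs are placeholders):

```swift
import WebRTC

// Simplified sketch of the track setup that already works for me.
// `factory` is a shared RTCPeerConnectionFactory; error handling is omitted.
func makeLocalTracks(factory: RTCPeerConnectionFactory)
    -> (video: RTCVideoTrack, audio: RTCAudioTrack) {
    // Camera video: RTCCameraVideoCapturer delivers frames to an RTCVideoSource.
    let videoSource = factory.videoSource()
    let capturer = RTCCameraVideoCapturer(delegate: videoSource)
    // Device/format/fps selection omitted; capturer.startCapture(...) is
    // called elsewhere once a camera and format have been chosen.
    _ = capturer
    let videoTrack = factory.videoTrack(with: videoSource, trackId: "video0")

    // Microphone audio: RTCAudioSource -> RTCAudioTrack.
    let audioSource = factory.audioSource(with: RTCMediaConstraints(
        mandatoryConstraints: nil, optionalConstraints: nil))
    let audioTrack = factory.audioTrack(with: audioSource, trackId: "audio0")

    return (videoTrack, audioTrack)
}
```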
However, I have a use case where I need to obtain an RTCAudioTrack and an RTCVideoTrack from a given audio/video file and add those tracks to the peer connection. The specific problems I am facing are:
1) I am unable to convert any of the iOS track types into the WebRTC-specific track types.
For example: I have created an AVAsset object from an audio/video file and obtained its AVAssetTracks, but I don't know of any way to convert these into an RTCVideoTrack or RTCAudioTrack.
Question: Does WebRTC provide any direct Objective-C method to convert from an iOS track type to the WebRTC-specific types?
Is there any way to obtain the WebRTC-specific track or stream objects by processing an audio/video file?
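For context, the closest I have gotten on the video side is pulling CMSampleBuffers out of the file with AVAssetReader and hand-feeding them to an RTCVideoSource, mirroring what I do for the Broadcast Upload Extension path. This is an unverified sketch (real code would have to pace frame delivery to the track's frame rate, which the loop below glosses over):

```swift
import AVFoundation
import WebRTC

// Unverified sketch: read CMSampleBuffers from a file's video track and
// push them into an RTCVideoSource as RTCVideoFrames.
func feedVideoFrames(from url: URL,
                     into videoSource: RTCVideoSource,
                     capturer: RTCVideoCapturer) throws {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    ])
    reader.add(output)
    reader.startReading()

    // NOTE: this loop only shows the CMSampleBuffer -> RTCVideoFrame
    // conversion; it does not pace delivery in real time.
    while let sample = output.copyNextSampleBuffer(),
          let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sample)
        let timeStampNs = Int64(CMTimeGetSeconds(pts) * 1_000_000_000)
        let frame = RTCVideoFrame(
            buffer: RTCCVPixelBuffer(pixelBuffer: pixelBuffer),
            rotation: ._0,
            timeStampNs: timeStampNs)
        // RTCVideoSource conforms to RTCVideoCapturerDelegate, so frames
        // can be injected directly.
        videoSource.capturer(capturer, didCapture: frame)
    }
}
```

I have not found any equivalent injection point for audio, which leads to my second question.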
2) Does WebRTC for iOS provide any way to process a CMSampleBuffer of type audio and create an audio track from it?
If yes, please point me in the right direction.
Thanks in advance
Suheel S N