Stereo audio playback OUTSIDE of WebRTC on iOS


vzw.kuna...@gmail.com

Feb 20, 2018, 8:01:20 PM
to discuss-webrtc
I can successfully get the remote audio stream data (via AudioTrackSinkInterface::OnData), and I'd like to pass this audio data to another audio framework (OpenAL) to play back the remote audio stream plus some other stereo audio on iOS. My problem is that WebRTC -- upon establishing a peer connection -- seems to configure the native audio layer (Core Audio?) to output in mono.
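For reference, grabbing the remote samples looks roughly like the sketch below. The abstract class mirrors the real `webrtc::AudioTrackSinkInterface` (declared in `api/media_stream_interface.h`) so the example compiles standalone; in a real build you would include the WebRTC header and derive from the real interface. The `OpenALForwardingSink` name and its buffering strategy are my own illustration, not code from WebRTC.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Local mirror of webrtc::AudioTrackSinkInterface's OnData signature,
// only so this sketch is self-contained.
class AudioTrackSinkInterface {
 public:
  virtual ~AudioTrackSinkInterface() = default;
  virtual void OnData(const void* audio_data,
                      int bits_per_sample,
                      int sample_rate,
                      size_t number_of_channels,
                      size_t number_of_frames) = 0;
};

// Buffers the remote 16-bit PCM so another engine (OpenAL in this case)
// can drain and queue it on its own thread. Locking omitted for brevity.
class OpenALForwardingSink : public AudioTrackSinkInterface {
 public:
  void OnData(const void* audio_data,
              int bits_per_sample,
              int sample_rate,
              size_t number_of_channels,
              size_t number_of_frames) override {
    if (bits_per_sample != 16) return;  // WebRTC delivers 16-bit PCM here.
    const int16_t* samples = static_cast<const int16_t*>(audio_data);
    buffer_.insert(buffer_.end(), samples,
                   samples + number_of_frames * number_of_channels);
    sample_rate_ = sample_rate;
    channels_ = number_of_channels;
  }

  // Hand the accumulated samples to whatever feeds the OpenAL buffers.
  std::vector<int16_t> Drain() {
    std::vector<int16_t> out;
    out.swap(buffer_);
    return out;
  }

  int sample_rate() const { return sample_rate_; }
  size_t channels() const { return channels_; }

 private:
  std::vector<int16_t> buffer_;
  int sample_rate_ = 0;
  size_t channels_ = 0;
};
```

In an actual app the sink is attached to the remote track with `audio_track->AddSink(&sink)`, after which OnData fires on WebRTC's audio thread.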

I know WebRTC cannot play back stereo audio, and that's fine. I don't want WebRTC to output any audio at all -- more specifically, I'd like WebRTC to not configure/tamper with the native audio output (AVAudioSession/Core Audio) and force it to mono. I tried simply setting isAudioEnabled to NO on the RTCAudioSession, but the catch is that AudioTrackSinkInterface::OnData then stops getting called.

So, to summarize: is there a way for me to grab the remote audio data while preventing WebRTC from playing audio and from forcing Core Audio into mono output?

Any pointers to how/where I should go about modifying the source would be appreciated, thanks!

vzw.kuna...@gmail.com

Mar 19, 2018, 7:21:50 PM
to discuss-webrtc
Solved this, hopefully it helps someone: I set the audio unit sub-type to kAudioUnitSubType_RemoteIO in voice_processing_audio_unit.mm.
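For anyone else making this change: voice_processing_audio_unit.mm creates its audio unit from an AudioComponentDescription, and the fix amounts to swapping the sub-type there (exact file layout varies by WebRTC revision, so treat this as a sketch of the edit rather than a verbatim patch):

```cpp
#include <AudioToolbox/AudioToolbox.h>  // Apple-only; part of the iOS SDK

// In voice_processing_audio_unit.mm, where the I/O unit is described:
AudioComponentDescription description;
description.componentType = kAudioUnitType_Output;
// Was kAudioUnitSubType_VoiceProcessingIO; plain RemoteIO stops WebRTC
// from putting the audio session into mono voice-processing output.
description.componentSubType = kAudioUnitSubType_RemoteIO;
description.componentManufacturer = kAudioUnitManufacturer_Apple;
description.componentFlags = 0;
description.componentFlagsMask = 0;
```

One trade-off to be aware of: the VoiceProcessingIO unit provides Apple's built-in echo cancellation and gain control, which plain RemoteIO does not, so acoustic echo may need to be handled elsewhere.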