Hi,
I'm using the C++ library, and I have been working toward streaming audio (from disk) and receiving it remotely.
I have it working fine with one-channel (mono) audio, but cannot get it to work correctly with more than one channel (stereo).
Background: I'm using an approach several people here seem to be using with success:
- I'm receiving audio (playback) by implementing a custom ADM: I store a pointer to the AudioTransport callback (registered by the system in RegisterAudioCallback()) and call ::NeedMorePlaybackData() on it. That triggers OnData() on my custom AudioTrackSinkInterface object...
- I'm sending audio (recording) by instantiating, for each AudioSource, a corresponding implementation of AudioSourceInterface (derived from LocalAudioSource in my case). The library calls AddSink() on that; I hold onto the sink and call ::OnData() on it whenever I have 10 ms worth of audio to send. I'm sending 16-bit interleaved PCM at 48 kHz.
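For concreteness, here is the frame-size arithmetic I'm relying on for the 10 ms interleaved buffers (a minimal, self-contained sketch; the helper names are mine, not the library's):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// 10 ms of 48 kHz audio is 480 samples per channel.
constexpr int kSampleRate = 48000;
constexpr size_t kSamplesPer10Ms = kSampleRate / 100;  // 480

// Number of int16_t values in one interleaved 10 ms buffer:
// 480 for mono, 960 for stereo.
constexpr size_t FrameValues(size_t num_channels) {
  return kSamplesPer10Ms * num_channels;
}

// Interleave two mono planes (left, right) into one stereo
// buffer laid out LRLRLR..., which is what 16-bit interleaved
// PCM expects. `out` must hold FrameValues(2) values.
void InterleaveStereo(const int16_t* left, const int16_t* right,
                      int16_t* out) {
  for (size_t i = 0; i < kSamplesPer10Ms; ++i) {
    out[2 * i] = left[i];
    out[2 * i + 1] = right[i];
  }
}
```

So each stereo ::OnData() push carries 960 int16_t values (480 frames x 2 channels).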
As mentioned, this works when I'm sending and receiving mono audio, but not stereo.
With stereo, no matter what I do, my playback callback (AudioTrackSinkInterface::OnData()) is called with number_of_channels set to 1!
I've walked through the webrtc code, following from NeedMorePlaybackData(480, 4, 2, 48k, etc.) down through AudioMixer::Mix(), GetAudioFromSources(), GetAudioFrameWithInfo(), etc. However, nowhere does it seem to use the 2 I pass in for the number of channels. The '1' for number_of_channels that ultimately reaches my ::OnData() callback comes from the audio_frame itself, not from what I pass in...
AudioMixer::Mix() has audio_frame_for_mixing, which has 2 channels, but the audio_frame pointer in my 'SourceStatus' object always has num_channels set to 1. Is there a way to designate the source as stereo when I create it...?
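To illustrate what I mean about the channel count coming from the frame: below is a minimal stand-in I wrote that mirrors the shape of webrtc::AudioFrame::UpdateFrame() (this is NOT the real class, just a sketch; the real one is in api/audio/audio_frame.h and also takes speech/VAD enums, which I've elided). Whatever num_channels the source writes into the frame here is what propagates downstream, regardless of what NeedMorePlaybackData() was asked for:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Stand-in for webrtc::AudioFrame (assumption: simplified, not the real
// class). The mixer's per-source frames look roughly like this, and the
// channel count that reaches OnData() is the one stored in the frame.
struct AudioFrame {
  static constexpr size_t kMaxDataSizeSamples = 3840;
  uint32_t timestamp_ = 0;
  size_t samples_per_channel_ = 0;
  int sample_rate_hz_ = 0;
  size_t num_channels_ = 1;  // defaults to mono
  int16_t data_[kMaxDataSizeSamples] = {};

  // Same parameter order as the real UpdateFrame(), minus the
  // speech-type / VAD-activity arguments.
  void UpdateFrame(uint32_t timestamp, const int16_t* data,
                   size_t samples_per_channel, int sample_rate_hz,
                   size_t num_channels) {
    timestamp_ = timestamp;
    samples_per_channel_ = samples_per_channel;
    sample_rate_hz_ = sample_rate_hz;
    num_channels_ = num_channels;  // <-- this is what OnData() ends up seeing
    if (data) {
      for (size_t i = 0; i < samples_per_channel * num_channels; ++i) {
        data_[i] = data[i];
      }
    }
  }
};
```

In my case the per-source frame seems to get filled with num_channels = 1 no matter what, which is why I'm asking whether the source can be marked stereo at creation time.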
Any suggestions/ideas on what I might be doing wrong?
Thanks,
james