Stereo audio playback on Android


Kunal Jathal

May 15, 2019, 9:14:26 PM
to discuss-webrtc
Is this supported? 

Line 110 in the Android audio manager (here) says that it is not supported "in combination with OpenSL ES", but line 37 in the WebRTC Audio Manager (here) makes it seem like it is categorically not supported...

Paulina Hensman

May 16, 2019, 8:55:41 AM
to discuss-webrtc
This is possible if you start using the new AudioDeviceModule API, where you create an AudioDeviceModule and pass it into the PeerConnectionFactory. Using a JavaAudioDeviceModule you can call setUseStereoInput/setUseStereoOutput on the builder.

Please note that if you have been modifying any other audio settings through the old API (static calls to WebRtcAudioManager, WebRtcAudioRecord, etc.), you need to start setting these in the AudioDeviceModule builder instead, or they will be ignored.

For reference, take a look at how AppRTC creates a JavaAudioDeviceModule and passes it to PeerConnectionFactory.
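
For anyone looking for a concrete starting point, this is roughly the pattern. A sketch against the org.webrtc API from around M68; method names may vary slightly between releases:

    import android.content.Context;
    import org.webrtc.PeerConnectionFactory;
    import org.webrtc.audio.AudioDeviceModule;
    import org.webrtc.audio.JavaAudioDeviceModule;

    class StereoFactoryHelper {
      // Assumes PeerConnectionFactory.initialize(...) has already run.
      static PeerConnectionFactory createFactory(Context appContext) {
        // Build an ADM with stereo enabled for capture and playout.
        AudioDeviceModule adm = JavaAudioDeviceModule.builder(appContext)
            .setUseStereoInput(true)
            .setUseStereoOutput(true)
            .createAudioDeviceModule();

        // Pass it to the factory instead of using the old static calls
        // on WebRtcAudioManager / WebRtcAudioRecord.
        PeerConnectionFactory factory = PeerConnectionFactory.builder()
            .setAudioDeviceModule(adm)
            .createPeerConnectionFactory();

        // The factory keeps its own reference, so release ours.
        adm.release();
        return factory;
      }
    }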

Kunal Jathal

Jun 6, 2019, 6:54:01 PM
to discuss-webrtc
By any chance, would you know the earliest release version that contains this functionality?

Paulina Hensman

Jun 10, 2019, 3:52:14 AM
to discuss-webrtc
Yes, that would be M68.

Kunal Jathal

Jun 10, 2019, 3:21:53 PM
to discuss-webrtc
Thank you. Would you happen to know the meaning of the 'M' prefix before certain release branch names? I've noticed that not all of them have it...

Kunal Jathal

Jun 20, 2019, 2:10:46 PM
to discuss-webrtc
For anyone with the same question, check this post:
https://groups.google.com/d/topic/discuss-webrtc/MVxATps1keE/discussion

Kunal Jathal

Jul 8, 2019, 11:17:36 PM
to discuss-webrtc
Paulina,

So I'm creating a JavaAudioDeviceModule and passing it to the PeerConnectionFactory as outlined in the examples (and I'm calling setUseStereoOutput(true) on the builder when creating the JavaAudioDeviceModule). However, I'm still not hearing stereo output. The audio I'm testing with is a sine wave that alternates from being panned hard left to hard right every other second.

What I hear on the Android device is actually a mono sine wave for a second, then silence, then a mono sine wave again for a second, then silence, etc. So basically it seems like only a single channel is being taken, mixed down to mono, and then output. Any ideas why this might be happening?

(Btw, I have also made the necessary changes to the SDP to support Opus stereo, i.e. appending stereo=1;sprop-stereo=1 to the local description prior to setting it...)
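
For anyone trying the same thing, that SDP change boils down to appending those parameters to the Opus fmtp line before calling setLocalDescription. A hypothetical helper to illustrate; the parsing is deliberately simplified and not production-ready:

    class SdpStereoHack {
      // Hypothetical helper: append stereo=1;sprop-stereo=1 to the Opus
      // fmtp line before the description goes to setLocalDescription.
      static String enableOpusStereo(String sdp) {
        String opusPayloadType = null;
        StringBuilder out = new StringBuilder();
        for (String line : sdp.split("\r\n")) {
          // e.g. "a=rtpmap:111 opus/48000/2" -> payload type "111".
          if (line.startsWith("a=rtpmap:") && line.contains(" opus/48000")) {
            opusPayloadType =
                line.substring("a=rtpmap:".length(), line.indexOf(' '));
          }
          // Extend the matching fmtp line with the stereo parameters.
          if (opusPayloadType != null
              && line.startsWith("a=fmtp:" + opusPayloadType + " ")
              && !line.contains("stereo=1")) {
            line += ";stereo=1;sprop-stereo=1";
          }
          out.append(line).append("\r\n");
        }
        return out.toString();
      }
    }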





Kunal Jathal

Jul 9, 2019, 12:03:25 AM
to discuss-webrtc
Upon closer listening, it might just be the left channel that is being played (i.e. no downmix to mono)... so audio in the left channel for a second, silence, left-channel audio again, silence...

Henrik Andreasson

Jul 9, 2019, 5:12:10 AM
to discuss-webrtc
It is not clear to me how the stereo input signal is generated. Even if it is correct, a downmix to mono might take place in one of the WebRTC components in use.
As an example, the AEC works on mono only, but I can't say for sure in your case. Try disabling all audio-processing components to rule out this downmix and see if it helps.


Kunal Jathal

Jul 9, 2019, 1:14:07 PM
to discuss-webrtc
The stereo input signal is just a hard-coded sine tone that I am reading from memory on another device (so literally just an array of samples). I know the input signal is fine because I don't have any problems when establishing a Windows-to-Windows WebRTC connection (I correctly hear the stereo sine signal on the receiving end).

I tried disabling all audio processing, but it did not help. These are my peer connection parameters:

    private final int audioStartBitrate = 32;
    private final String audioCodec = "OPUS";
    private final boolean noAudioProcessing = true;
    private final boolean aecDump = false;
    private final boolean useOpenSLES = false;
    private final boolean disableBuiltInAEC = true;
    private final boolean disableBuiltInAGC = true;
    private final boolean disableBuiltInNS = true;
    private final boolean enableLevelControl = false;
    private final boolean disableWebRtcAGCAndHPF = true;
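
For reference, here is roughly how flags like these get applied in the AppRTC demo. The constraint keys and builder methods below follow that demo and are assumptions on my part; they may differ in other releases:

    // Sketch following the AppRTC demo (org.webrtc.MediaConstraints):
    // "noAudioProcessing" is applied as audio MediaConstraints, while the
    // "disableBuiltIn*" flags map onto hardware effects on the ADM builder.
    MediaConstraints audioConstraints = new MediaConstraints();
    if (noAudioProcessing) {
      audioConstraints.mandatory.add(
          new MediaConstraints.KeyValuePair("googEchoCancellation", "false"));
      audioConstraints.mandatory.add(
          new MediaConstraints.KeyValuePair("googAutoGainControl", "false"));
      audioConstraints.mandatory.add(
          new MediaConstraints.KeyValuePair("googHighpassFilter", "false"));
      audioConstraints.mandatory.add(
          new MediaConstraints.KeyValuePair("googNoiseSuppression", "false"));
    }

    AudioDeviceModule adm = JavaAudioDeviceModule.builder(appContext)
        .setUseHardwareAcousticEchoCanceler(!disableBuiltInAEC)
        .setUseHardwareNoiseSuppressor(!disableBuiltInNS)
        .setUseStereoOutput(true)
        .createAudioDeviceModule();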



Kunal Jathal

Jul 9, 2019, 8:53:01 PM
to discuss-webrtc
Never mind, I figured it out -- I was making a mistake in how I was outputting audio. In my audio engine, I have a setting that grabs audio samples from WebRTC (via ondata) and processes them, thus overriding WebRTC's own audio output. I had erroneously left this setting on.

The JavaAudioDeviceModule with setUseStereoOutput works fine. Thanks for the help, Paulina & Henrik.