Open SL Streaming Audio recording


WB

Mar 21, 2011, 7:39:21 PM
to android-ndk
I'm having some strange results when trying to work with the NDK sample
code that demonstrates recording and playback using the NDK
OpenSL ES implementation. I am requesting 16-bit, 8 kHz, mono, PCM data,
and I provide 10 buffers of 120 ms each for it to work with. Results:

- The timing of each callback is 120 ms +/-15 ms, so that is good.
- What I hear is a burst of sound, which then echoes every ~120 ms,
diminishing in amplitude each time.
- The echo has faded out after about 0.5-0.75 s.
- This repeats every ~1.9 s.
- The segment of audio I hear does sound like the proper sound.
- I record the data at each callback into a *.raw file and have
verified the output.

I'm testing on a Nexus One.

What might be the issue?

- Should the buffers be of some specific size?
- Should they be of a minimum size?
- The sample code only queued one buffer, but says to enqueue more for
streaming (which I do).
- Could the echo effect be caused by some sort of mixer
misconfiguration?

Has anyone worked with this library yet? Or managed to get streaming
audio recording running?
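
For reference, the multi-buffer recording pattern described above can be sketched roughly as follows. This is a hypothetical, untested fragment (names like recordBuffers and primeQueue are illustrative, not from the NDK sample); the buffer count and size are taken from the figures in this post, and it assumes a recorder object has already been created and realized:

```c
/* Sketch of a streaming OpenSL ES recorder buffer queue:
 * 10 buffers of 120 ms each, 16-bit / 8 kHz / mono PCM. */
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>

#define NUM_BUFFERS   10
/* 120 ms at 8000 Hz, 16-bit mono: 8000 * 0.120 * 2 bytes = 1920 bytes */
#define BUFFER_BYTES  1920

static short recordBuffers[NUM_BUFFERS][BUFFER_BYTES / sizeof(short)];
static int nextBuffer = 0;

/* Called by the engine each time a buffer has been filled. */
static void recorderCallback(SLAndroidSimpleBufferQueueItf bq, void *context)
{
    /* Consume the oldest outstanding buffer, then immediately
     * re-enqueue it so the queue never runs dry. */
    short *done = recordBuffers[nextBuffer];
    /* ... copy or process `done` here (e.g. write it to a *.raw file) ... */
    (*bq)->Enqueue(bq, done, BUFFER_BYTES);
    nextBuffer = (nextBuffer + 1) % NUM_BUFFERS;
}

/* After realizing the recorder, prime ALL buffers up front
 * so recording starts with a full queue. */
static SLresult primeQueue(SLAndroidSimpleBufferQueueItf bq)
{
    SLresult res = (*bq)->RegisterCallback(bq, recorderCallback, NULL);
    for (int i = 0; res == SL_RESULT_SUCCESS && i < NUM_BUFFERS; ++i)
        res = (*bq)->Enqueue(bq, recordBuffers[i], BUFFER_BYTES);
    return res;
}
```

If only one buffer is enqueued at a time (as in the sample's minimal demo), the recorder can starve between callbacks, which produces exactly the kind of periodic artifacts described above.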

Glenn Kasten

Mar 22, 2011, 4:59:34 PM
to android-ndk
Can you post the source code?

WB

Mar 24, 2011, 11:06:30 AM
to android-ndk
After further investigation we were able to track down our issues; the
problem was located in our own code.

What we haven't been able to figure out is which API to use for
controlling which speaker is used for audio playback (a slightly
different topic than the subject of the original post).

We generally find that a number of the OpenSL APIs we'd like to use for
querying the audio/device capabilities return "12"
(SL_RESULT_FEATURE_UNSUPPORTED), meaning the feature is not supported.
Is there a document that outlines which APIs are supported under
Android?

And is anyone aware of the ability to select the earpiece speaker (as
you would in a phone call)?

WB.

mic _

Mar 24, 2011, 11:31:26 AM
to andro...@googlegroups.com
> And is anyone aware of the ability to select the earpiece speaker (as you would in a phone call)?

You can set the stream type, which implicitly changes the output device. But I don't think you can explicitly set the output device through OpenSL ES. The VOICE_CALL stream should route to the EARPIECE output device by default, but it may not be a good idea to use that stream for playback, since routing a VOICE_CALL stream can have side effects that differ from other streams on some platforms.

/Michael



WB

Mar 24, 2011, 3:06:40 PM
to android-ndk
I can find the following:
#define SL_ANDROID_STREAM_VOICE ((SLint32) 0x00000000)

in SLES/OpenSLES_AndroidConfiguration.h.

But I'm not sure how to configure our AudioPlayer to use this stream.

WB

mic _

Mar 24, 2011, 3:09:15 PM
to andro...@googlegroups.com
There's a setConfig method in android_audioPlayer. I don't know how to
call it... you'll have to look in the OpenSL ES code.

Glenn Kasten

Mar 24, 2011, 4:55:33 PM
to android-ndk
> Is there a document that outlines which APIs are supported under Android?

See $NDKROOT/docs/opensles/index.html, section "Supported features from
OpenSL ES 1.0.1".

> But I'm not sure how to configure our AudioPlayer to use this stream.

See the example code fragment in the section "Android configuration
interface" of the same document.
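
For later readers, the configuration-interface usage from that document looks roughly like the following. This is an untested sketch, not a verbatim copy of the NDK fragment; setVoiceStream is a made-up helper name. The key points are that SL_IID_ANDROIDCONFIGURATION must be requested when the player is created, and SetConfiguration must be called before Realize:

```c
/* Sketch: routing playback via the Android stream type using the
 * configuration interface described in the NDK OpenSL ES docs. */
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>
#include <SLES/OpenSLES_AndroidConfiguration.h>

SLresult setVoiceStream(SLEngineItf engine, SLObjectItf *player,
                        SLDataSource *src, SLDataSink *sink)
{
    /* Request the Android configuration interface at creation time. */
    const SLInterfaceID ids[] = { SL_IID_ANDROIDCONFIGURATION };
    const SLboolean     req[] = { SL_BOOLEAN_TRUE };
    SLresult res = (*engine)->CreateAudioPlayer(engine, player, src, sink,
                                                1, ids, req);
    if (res != SL_RESULT_SUCCESS) return res;

    /* Set the stream type BEFORE realizing the player. */
    SLAndroidConfigurationItf cfg;
    res = (**player)->GetInterface(*player, SL_IID_ANDROIDCONFIGURATION, &cfg);
    if (res == SL_RESULT_SUCCESS) {
        SLint32 stream = SL_ANDROID_STREAM_VOICE;  /* voice-call routing */
        res = (*cfg)->SetConfiguration(cfg, SL_ANDROID_KEY_STREAM_TYPE,
                                       &stream, sizeof(SLint32));
    }
    if (res != SL_RESULT_SUCCESS) return res;

    /* Only now realize the player. */
    return (**player)->Realize(*player, SL_BOOLEAN_FALSE);
}
```

Whether SL_ANDROID_STREAM_VOICE actually reaches the earpiece depends on the platform's routing policy, per the caveat earlier in this thread.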