Yet another "Pro" Audio API: AAudio for O


Felix Homann

unread,
Mar 22, 2017, 4:32:05 AM3/22/17
to android-ndk
Last year's announcement of bringing Jack to every Android device was great. What happened next? Nothing. At least nothing from the perspective of project outsiders.

And now we're getting yet another "Pro" Audio API for Android O. AAudio [1]. Something completely new from scratch, something not-JACK with an Android-only API. Something nobody is familiar with, something that needs at least a little bit of porting effort, something that prevents direct code reuse from other platforms.

Why oh why? Sigh... 

OK, we don't know much about it right now except for a code sample [2]. Maybe it's finally the holy grail of Android Pro Audio. It's just that I'm not very confident, given how each and every Android pro audio announcement has turned into a letdown in the end. Moreover, the Jack announcement seemed to even bring the JACK API (not low latency) to older devices. There's no indication that AAudio will be available for older devices.

How "Pro" will it be? Will it finally really bring low latency audio to Android? I mean real low latency not low latency compared to other Android efforts. Will it finally support audio interfaces with more than 8 channels? Will it again be too little too late?

Maybe I'm not being fair, but I'm really frustrated by this ever-changing "pro" audio meandering.

Kind regards,
Felix


Gerry Fan

unread,
Mar 23, 2017, 1:25:26 AM3/23/17
to android-ndk
Thank you for noticing the samples :-). They are not linked from many places at the moment; their main purpose is to give some example(s) of how to use the new API. I assume the header files will be published in the next NDK release pretty soon; at that point some documentation should be available to "argue against" the statement that "we don't know much about it right now except...", and a link to the sample should be in there too. Appreciate your feedback, really!

I wish I had more answers to the other questions in your post, but one request stands: please keep your feedback flowing in.

Thank you again

Felix Homann

unread,
Mar 23, 2017, 12:19:30 PM3/23/17
to android-ndk

On Thursday, March 23, 2017 at 06:25:26 UTC+1, Gerry Fan wrote:

I wish I had more answers to the other questions in your post, but one request stands: please keep your feedback flowing in.


Well, maybe for now you could tell me whether my 18-channel, standards-compliant (UAC2) USB audio interface should be supported by AAudio on the Android O preview on my Nexus 5X. It would not be worthwhile flashing the preview image otherwise.

Thank you in advance,
Felix

Glenn Kasten

unread,
Mar 23, 2017, 2:30:41 PM3/23/17
to android-ndk

Felix,

I hear your frustration at the pace of development for Android high-performance audio.
Please know that we are working hard to improve the situation, even when it isn't visible.


As of now, the AAudio API documentation hasn't been published, but it is expected very soon. 
There is a short window during which we can address API suggestions, so the sooner we receive your comments the better. 


Regarding multi-channel USB: AAudio is primarily intended to be an easier-to-use API with potential for better performance in the future. It does not by itself affect USB support. Whether your interface is supported or not should be the same on both AAudio and OpenSL ES, but we have not focused on that yet. I know how important multi-channel USB is to you, and we'll try to continue to improve in that area too, but our first priority right now is a better API, as that's what most developers have been asking for.


Please note that this DP1 release is just a preview of the new API. 

For example, it does not include the planned performance improvements. 


Thanks again for your honest feedback, and I'll post a link to the docs as soon as they're available.

Glenn


Felix Homann

unread,
Mar 23, 2017, 5:41:43 PM3/23/17
to android-ndk
Hi Glenn,


On Thursday, March 23, 2017 at 19:30:41 UTC+1, Glenn Kasten wrote:


Regarding multi-channel USB: AAudio is primarily intended to be an easier-to-use API with potential for better performance in the future. It does not by itself affect USB support. Whether your interface is supported or not should be the same on both AAudio and OpenSL ES, but we have not focused on that yet.



So does AAudio still live on top of OpenSL ES?

Regards,
Felix

Glenn Kasten

unread,
Mar 23, 2017, 8:07:22 PM3/23/17
to android-ndk
AAudio in the O developer preview is not implemented on top of OpenSL ES.

Athos Fabio Bacchiocchi

unread,
Mar 24, 2017, 12:27:42 AM3/24/17
to andro...@googlegroups.com
Hi all,

We built and ran the sample app on a Nexus 5X. A few observations:

- Latency is way worse than with OpenSL
- The audio processing thread is started by the client code. What about priority?
- Native buffer size and sample rate must still be queried from Java
- Since the API is supposed to be simple to use, it is strange to see new concepts like these buffer capacity and frames-per-burst quantities, and the need for a “TunePlayerForLowLatency” procedure. That looks quite awkward; couldn’t it be performed under the hood?

While I appreciate any effort towards making audio app development easier on Android, and I agree that OpenSL is a complicated and clumsy API, I believe that more than a simple API (with its new, peculiar quirks), developers need an environment that allows audio apps to express their full potential. And while a simple audio API can make life easier, it will only be a part of the whole real-time audio development effort anyway.

Even before the API, I would like to see a few issues addressed, such as:

- Latency: not only the lowest possible latency, but also a measurable/predictable one. This is especially important for applications that rely on input-to-output synchronisation. Nowadays not only must latency be measured by the app (look at how many recorders in the Play Store are trying to address this), but it’s different every time you restart the audio I/O.
- Reliable performance: we need audio processing to be given the necessary priority, to be able to do parallel processing, and to be sure that we won’t run into issues like the CPU-scaling one that forced people to look for horrible hacks (yes, I am referring to the “fake touch” trick that was even featured in a Google I/O talk!)
- Full support for class-compliant USB devices: right now it is limited and the latency makes it unusable (even on the Google Pixel!)
- Native API for MIDI IO
- Reliable native audio encoding AND decoding, matching the capabilities of the Java API.
- Nice to have: a way for audio apps to “collaborate”, some sort of inter-app audio solution and maybe a plugin standard (like LV2), so some apps can really just be plugins to be used in other apps.

I hope that Glenn, Gerry or any other Android dev can comment on this (by the way, thanks for your kind and supportive attitude), and we are looking forward to any additional information about the new API.

Athos


--
You received this message because you are subscribed to the Google Groups "android-ndk" group.
To unsubscribe from this group and stop receiving emails from it, send an email to android-ndk+unsubscribe@googlegroups.com.
To post to this group, send email to andro...@googlegroups.com.
Visit this group at https://groups.google.com/group/android-ndk.
To view this discussion on the web visit https://groups.google.com/d/msgid/android-ndk/cb204569-7532-4d3f-b6ea-7836ffabe670%40googlegroups.com.

For more options, visit https://groups.google.com/d/optout.

Felix Homann

unread,
Mar 24, 2017, 6:51:52 AM3/24/17
to android-ndk
Excellent write up, Athos, thank you!

@Glenn
Sure, the API might be simpler than naked OpenSL ES. But actually, even Peter Brinkmann's simplistic opensl_stream made OpenSL ES audio pretty easy to handle.

So, what really puzzles me is what kind of use cases or issues AAudio is designed to address. It's supposed to facilitate "pro" audio on Android, isn't it? But isn't that a big euphemism as long as you can't even use UAC2 devices with more than 8 channels? As long as you don't have access to the mixer controls of your otherwise supported 8-channel USB audio device? It seems like "pro" audio will be restricted to onboard soundcards (how "pro" is that?) and simple 1- or 2-channel USB audio devices that might be usable for guitar amp simulations but not for much more (and then there's latency again...).

And I'm missing some extras over OpenSL ES, like a plugin architecture or JACK-like inter-app routing options (or something along the lines of, again, Peter Brinkmann's Patchfield). What I see right now seems, and I'm very sorry to say, too little too late again.
 
BTW, I know that you know how important support for multichannel audio devices is for me; we've been repeatedly discussing this at least since 2014's Lollipop preview ;-) I just don't own any USB audio devices with fewer than 8 channels, so it is vital for me.

Kind regards and have a nice weekend everyone,
Felix


BassMeister

unread,
Mar 24, 2017, 11:54:05 AM3/24/17
to android-ndk
+1 Athos is completely on point.

Phil Burk

unread,
Mar 24, 2017, 4:18:36 PM3/24/17
to android-ndk
Hello,

Thank you for trying out AAudio in O developer preview, and sharing your comments. Again, we’ll publish a link to the online documentation and accompanying NDK release as soon as they are available.


> - Latency is way worse than with OpenSL

This preview release in DP1 is just to demonstrate the API. It currently does not use a FAST track and will have very high latency.


> - The audio processing thread is started by the client code. What about priority?

AAudio provides thread interface APIs that are not yet running at high priority. The sample will be updated to use them.


> - Native buffer size and sample rate must still be queried from Java

Call AAudioStream_getFramesPerBurst(). That is the number of frames that the system consumes in one shot. If you match that size for your reads and writes then you will get optimal performance.
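A minimal sketch of that pattern, assuming the AAudio C API names as they appear in the released NDK headers (which may differ slightly from the DP1 preview; error handling abbreviated):

```c
#include <aaudio/AAudio.h>

AAudioStreamBuilder *builder = NULL;
AAudioStream *stream = NULL;

// Build and open an output stream with explicit format requests.
aaudio_result_t result = AAudio_createStreamBuilder(&builder);
AAudioStreamBuilder_setSampleRate(builder, 48000);
AAudioStreamBuilder_setChannelCount(builder, 2);
AAudioStreamBuilder_setFormat(builder, AAUDIO_FORMAT_PCM_I16);
result = AAudioStreamBuilder_openStream(builder, &stream);
AAudioStreamBuilder_delete(builder);

// The burst size is the number of frames the system consumes in one
// shot; matching it for each read/write gives optimal behavior.
int32_t framesPerBurst = AAudioStream_getFramesPerBurst(stream);
```

This replaces the round-trip through Java (`PROPERTY_OUTPUT_FRAMES_PER_BUFFER`) that OpenSL ES apps needed for the native buffer size.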


> - Since the API is supposed to be simple to use, it strange to see new concepts like this buffer capacity/frames per burst quantities, and the need for a “TunePlayerForLowLatency” procedure. That looks quite awkward, couldn’t it be performed under the hood?

Yes, doing the tuning in the app is extra work for the app developer. But we wanted to give applications the ability to tune the system to their own needs.
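The kind of tuning an app can do itself might look like the following sketch (assuming the released AAudio names, with `stream` an open `AAudioStream*` from earlier; a hypothetical strategy, not a prescribed one): start with one burst of buffering and grow the buffer while underruns keep occurring.

```c
int32_t burst = AAudioStream_getFramesPerBurst(stream);
int32_t bufferSize = burst;   // lowest latency, most prone to underruns
AAudioStream_setBufferSizeInFrames(stream, bufferSize);

int32_t previousXRuns = 0;
// ... inside the audio loop, after each write:
int32_t xRuns = AAudioStream_getXRunCount(stream);
if (xRuns > previousXRuns) {
    previousXRuns = xRuns;
    bufferSize += burst;      // trade a burst of latency for stability
    AAudioStream_setBufferSizeInFrames(stream, bufferSize);
}
```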


> Latency: not only the lowest possible latency, but also a measurable/predictable one.

The “lowest” part is already answered earlier.

For measurable/predictable, see the AAudioStream_getTimestamp. It provides a presentation timestamp that can be used for this purpose.
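A rough sketch of using it to estimate output latency (assuming the released AAudio names, with `stream` an open output `AAudioStream*`; this ignores clock extrapolation to keep the idea visible):

```c
#include <time.h>  // for CLOCK_MONOTONIC

int64_t framePosition = 0;
int64_t timeNanos = 0;
// framePosition is a frame that was (or will be) presented at
// timeNanos on the requested clock.
aaudio_result_t result = AAudioStream_getTimestamp(
        stream, CLOCK_MONOTONIC, &framePosition, &timeNanos);
if (result == AAUDIO_OK) {
    int64_t framesWritten = AAudioStream_getFramesWritten(stream);
    int32_t sampleRate = AAudioStream_getSampleRate(stream);
    // Frames still queued ahead of the presented frame, in milliseconds.
    double latencyMs =
            (double)(framesWritten - framePosition) * 1000.0 / sampleRate;
}
```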


> - Full support for class-compliant USB devices: now it is limited and the latency makes it unusable (even on the google pixel!)

We know about the multi-channel (>8), latency, and mixer-control limitations. Are there any other limits that affect you? That will help us prioritize requests.


> - Reliable native audio encoding AND decoding, matching the capabilities of the Java API.

For this, we recommend native MediaCodec. See:


There is no online documentation yet for the native C language binding, but hopefully the above links are sufficient until then.


Thank you for your other requests also; we will consider them.

Phil Burk

Athos Bacchiocchi

unread,
Mar 26, 2017, 11:09:03 PM3/26/17
to android-ndk
Phil, thanks for taking the time to answer in such detail, much appreciated! I am very happy and looking forward to knowing more about the progress on this, especially regarding the threading API and the timestamps. And thanks for pointing to the native MediaCodec API, I had overlooked that.

Regarding USB, apart from the latency, 8+ channel and controls support, being able to query a device's supported configurations would probably be helpful. I understand you are trying to keep it as transparent as possible for the client, but while it's easy to stick to the native sample rate, buffer size, and 1/2 channels with onboard audio, it's less so with USB devices, with their varying support for different sample rates and numbers of channels.

Also, native callbacks for changes in audio routing would help. Currently we are relying on a Java receiver listening for a bunch of heterogeneous events (headset un/plug, Bluetooth dis/connect, USB dis/connect), and this is not straightforward to handle.

One last thing, unrelated to USB, would be the ability to control the pre-ADC input gain where available.

Athos

Don Turner

unread,
Mar 27, 2017, 9:40:21 AM3/27/17
to android-ndk
Hi Athos, 

To query the sample rates and channels supported by a USB audio device you can use the Java AudioDeviceInfo class introduced in API 23: https://developer.android.com/reference/android/media/AudioDeviceInfo.html
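A sketch of that query, in Java since AudioDeviceInfo is a framework class (assuming API 23+ and a `context` available; the log tag is arbitrary):

```java
import java.util.Arrays;
import android.content.Context;
import android.media.AudioDeviceInfo;
import android.media.AudioManager;
import android.util.Log;

AudioManager audioManager =
        (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
for (AudioDeviceInfo device :
        audioManager.getDevices(AudioManager.GET_DEVICES_ALL)) {
    if (device.getType() == AudioDeviceInfo.TYPE_USB_DEVICE) {
        // Empty arrays mean the device places no restriction.
        int[] sampleRates = device.getSampleRates();
        int[] channelCounts = device.getChannelCounts();
        Log.i("AudioDevices", device.getProductName()
                + " rates=" + Arrays.toString(sampleRates)
                + " channels=" + Arrays.toString(channelCounts));
    }
}
```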

You may also want to investigate the AcquireJavaProxy method, introduced into OpenSL ES in API 24, which allows you to obtain an AudioRouting object from the NDK. Here's some sample code:

    // Assumes bqPlayerObject is a realized OpenSL ES player object and
    // <SLES/OpenSLES_AndroidConfiguration.h> is included (API 24+).
    SLAndroidConfigurationItf configItf;
    jobject audioTrack = NULL;

    // get the Android configuration interface
    result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_ANDROIDCONFIGURATION,
                                             &configItf);
    // obtain the Java AudioTrack proxy backing this player
    result = (*configItf)->AcquireJavaProxy(configItf, SL_ANDROID_JAVA_PROXY_ROUTING,
                                            &audioTrack);

Hope that's helpful, 

Don

Gábor Szántó

unread,
Mar 28, 2017, 11:00:08 AM3/28/17
to android-ndk
Third-party tools exist to solve the following problems:

- CPU frequency scaling (not the fake-touch one, but something which works on all Android devices)
- pro USB audio (a user-space solution fully embeddable into a user application)