NDK and future instruction sets


Biosopher

Jan 6, 2010, 3:55:16 PM1/6/10
to android-ndk
Looks like I need to port my digital sound processing code to the NDK
for the performance I'm looking for. A few questions though.

The NDK docs say that if I use only NDK library headers, then my
application should not break if the user performs an over-the-air
system update. Great! It also says that the only currently supported
instruction set is ARMv5TE. Since all public devices today are ARMv5TE,
that's great as well.

However, once other instruction sets are in use, how do people expect
we will go about compiling for them? Ideally the Android NDK will
always ship with the ability to compile for all instruction sets
and will dynamically determine which native implementation to use. Is
this what people expect?

My guess is people would simply respond "YES" to the above.
However... what cross-platform concerns might I be missing here? If I
code only to the NDK library headers, is there anything else
I should be concerned about as more instruction sets are introduced?

Sorry to be so vague here. I've never dealt with native
implementations before and want to fully understand potential problems
that might arise.

Kevin Duffey

Jan 6, 2010, 11:15:23 PM1/6/10
to andro...@googlegroups.com
First of all.. can you elaborate on your sound stuff? I have been itching to build a drum-machine-like app with real-time capabilities, multi-touch pads, etc. So far it seems impossible to do, because Android has not provided low-latency audio libraries for us to use. Even if we do the mixing of samples at the NDK level, playing a sound almost as soon as a pad is touched on the screen appears, from what I've read thus far, to be pretty much impossible, much less multiple pads at once.

From what I've read on various forums, until Google builds in some sort of access to low-latency audio hardware, we're stuck with very basic audio capabilities. I could be wrong on this; I've only found a few posts and one online webinar video that pointed to this limitation of the Android platform regardless of the actual hardware available. I find it hard to believe my Moto Droid is any less capable of real-time music apps than an iPhone... but apparently the iPhone SDK provides low-latency APIs to the sound hardware... and naturally Apple only has the one phone to worry about.

But your assumption should be doubly correct... if you build to the NDK headers, your code will work on all devices that support them. Likewise, if new hardware comes out for Android devices, Google should be responsible for updating the NDK to support those hardware platforms. Which is great.. but then, how do WE compile our code for the different hardware? I could be wrong (I've not done any NDK work yet), but I assume developing on Ubuntu with GCC lets me build for the ARM processor just fine. Beyond getting the NDK itself, I don't know whether I need to install a special C compiler as well, or whether one is bundled with the NDK. I'm guessing it's not part of the NDK and we're responsible in some way for getting it. From what I've read, on Windows developers have to install Cygwin for it to compile. I hope I don't need anything else for GCC.. perhaps an ARM cross-compiler, though; I'm not sure whether GCC can produce ARM instruction code out of the box (does it?).

I just hope we get solid audio.. and better video SDK/API access sooner rather than later. It's going to be tough to compete with the iPhone on games and audio apps until we do, and since those are (combined) the biggest slice of the apps users want and are willing to pay for, it would be greatly beneficial for Google and team to get something in place soon for those of us who want to produce music apps and high-end video games. Perhaps by the 2.5 release (end of year 2010?). If Android is predicted to overtake iPhone market share by 2012, we're going to need some sort of API/SDK by the end of this year, or we won't have time to develop apps and get them established on the market by 2012. :D







David Turner

Jan 7, 2010, 5:47:31 AM1/7/10
to andro...@googlegroups.com
On Wed, Jan 6, 2010 at 12:55 PM, Biosopher <aste...@gracenote.com> wrote:
However once other instructions sets are used, how do people predict
we will go about compiling for them?  Ideally the Android NDK will
always ship with the ability to compile across all instruction sets
and will dynamically determine which native implementation to use.  Is
this what people expect?


Yes, the NDK will be updated to provide everything necessary to build your
source code for multiple CPU instruction sets, without changing anything to
your Android.mk files.

Android Market will filter the list of applications provided to your device
based on its CPU. So the worst case is that your ARMv5 device will not see
applications that require ARMv7 or x86 machine code. If you try to manually
install such an application, the Package Manager will refuse to install it.
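David's answer can be sketched concretely with how multi-ABI builds later surfaced in the NDK: you list the target instruction sets in `Application.mk`, and the build produces one `.apk` carrying a native library per ABI. The ABI names below are from later NDK releases, shown here as an illustration rather than something stated in this thread:

```make
# Application.mk -- build the same native sources once per ABI.
# The .apk then carries lib/<abi>/libmylib.so for each entry, and the
# device's package manager installs the copy matching its CPU.
APP_ABI := armeabi armeabi-v7a x86
```

Nothing in the corresponding `Android.mk` has to change, which matches David's "without changing anything to your Android.mk files."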
 

Biosopher

Jan 7, 2010, 12:44:33 PM1/7/10
to android-ndk
Hi Kevin,

I'm writing a processing-intensive digital sound processing app
(it requires on the order of 50,000 Fast Fourier Transforms, FFTs).
We are using the new AudioRecord API (v1.5 Donut) to access the raw
PCM data acquired via the phone's microphone. The PCM data is then
transformed into a unique fingerprint that we use to perform
audio recognition for background song identification.

For usability reasons, we need the sound processing to occur ideally
in the sub-second range. My Java implementation was taking ~10
seconds, which was unacceptable from a user perspective. I've since
begun porting the code to Android's native NDK solution. Initial
tests are promising, so we expect to have a solution performing our
FFTs in ~1 second.

I have implemented some audio-playing functionality for a few games
and ran into some latency issues; pre-loading the sounds resolved
them, though. In your case, it appears you want to capture a
touch gesture, compute a sound, and then dynamically generate and play
that sound. I don't have experience there; the only real
challenge would seem to be playing the generated sound. So am I to
assume that the various sound-generating apps out there (Ocarina,
Band, Sonic Vox) don't require the same audio-generation performance
that you require?

E.g. as Smule says of Sonic Vox: "The Sonic Vox iPhone app from Smule
is a brand new audio engine that allows you to do real-time audio
processing on the Apple cell phone."

Biosopher

Jan 7, 2010, 12:48:04 PM1/7/10
to android-ndk
Hi David,

Ideally we could have a single .apk that supports multiple instruction
sets, or the Market would support our uploading one .apk per
instruction set.

Is that what the Android team envisions? We don't mind supporting many
instruction sets as long as the creation and distribution side is
kept simple.

Thanks

David Turner

Jan 7, 2010, 1:02:22 PM1/7/10
to andro...@googlegroups.com
I can't announce much for now, but care has been taken to make this as simple as possible for application developers.

Kevin Duffey

Jan 7, 2010, 4:07:06 PM1/7/10
to andro...@googlegroups.com
David.. assuming I am correct and you work for Google (or on the Android platform, based on your email), I understand the need to keep things quiet until they are fleshed out and ready for alpha/beta use. I would just like to know whether we're going to see something soon that lets us not only use low-latency audio for games and music apps (playback), but also respond to touches fast enough to play a sound almost immediately. The current crop of apps I've seen has about half a second of delay; they are definitely not responsive enough to compare to a program like BeatMaker on the iPhone. I would love to know that, at least on the audio front, we'll see such capabilities for Android sooner rather than later.. hopefully regardless of device, or at least on those that are 2.0+ compatible. I'm guessing a lot of 1.6 devices may not upgrade to 2.0 due to larger size and resource requirements, but it sounds like some may still allow for it.

It would also be great to have a high-resolution timer we could use for game loops, audio sync and so forth... I'm not sure whether that is possible at the NDK level yet.
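On the timer question: Android's native layer is Linux, so POSIX `clock_gettime(CLOCK_MONOTONIC)` is available to NDK code and returns nanosecond-resolution timestamps. A generic POSIX sketch (not Android-specific) of the kind of helper a game loop or audio scheduler would use:

```c
#define _POSIX_C_SOURCE 199309L
#include <time.h>

/* Monotonic timestamp in nanoseconds. CLOCK_MONOTONIC is immune to
   wall-clock (NTP/user) adjustments, which is exactly what game loops
   and audio sync need: intervals measured with it never jump. */
long long now_ns(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (long long)ts.tv_sec * 1000000000LL + ts.tv_nsec;
}
```

Note that the resolution of the underlying clock is hardware-dependent; whether it was fine-grained on any particular 2010 handset is a separate question from the API being present.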

Biosopher.. so far the few free apps I've tried have not been very responsive, and from the video blog I saw a few months ago, it didn't sound like they were hopeful we'd have these capabilities anytime soon. OpenCL looked promising, but there is no time frame for when we might see something like that on Android.

Biosopher

Jan 7, 2010, 4:20:22 PM1/7/10
to android-ndk
Hi Kevin,

Sorry for my confused ramblings above. I code on both the iPhone and
Android, so I had a slight brain fart just now: the apps I listed are
only available on the iPhone. Yes... Android has major limitations
when it comes to accessing audio/video functionality.

I doubt these limitations will change if we continue to rely only on
the official Android team. People like us in the community must get
more involved in submitting improvements to the Android project.
That said, as David's replies show, the 'official' Android team (aka
Google) needs to be far more open about what's going on; otherwise
non-official contributors like us will never be able to contribute to
the effort at the level we would like.

Kevin Duffey

Jan 7, 2010, 4:24:48 PM1/7/10
to andro...@googlegroups.com
Thank you for clearing that up... I was starting out on the iPhone, but the Objective-C language, and having to own a Mac to work on it, turned me off. Having been a Java developer for 11 years, I find development for Android far easier, and being able to do it in any IDE, on any platform, makes it far better to get into than iPhone development.. not to mention the likelihood of being rejected on the iPhone side. On another thread there is mention of the 24MB heap limit for apps.. do you know what the iPhone limit is in comparison? Sorry, it's a bit off topic; I'd be happy to take this to private email if that's better.

David Turner

Jan 7, 2010, 5:31:46 PM1/7/10
to andro...@googlegroups.com
On Thu, Jan 7, 2010 at 1:07 PM, Kevin Duffey <andj...@gmail.com> wrote:
David..  assuming I am correct and you work for google (or part of android platform based on your email), I understand the need to keep things quiet until fleshed out and ready for alpha/beta use. I would just like to know if we're going to see something soon that will allow us to not only potentially utilize low latency audio for games and music apps (playback), but also the ability to respond to touch responses fast enough to play a sound almost immediately. The current crop of apps I've seen seem to be about 1/2 a second delay or so. They definitely are not responsive enough to compare to a program like BeatMaker on iPhone. I would love to know, at least on the audio front, that sometime sooner than later, we'll see such capabilities for Android as well.. and hopefully regardless of device.. or at least those that are 2.0+ compatible. I am guessing a lot of 1.6 devices may not upgrade to 2.0 due to larger size and requirements possibly, but it sounds like some may allow for it still. It would of course be great if there was a low resolution timer available that we could use for game loops, audio synch and so forth... not sure if that is possible at the NDK level or not yet.


I do work for Google. I designed and maintain the NDK, among other things. I can't comment on low-latency audio access either from Java or the NDK however because I'm not directly responsible for this. I know that the media team is working on ways to make multimedia development on Android much better but I hate to give details or estimates regarding work I'm not directly involved with, so I won't.

At a very minimum, we're targeting OpenSL ES for audio access through the NDK (we prefer providing standard APIs whenever that makes sense). I can't say anything else.
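For context, OpenSL ES is a Khronos Group standard for embedded audio, and every OpenSL program starts from a single engine object. A minimal bring-up sketch following the OpenSL ES 1.0.1 specification (this needs an Android device and its OpenSL library to actually run; it is shown only to illustrate the shape of the API David is referring to):

```c
#include <SLES/OpenSLES.h>

SLObjectItf engine_obj;  /* root object of the OpenSL object graph */
SLEngineItf engine;      /* interface used to create players/recorders */

void init_engine(void) {
    /* Create and realize the engine, then fetch its engine interface. */
    slCreateEngine(&engine_obj, 0, NULL, 0, NULL, NULL);
    (*engine_obj)->Realize(engine_obj, SL_BOOLEAN_FALSE); /* synchronous */
    (*engine_obj)->GetInterface(engine_obj, SL_IID_ENGINE, &engine);
}
```

From the engine interface one then creates output mixes and buffer-queue audio players, which is the path to the lower-latency playback this thread is asking about.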