Guitar tuner using AKAudioAnalyzer

Stephane Peter

Mar 11, 2015, 7:08:07 PM3/11/15
to audi...@googlegroups.com
Hey all,

I'm trying to put together my first project using AudioKit on iOS. I'm planning to make a simple guitar tuner as a way to explore the API. If it all works out, I'd like to switch some of my other projects away from OpenAL to AudioKit, and I do need to do some audio analysis, so this seems like a good fit.

I looked at the AudioKitDemo project since it seems to have a good example of how to use AKAudioAnalyzer. However, playing with this a little bit, I noticed that the signal amplitude seems to be abnormally low. On the same device, I can plug in a guitar to an audio adapter and use pretty much any other tuner app. With AudioKit, the audio that seems to get picked up from the microphone is somehow much lower on the same hardware.

Does anybody have experience dealing with real-time audio input in AudioKit? It seems to me like something is probably missing in the way the audio input channel is set up (maybe the input gain needs to be set higher). It's definitely doable, since I have apps on the same device for which this works properly.

Ideally, I want users to be able to plug their guitar into their device with one of the many instrument cable adapters available (or even just use the microphone for acoustic instruments). And I'd love to make sure that I can use AudioKit for this.


Aurelius Prochazka Ph.D.

Mar 11, 2015, 7:14:34 PM3/11/15
to audi...@googlegroups.com
Hi,

I have done a lot of microphone-based input processing with AudioKit, but I haven't yet actually plugged a guitar dongle into my iPad/iPhone to see whether there is a level issue.  I suspect that AudioKit uses the raw input values, whereas other software might apply a gain straight away.  Not a problem: you can do that as well using a variety of techniques, the easiest probably being to scale the microphone's values when you first use it:

... microphone.scaledBy(akp(2.0))...

to double the amplitudes for instance.  As always, if you'd like me to help you out on Screenhero.com, just email me and I'll send you an invite.

Aure

Stephane Peter

Mar 11, 2015, 11:59:45 PM3/11/15
to audi...@googlegroups.com
Thanks for the hint, I'll give that a try. 

When I looked a bit deeper yesterday at how AudioKit is implemented on top of Csound, I noticed that the AVAudioSession object being set up in CsoundObj.m has some methods to set an input gain, but I didn't see a way to access them through the AudioKit APIs. I'm not sure that's necessarily part of the answer to my problem at this point, but I'm wondering if it would be a good idea to expose this through AKManager at some point in the future.
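For reference, the session Csound configures is the shared AVAudioSession, so the gain can also be adjusted from application code directly. A minimal sketch using Apple's AVAudioSession API (in modern Swift spelling; the helper name is mine, and not every input route allows setting the gain):

```swift
import AVFoundation

// Sketch: try to raise the hardware input gain on the shared session.
// inputGain is normalized to 0.0...1.0, and isInputGainSettable reports
// whether the current input route allows changing it at all.
func setHardwareInputGain(_ gain: Float) -> Bool {
    let session = AVAudioSession.sharedInstance()
    guard session.isInputGainSettable else { return false }
    do {
        try session.setInputGain(max(0.0, min(1.0, gain)))
        return true
    } catch {
        return false
    }
}
```

Note that this only helps on routes where the hardware gain is settable; a plain microphone input often is, while some dongles are not.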

Stephane

Aurelius Prochazka Ph.D.

Mar 12, 2015, 12:09:58 AM3/12/15
to audi...@googlegroups.com
The default input gain of the AVAudioSession is 1.0, which is the maximum.  So after that, your option is software scaling, as I described.
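The per-sample arithmetic behind that software scaling can be sketched in plain Swift (the function name is mine, for illustration only; the clamp is also my addition, since large gains would otherwise push samples out of range):

```swift
// Illustration only: scale raw samples by a gain factor and clamp the
// result to the valid [-1.0, 1.0] range. AudioKit's scaledBy(akp(2.0))
// performs the same multiplication per sample inside the signal graph.
func applyGain(_ samples: [Float], gain: Float) -> [Float] {
    return samples.map { max(-1.0, min(1.0, $0 * gain)) }
}
```

For example, applyGain([0.25, 0.8], gain: 2.0) yields [0.5, 1.0]: the first sample is doubled, and the second is clamped at full scale.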

Aure

Stephane Peter

Mar 12, 2015, 10:25:08 PM3/12/15
to audi...@googlegroups.com
OK, so I'm still trying to understand how this API works, but I have this code in a new instrument (derived from AKInstrument):

        microphone = AKAudioInput()
        self.connect(microphone)

        auxOutput = AKAudio.globalParameter()
        self.assignOutput(auxOutput, to: microphone)

Now, I'd like to add the scaling parameter, but if I use microphone.scaledBy(akp(2.0)) then the instrument gets no audio at all. What's the proper way to do this?

Aurelius Prochazka Ph.D.

Mar 12, 2015, 10:32:44 PM3/12/15
to audi...@googlegroups.com
This works for me.  Change 2.0 to whatever you want, or make it an instrument property that can be set.

class VocalInput: AKInstrument {

    let auxilliaryOutput = AKAudio.globalParameter()

    override init() {
        super.init()

        let microphone = AKAudioInput()
        connect(microphone)

        // Wrap the scaled signal in an AKAssignment so it can be
        // connected to the instrument and routed as an output.
        let scaledMicrophone = AKAssignment(input: microphone.scaledBy(akp(2.0)))
        connect(scaledMicrophone)

        assignOutput(auxilliaryOutput, to: scaledMicrophone)
    }
}

HTH,
Aure

Stephane Peter

Mar 13, 2015, 12:32:53 AM3/13/15
to audi...@googlegroups.com
Great, that seems to work! I just didn't know to use the AKAssignment object here...

Thanks for your help!

Aurelius Prochazka Ph.D.

Mar 13, 2015, 12:37:06 AM3/13/15
to audi...@googlegroups.com
Yeah, I only had to use AKAssignment because there was no further processing being done inside your example.  You could also have scaled it in the instrument that receives the signal.  But since that was probably AKAudioAnalyzer, I think it's fine to do it as I outlined.  Lots of different ways to skin that cat.

Aure

Stéphane Peter

Mar 29, 2015, 1:53:02 PM3/29/15
to audi...@googlegroups.com
Oddly enough, I just upgraded to AudioKit 2.0 and the code above (using AKAssignment) no longer seems to work. I've removed the `connect()` calls since they no longer seem to be needed, but am I missing something here? Has the way this works changed in the API?

If I assign the output to the microphone object, then I get audio, but the scaledMicrophone object no longer seems to do anything.

Aurelius Prochazka Ph.D.

Mar 29, 2015, 1:54:11 PM3/29/15
to audi...@googlegroups.com
You found a bug!  Thank you, I fixed it in the development branch.

Aure