I'm working on an app that needs good pitch detection. Unfortunately, I have problems with low notes, for example from a guitar.
What I tried:
I got detection working with the help of the AudioKit demo (AKAudioAnalyzer). To improve it, I maxed out hopSize and tried different peaks parameters for AKTrackedFrequency. I also added a low-pass filter (750 Hz) to isolate the signal and experimented with different EQ settings that I determined by analyzing the signal externally. I noticed that all my Apple devices (iPhone 5s, iPad 4th Gen, MacBook Pro Retina 13") lose signal below 120 Hz and boost it around 200 Hz. As a result, notes with a fundamental around 100 Hz very often get tracked one octave too high. The difference between the fundamental and the first overtone is about 20-30 dB. I tried to compensate for this with corresponding EQ settings, but couldn't get much better results.
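To show what I mean by the octave error, here is a minimal sketch (Python/NumPy, not AudioKit itself; the signal is synthetic and the -25 dB gap is just a value from the range I measured): a tone whose 100 Hz fundamental sits 25 dB below its 200 Hz overtone. A naive "loudest spectral peak" tracker reports the overtone, i.e. one octave too high.

```python
import numpy as np

SR = 44100          # sample rate in Hz
N = 1 << 15         # analysis window length (~0.74 s)

# Synthetic "guitar-like" tone: 100 Hz fundamental 25 dB weaker
# than its first overtone at 200 Hz (hypothetical amplitudes).
t = np.arange(N) / SR
fund_amp = 10 ** (-25 / 20)   # -25 dB relative to the overtone
x = fund_amp * np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 200 * t)

# Magnitude spectrum of one Hann-windowed frame.
spectrum = np.abs(np.fft.rfft(x * np.hanning(N)))
freqs = np.fft.rfftfreq(N, 1 / SR)

# A naive "loudest peak" rule picks ~200 Hz, not the 100 Hz fundamental.
naive = freqs[np.argmax(spectrum)]
print(naive)   # reports roughly 200 Hz
```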
What I wanted to try but wasn't yet able to:
AKTrackedFrequency uses multiple peaks to determine the fundamental frequency. I suppose the algorithm applies something like an allowed amplitude range between the first two partials and "thinks" a gap of 20-30 dB is too large for those two peaks to belong together. I'd love to look into that algorithm and possibly improve it for lower frequencies, but I can't find out where to look, since AKTrackedFrequency exposes almost no functions and the documentation doesn't explain how it does what it does.
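This is only my guess at the kind of check involved, but if the tracker really rejects the fundamental because of the dB gap, a post-processing fix could be to test for a subharmonic: after the tracker reports a frequency, check whether there is spectral energy at half that frequency within a generous tolerance, and prefer it if so. A sketch (the tolerance of 35 dB and the noise floor of -60 dB are hypothetical values chosen to accept the 20-30 dB gap I measured):

```python
import numpy as np

def correct_octave(freqs, mags_db, f_est, tol_db=35.0, noise_floor_db=-60.0):
    """If a subharmonic at f_est/2 lies within tol_db of the detected peak
    (and above the noise floor), assume f_est is one octave too high and
    return f_est/2; otherwise keep f_est. tol_db and noise_floor_db are
    hypothetical tuning values, not AudioKit parameters."""
    peak_db = np.interp(f_est, freqs, mags_db)       # level at detected pitch
    sub_db = np.interp(f_est / 2, freqs, mags_db)    # level one octave down
    if sub_db > noise_floor_db and peak_db - sub_db < tol_db:
        return f_est / 2
    return f_est

# Toy spectrum: fundamental at 100 Hz sits 25 dB below the 200 Hz overtone,
# so a detection of 200 Hz gets corrected down an octave.
freqs = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
mags_db = np.array([-80.0, -25.0, -80.0, 0.0, -80.0])
print(correct_octave(freqs, mags_db, 200.0))   # → 100.0
```

This obviously can't replace fixing the tracker itself, but it might paper over the octave errors until I find where the peak logic lives.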
Call for Help:
Can someone point me to the place in the source where AKTrackedFrequency determines its peaks?
Thanks a lot!