I started playing a bit with iOS development and I'm wondering if there are any communities or "must-have" resources for iOS audio development specifically: forums, tutorials, articles... Even if you just know some very good resources on iOS development or audio (and MIDI) programming in general, that would be interesting for me too.
There are a lot of tutorials on iOS/Swift all over the web but, of course, 99% of them usually aren't worth it. I've been a (dare I say "professional") developer for 15 years, currently doing web frontend, so I'm not really interested in basics like what a variable, function, or loop is, how and why to do unit testing, what a build process is, what design patterns are, etc. I'd rather focus on the specifics of Swift and Xcode and, of course, audio/MIDI processing.
I have checked out some of the Audiobus documentation and started playing around with AudioKit, which also has some interesting tutorials, so I am definitely going to dig into those.
Please feed me some goodness!
AudioKit is absolutely a good place to look. I haven't developed anything with it myself, but I have looked at the demos and samples and read the docs. It could be very useful if it fits with what you want to develop.
There is actually a lot of information on the web about working in audio, but wading through it and figuring out where to start is not an easy task. A bit more information about what you want to do might be really helpful for pointing you at useful resources. Also, knowing about your math and engineering background could help.
Adding to the MIDI section:
These guys have a nice framework/library (used by a lot of "big selling" developers for MIDI tasks, so there must be a reason... and it saves you from reinventing the wheel).
Thanks, I've heard about JUCE many times and it looks very powerful, but I'm not sure it wouldn't be overwhelming to start with. Also, I would like to focus only on iOS (and eventually macOS, if Marzipan proves to be the way to go) to keep it simple, so I wouldn't be using JUCE's biggest advantage: multiplatform support.
My current plan for an "MVP" app is a simple looper grid (X = channels, Y = scenes, cells are playable/stoppable buttons), ideally able to trigger both audio and MIDI. That would already require quite a lot of things to embrace and implement: transport, tempo, time stretching, a mixer, MIDI (only recording from an external source, though I believe even that is a huge challenge) and, of course, the whole GUI, layout, etc. I would like to make it Audiobus compatible first (and of course Core MIDI compatible); IAA is probably not the way to go now.
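To make the grid idea concrete, here is a toy sketch of the grid state in plain C. Everything here is made up for illustration (no AudioKit or Audiobus API involved), and it assumes Ableton-style exclusivity, i.e. at most one playing cell per channel:

```c
#include <stdbool.h>

/* Toy model of a looper grid: X = channels, Y = scenes.
 * All names are hypothetical; sizes are fixed for simplicity. */
#define CHANNELS 4
#define SCENES   4

typedef struct {
    bool playing[CHANNELS][SCENES];
} Grid;

/* Tapping a cell toggles it. Starting a cell first stops every
 * other cell in the same channel, so a channel plays one loop at a time. */
void grid_tap(Grid *g, int channel, int scene)
{
    bool wasPlaying = g->playing[channel][scene];
    for (int s = 0; s < SCENES; s++)
        g->playing[channel][s] = false;
    g->playing[channel][scene] = !wasPlaying;
}
```

In a real app this state machine would live behind the UI and drive the transport/mixer; the exclusivity rule is just one possible design choice.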
I am not interested in building a synth or a sound effect, which is why I would like to start by using AudioKit (and maybe other frameworks) as much as possible. I don't want to have control over everything; rather the opposite. I just want to get a working app in the easiest (but still reasonable) way possible.
As for my engineering background: I haven't programmed in C or C++. I tried Objective-C a long time ago to make a simple iOS app (around iOS 5-6?) but don't remember any of it.
But I have started learning Swift. It looks very similar to the TypeScript I use on a daily basis, and I would love to stick with it as much as possible, as I like it much more than Objective-C.
Keep in mind that for audio work you'll need your code to be "realtime safe". In practice this means that any DSP/MIDI work that happens on the realtime thread needs to be written in C (or carefully checked C++). High-level languages like Swift or Objective-C, and the convenient parts of C++ (hello, vectors!), are essentially a big no-no.
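To illustrate what "realtime safe" looks like in practice, here's a minimal sketch of a render-callback-style function in C. The names and structure are invented for the example (this is not a Core Audio or AudioKit signature); the point is what it *doesn't* do: no allocation, no locks, no Objective-C/Swift calls, and parameter changes from the UI thread arrive through a lock-free atomic:

```c
#include <stdatomic.h>
#include <stddef.h>

/* Hypothetical shared state: written by the UI thread,
 * read by the audio thread without taking a lock. */
typedef struct {
    _Atomic float gain;
} GainState;

/* Everything this function touches is preallocated by the caller.
 * Pure arithmetic on the buffer keeps it safe on the realtime thread. */
void render(GainState *state, float *buffer, size_t frames)
{
    /* One lock-free read per buffer; relaxed ordering is enough here. */
    float g = atomic_load_explicit(&state->gain, memory_order_relaxed);
    for (size_t i = 0; i < frames; i++)
        buffer[i] *= g;
}
```

The same discipline applies whatever framework you use: the audio thread only reads preallocated memory and atomics, and anything that might block (malloc, mutexes, file I/O, message sends) stays on other threads.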
I would definitely agree with your starting point of using AudioKit. For one thing, it's going to allow you to do your project in Swift while maintaining the critical point that @brambos makes about keeping the audio thread realtime safe (and this really is the most important thing in an audio setting).
Since you do want the possibility of putting your application on macOS eventually, SwiftUI would probably be the easiest way to do that. SwiftUI also looks to me like a great way to do UI development in general. The only problem is that it is still in beta (for a few more months) and I don't have any feel for how it is going to mix with AudioKit. I'd try to see if anyone in the AudioKit world could give you pointers on this, and start learning SwiftUI if it looks like it is going to be workable in the near term. If SwiftUI doesn't look like it's going to play well with AudioKit any time soon, UIKit will still get you where you want to go.
I'd suggest that you still learn C and C++, because eventually you are likely to need to dig in a bit deeper and extend or alter something in AudioKit. I think it's best to learn straight C first. I do all of my DSP work in a very small subset of C++ (basically C with a bit of C++ polymorphism), and understanding C and how it works is still the most important part of this. I think you'll get a long way using AudioKit without getting into C/C++ (maybe all the way), but it's still good to be versed in C, even if all you need it for is understanding what the AudioKit code is doing when you have to read it.
Thanks a lot, this was a very interesting read. I am still slowly discovering all these specifics of audio development, and articles like this help a lot!
And yes, as @Antkn33 said, it would be extremely valuable if you'd share some of your wisdom about making iOS apps. I am still amazed by your skills in both UX and development. Your apps are super solid; the Rozeta suite was AFAIK the first AU MIDI plugin out there, and all your apps look and feel amazing.
@NeonSilicon said:
I would definitely agree with your starting point of using AudioKit. For one thing, it's going to allow you to do your project in Swift while maintaining the critical point that @brambos makes about keeping the audio thread realtime safe (and this really is the most important thing in an audio setting).
Thanks for your words. Actually, this is exactly my thinking right now. I have already gone through Apple's SwiftUI tutorial and it's awesome; it reminds me a lot of React from the JavaScript world, which I use in my day job. I have already tried to use it in my test app together with AudioKit but, as you say, it's not currently possible: SwiftUI requires Swift 5.1 while AudioKit currently runs on 5.0. It looks like the guys are aware of this and are going to update it. I'm still not sure they'll mix well together, so for now I am just building my view layer in UIKit.