AudioKit <-> Main thread async communication


ramon...@gmail.com

Jun 15, 2017, 3:50:23 PM
to AudioKit Users Group
Hi there,

I started with AudioKit not long ago, coming from The Amazing Audio Engine, and I was wondering whether there is a better way to communicate with the main thread than DispatchQueue.main.async.
In a former project I used TPCircularBuffer to message the main thread. I see it included in AudioKit's internals, but it isn't exposed anywhere.

Any hints?

Dave O'Neill

Jun 15, 2017, 9:19:34 PM
to AudioKit Users Group
One of the biggest differences I see between the two frameworks is AudioKit's complete separation from work being done on the render threads.  The convention for dealing with the render thread seems to be to create an audio unit that does what you want, then build a Swift/Objective-C interface to it that doesn't have access to any render callbacks.  Is there a specific use case, or are you looking for a general solution?

ramon...@gmail.com

Jun 20, 2017, 7:31:05 PM
to AudioKit Users Group
A specific use case is driving graphical events from a (real-time) sequencer. Think "Guitar Hero". Or real-time sequence editing with feedback (e.g. notes lighting up when played).

Dave O'Neill

Jun 20, 2017, 10:16:31 PM
to AudioKit Users Group
I have a similar need; installTapOnBus:bufferSize:format:block: only gets called every 100 ms, which is just too slow.

My solution is not as general as Michael Tyson's, which uses variable-sized structs, but it works for specific use cases.  The only drawback is that you have to define a single C struct to carry the data from the render thread.  Just write these structs to a TPCircularBuffer from the render callback, then read and consume them from a timer firing on the main thread.  Set your timer's interval to AVAudioSession.sharedInstance().IOBufferDuration and you'll get a response time of at most IOBufferDuration * 2.  I created a MultiChannelMixer AVAudioUnit and added a render notify callback to access the render thread, but this isn't the canonical way to do it in AudioKit; I think we're supposed to subclass AKNode, AKAudioUnit, and AKSoundpipeKernel like all of the other effects.

While this technique will get the job done, it would be great to create a more general solution.  If we could call Swift functions from the render callback it would be easy, but that's essentially against the rules until Swift's memory management becomes more flexible.  At some point I might try to add a fastTap feature using the same technique detailed above: copy the entire audio buffer over with TPCircularBuffer+AudioBufferList, and just expose unsafe variables in a Swift closure.  You could use AVAudioBuffers on such a fastTap, but that would mean a lot of allocations/deallocations, so it might not be very efficient.

We'd also have to make a MIDI version of fastTap to get MIDI data over to the main thread as well.  Maybe the combination of those two structs would satisfy most users' needs.

I'd love some input on this subject.  Does anyone have any good ideas for a general solution with a Swift interface?

Aurelius Prochazka Ph.D.

Jun 22, 2017, 1:51:08 PM
to AudioKit Users Group
Have you taken a look at the way the new playground for Metronome works?  

let metronome = AKMetronome()

metronome.callback = {
    // stuff you need to have happen regularly goes in here...
}


or the periodic functions:

let performance = AKPeriodicFunction(frequency: playRate) {
    // stuff you need to perform regularly here
}

AudioKit.start(withPeriodicFunctions: performance)


This is demonstrated in the Plucked string playground, and elsewhere.

These are different ways to address the issue, and I also like Dave O'Neill's ideas.  There's no reason not to have all of them available in AudioKit.  I'm sure they'll each have their pros and cons, and people just think about things differently and may like one solution more than another.

Aure


ramon...@gmail.com

Jun 22, 2017, 2:24:53 PM
to AudioKit Users Group
The Amazing Audio Engine had good facilities for this kind of thing, e.g.:
[_audioController performAsynchronousMessageExchangeWithBlock:...]
The trick was not to do anything Objective-C/locks/etc. inside the block.

For example, I used it to communicate with our custom-made C++ sequencer, which ran inside the audio thread, for simple operations (e.g. start/stop/fire sequences due to user action).

In TAAE, AEAudioControllerSendAsynchronousMessageToMainThread was the tool I used. It's basically a circular-buffer messaging system; used judiciously, it could send real-time events back to the main thread for UI feedback.
We used it for "game" events too, which had to be synced to the triggering of audio. Then you use either a normal timer or the display refresh to collect the events on the main thread.