I have a similar need: installTapOnBus:bufferSize:format:block: only gets called every 100ms, which is just too slow.
My solution is not as general as Michael Tyson's, which uses variable-sized structs, but it works for specific use cases. The only drawback is that you have to define a single C struct to carry the data from the render thread. Just write these structs into a TPCircularBuffer from the render callback, then read and consume them from a timer firing on the main thread. Set the timer's interval to AVAudioSession.sharedInstance().ioBufferDuration and you'll get a response time of at most 2 * ioBufferDuration. I created a MultiChannelMixer AVAudioUnit and added a render notify callback to get access to the render thread, but this isn't the canonical way to do it in AudioKit; I think we're supposed to subclass AKNode, AKAudioUnit, and AKSoundpipeKernel like all of the other effects.
While this technique will get the job done, it would be great to create a more general solution. If we could call Swift functions from the render callback it would be easy, but that's essentially against the rules until Swift's memory management becomes more flexible. At some point I might try to add a fastTap feature using the same technique detailed above: copy the entire audio buffer over with TPCircularBuffer+AudioBufferList and expose unsafe variables in a Swift closure. You could vend AVAudioBuffers from such a fastTap, but that would mean a lot of allocations/deallocations, so it might not be very efficient.
We'd also have to make a MIDI version of fastTap to get MIDI data over to the main thread as well. Maybe the combination of those two structs would satisfy most users' needs.
I'd love some input on this subject. Does anyone have any good ideas for a general solution with a Swift interface?