Hayden from Sonoma Wire Works


Hayden

Oct 18, 2011, 4:10:46 PM
to Open Music App Collaboration
Hi all,

I appreciate the enthusiasm for doing this. I'm curious as to what you
hope to achieve that's different from Sonoma's AudioCopy/Paste
implementation. Are you talking about this being for MIDI
compatibility only rather than audio being transferred back and forth?
It looks like the conversation started with switching between apps.
That's something that AudioCopy/Paste does for every app that's on the
list already, as you noted since you're talking about using our plist
file as a template.

If you're eager to make something for MIDI, I can understand that and
I'm all for it. A lot of the MIDI apps are already using AudioCopy/
Paste. I'd be happy to assist you in any way and provide input from
our experience with AudioCopy/Paste.

I think doing MidiCopy/Paste would be awesome, but none of our
apps use MIDI currently, so as you can imagine, other things have
taken precedence. But if this group wants to steer that
train, Sonoma would gladly assist, host any files, or make
contributions where possible.

For starters, let me explain briefly why the Sonoma SDK isn't open. It
really comes down to the use of the private pasteboard. This may have
changed as iOS has progressed, but it was possible at one time to
essentially sabotage a private pasteboard if you knew the pasteboard's
name. This is why we have an agreement attached to the AudioCopy/Paste
SDK. Also, we did some things that added really awesome features that
wouldn't be possible with the general pasteboard using the Apple audio
identifier. We wanted to make a more refined standard so that apps
could share audio files with certainty, knowing that they would be
compatible with each other. This is why we also limit audio to
44.1kHz-16bit wave files.

So that's why 100% open didn't work for us. We wanted to make sure
that apps that made it onto the compatible list understood these
things and intended to follow along.

Without central control of a list of compatible apps, I don't think
the compatibility for MIDI would be as valuable. That's my input. If
you want to go another way, that's fine with us. But we would love to
be included in the discussion.

Looking forward to seeing what you come up with.

Thanks,
Hayden

nlogmusic (TempoRubato - NLogSynth)

Oct 18, 2011, 4:38:18 PM
to Open Music App Collaboration
Hi Hayden,

thanks for commenting!

Just for some clarity:

We are not talking here about MIDI pasteboards; we are talking in
general about realtime communication.

This MIDI realtime communication allows App A (a MIDI controller
app, e.g. SoundPrism, Polychord, etc.) to send MIDI events in
realtime to App B (e.g. NLog, Sunrizer, etc.) for playback, much
like you would with real MIDI cables.

All this is based on two Apple technologies: Core MIDI & Background
Audio.

We have a clear understanding that the Sonoma SDK in general is NOT
related to this.

However, there is one little aspect which both workflows have in
common:
There is a need for convenient switching between apps.

In discussing this, the Sonoma Compatible App List came to mind.

So, the only question is whether you claim any intellectual
property in the idea of a centrally stored plist with app metadata,
or whether we may also use this idea for our purpose of switching
between apps.

Again, we are not talking about your GUI implementation, just the
design pattern.

Feel free to answer in the open, or privately by email to me.

Best
Rolf

Hayden Bursk

Oct 18, 2011, 4:48:50 PM
to open-music-app...@googlegroups.com
Wonderful. That sounds like an excellent idea. I think the compatible_apps.plist is a great starting point for the switching then. You should refer to the current AudioCopy/Paste SDK for an easy example of how to parse parameters that were passed in the URL that launched the app. Sure, you can launch any app with a URL scheme, like myapp://, but the extra magic would be to pass any extra metadata that you might not be able to pull through Core MIDI, like the app that launched you and its URL scheme, so that you can launch back to it. 
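
For illustration, a rough sketch of that pattern (the "sourceScheme" parameter name and the schemes here are made up for the example, not taken from any SDK):

// Sender: launch a peer app and pass our own URL scheme so it can switch back.
#import <UIKit/UIKit.h>

void SwitchToApp(NSString *targetScheme) {
    NSString *urlString = [NSString stringWithFormat:
        @"%@://switch?sourceScheme=myapp", targetScheme];
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:urlString]];
}

// Receiver: in application:handleOpenURL:, pull the caller's scheme back out
// so a "return" button can relaunch the app that launched us.
NSString *SourceSchemeFromURL(NSURL *url) {
    for (NSString *pair in [[url query] componentsSeparatedByString:@"&"]) {
        NSArray *kv = [pair componentsSeparatedByString:@"="];
        if ([kv count] == 2 && [[kv objectAtIndex:0] isEqualToString:@"sourceScheme"])
            return [kv objectAtIndex:1];
    }
    return nil;
}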

We don't hold any IP in the compatible_apps.plist. Feel free to use it as a template. It's a simple solution. We've since started making a dynamic database that can be logged into and manipulated via a web interface, rather than trying to edit a plist file by hand. If any bogus characters get into the file, then the app will fail when it tries to parse it. So you end up having to have a dev server to try out the plist file before pushing it to the live server. Using the property list editor in OS X is the easiest way to edit it. Using a text editor can lead to headaches if you introduce a bad character. 
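
A minimal sketch of the kind of load-and-validate step that guards against those bad characters (the file name comes from this thread; the expected top-level type is an assumption):

// Load the compatible-apps plist and fail gracefully if it doesn't parse.
#import <Foundation/Foundation.h>

NSArray *LoadCompatibleApps(NSURL *plistURL) {
    NSData *data = [NSData dataWithContentsOfURL:plistURL];
    if (!data) return nil;
    id plist = [NSPropertyListSerialization propertyListWithData:data
                                                          options:NSPropertyListImmutable
                                                           format:NULL
                                                            error:NULL];
    // A stray character shows up here as a parse failure rather than a crash later.
    return [plist isKindOfClass:[NSArray class]] ? plist : nil;
}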

Hayden

Gabriel Gatzsche (Audanika - SoundPrism)

Oct 19, 2011, 6:17:06 AM
to open-music-app...@googlegroups.com
Hi Hayden, 
that's great!
   Gabriel

Rob Fielding

Oct 19, 2011, 5:18:32 PM
to open-music-app...@googlegroups.com
Excuse me if I display my ignorance about this issue in this forum,
but I have wondered for a long time. :-) What exactly *is* going on
to address actual background recording, and similarly, sending audio
into background effects processors (i.e. a stand-alone stomp-box
distortion that runs as its own process in between a sound-generating
app and an app that records)?

Background MIDI comes very close to addressing this issue, with the
remaining problems mostly being the limitations of representing
certain things in MIDI, or cases where the controller happens to
have a really great sound engine that the recording background synth
just can't match.

As I understand it, copy/paste is limited to non-realtime transfers
of audio already recorded; so I have always immediately dismissed it
as a solution to the problem of properly doing audio tracks from an
on-board controller. (I.e. music decidedly not oriented around
recording short loops, but multi-track bouncing of very long
improvisations, etc.)

Is iOS5 bringing something new? Something to do with Mach ports?
Something that cannot be discussed here? Or do I have this wrong?

--
http://rrr00bb.blogspot.com

nlogmusic (TempoRubato - NLogSynth)

Oct 19, 2011, 9:18:38 PM
to Open Music App Collaboration
Hi

"Background Audio" in Apple's definition is just one of the few
"justifications" an app can choose for why it needs to keep running
and getting CPU resources when it is put into the background, i.e.
when another app comes to the foreground.

When Apple devised this, they were thinking primarily of music players
like radio apps. So, the background audio app delivers its audio
stream directly to an output device.

There is to my knowledge currently no Apple API to exchange audio
streams between two or more apps in realtime. There is in contrast
such a thing for MIDI data using the virtual port features of Core
MIDI.

Exchanging audio in realtime between apps as you described is,
however, a feature very much requested of us. So, if Apple does not
provide anything directly, the question is whether there is anything
we can implement that at least does not break any Apple rules, i.e.
that gets through their review process.

One thing discussed here, and even in a former Google group, was the
idea of using socket connections to communicate audio between apps in
realtime. I think Mach ports are the underlying abstraction Apple
uses for implementing interprocess communication tools like sockets.
There are some things to be taken into consideration here: first, it
is quite some work to implement this reliably, and it is not clear
whether it would get through Apple's review process.

An alternative approach which recently came to mind is this: you can
view virtual Core MIDI as a structured way of doing inter-app
communication. It addresses tasks like finding out about other apps,
making connections, sending and receiving data, etc.

With SysEx you can use Core MIDI for all kinds of even non-MIDI
communication between apps, without caring about low-level things
like sockets, Mach ports, etc. It is quite obvious that SysEx'ing via
virtual MIDI could be used for all sorts of semantic info, requests,
calls, etc. between apps, including but not restricted to audio
pasteboard data.

So, how fast is virtual Core MIDI actually? I do not know, but it
seems to be fast enough for event-based realtime communication.

WOULD IT BE FAST ENOUGH FOR REALTIME AUDIO AS WELL?

Don't know, but it's worth a try. Here virtual Core MIDI would
'piggyback' audio snippets of output-buffer size or smaller.

Why am I optimistic? Well, a few arguments:

- Virtual Core MIDI has proven quite good for MIDI clock even when
the BPM goes to 400-600 and potentially more. Each quarter note it
sends 24 'clocks', which gives a clock frequency of 160-240 Hz for
the mentioned BPM range. Typical audio buffer sizes like 256 or 1024
frames have a callback frequency of 172 resp. 43 Hz (at 44.1 kHz).
So, quite similar.
-> OK, yes, the data packet size is much bigger for a piggybacked
audio buffer than for MIDI clock.

- If you were the engineer on Apple's team implementing virtual Core
MIDI, what would you use as the transport? Well, probably sockets
resp. Mach ports etc. If we are lucky, these guys were clever enough
to do it in a way that scales to the requirements of sending audio
buffers at MIDI clock rate.
-> OK, just hope, but there is a chance.

- Virtual Core MIDI messages have host time stamps, which are really
useful for sending
audio buffers. It is in fact the same data structure we get in the
audio callback.

In a nutshell: it is worth a try! I would give it a 50% chance of
working well.

Still, the question remains whether piggybacking audio in virtual
Core MIDI would make Apple's reviewers suspicious. Could be, yes! But
on the other hand, formally we are not breaking any rule, since we
can put whatever we want into SysEx. This is probably less
conspicuous than creating our own socket connections and fiddling
around with low-level system calls.
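
For the sake of concreteness, a rough sketch of what sending one piggybacked chunk might look like (the framing bytes and the naive 7-bit encoding are just assumptions for illustration; 0x7D is the non-commercial manufacturer ID):

#include <CoreMIDI/CoreMIDI.h>

// Sketch only: pack a small PCM buffer into a SysEx message and publish it on
// a virtual source created earlier with MIDISourceCreate().
static void SendAudioChunkAsSysEx(MIDIEndpointRef virtualSource,
                                  const SInt16 *samples, int count,
                                  MIDITimeStamp hostTime)
{
    // SysEx data bytes must be 7-bit, so split each 16-bit sample into three
    // 7-bit groups (a naive, bandwidth-hungry encoding, purely for illustration).
    Byte sysex[4 + count * 3];
    int len = 0;
    sysex[len++] = 0xF0;   // SysEx start
    sysex[len++] = 0x7D;   // non-commercial manufacturer ID
    sysex[len++] = 0x01;   // our own (made-up) "audio chunk" message type
    for (int i = 0; i < count; i++) {
        UInt16 s = (UInt16)samples[i];
        sysex[len++] = (s >> 14) & 0x7F;
        sysex[len++] = (s >> 7)  & 0x7F;
        sysex[len++] =  s        & 0x7F;
    }
    sysex[len++] = 0xF7;   // SysEx end

    // Wrap it in a packet list and publish it on our virtual source.
    Byte storage[sizeof(MIDIPacketList) + sizeof(sysex) + 64];
    MIDIPacketList *list = (MIDIPacketList *)storage;
    MIDIPacket *packet = MIDIPacketListInit(list);
    MIDIPacketListAdd(list, sizeof(storage), packet, hostTime, len, sysex);
    MIDIReceived(virtualSource, list);
}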

Any opinions?

Rob Fielding

Oct 19, 2011, 9:36:48 PM
to open-music-app...@googlegroups.com
I take that answer to firstly mean that AudioCopyPaste isn't more than
what I think it is; it can't be made to stream audio callback data
in real-time, or write bytes into a never-ending stream, no? I know
many users will complain if I don't support it, but I find
non-real-time audio copy/paste to be a non-solution to the problem of
an iPad controller's audio being recorded into a DAW; and I am not yet
convinced that doing this correctly is impossible.

I bet everyone in this group thinks about abusing MIDI for an
arbitrary byte stream; and I have as well. :-) But then I come to my
senses and think to myself... what a mess! But yes, MIDI can be abused
into being a subliminal channel. (I.e. an alternate way to transport
information; another example is streaming MPEG over DNS so that you
can watch internet videos over a hotspot at a cafe even though it
doesn't let you actually connect to its internet; etc.)

Mach ports come to mind, and I have read enough about them existing
on iOS to have my doubts that they are strictly forbidden. They
appear to give you an arbitrary byte stream with unknown latency
characteristics, correct? The ideal thing would be to push render
buffers into it at the place where the audio callback normally is.
Maybe it's doable but not standardized? I think the right approach is
to keep using MIDI until it just gets so weird that it's no longer
compatible enough to justify the complexity of transporting over it;
at which point we investigate OSC. (OSC seems like such a good idea,
but you can't do anything with music over WiFi, which seems to be the
current option... the latency guarantees are just nowhere near where
they need to be, for starters.)

1) Can you do Grand Central Dispatch between processes?
2) If so, are there obvious disadvantages to it, like callback
scheduling introducing latency and creating unreasonable minimum
buffer sizes?
3) Is CoreMIDI basically just some trickery with Mach ports?

--
http://rrr00bb.blogspot.com

nlogmusic (TempoRubato - NLogSynth)

Oct 19, 2011, 9:53:05 PM
to Open Music App Collaboration
re 1) I believe Grand Central Dispatch is a way to schedule tasks
within an app, to abstract away from multi-threading and give iOS
(and potentially the dev) a better way to optimize on multi-core
CPUs. But there were some sentences in the docs which said that for
realtime streaming and other heavy-duty use you are still better off
using threads directly. Apart from this, GCD to my knowledge isn't a
way to do inter-app communication, but 'just' a kind of concurrency
work balancing within an app, to replace explicit threading.

re 3) Well, yes, it has to be: iOS, like OS X, is based on the Mach
kernel. And to my knowledge ALL communication tools, APIs and layers
are eventually implemented with Mach ports. But is there any
advantage in going down to Mach ports instead of using BSD sockets?
And anyhow, my feeling is that the chances of being accepted by Apple
are better when 'just' BSD sockets are used, instead of going down to
Mach ports. But even BSD sockets may get an app rejected?

I agree that piggybacking audio via MIDI is a kludge, but it may be
well received by the reviewers since we are not directly tinkering
with low-level APIs. And look, everything including app discovery,
'ports', connections, timestamps etc. is already implemented by the
nice Apple engineers. Why not check whether we can just re-use it?

Probably better than discussing it theoretically would be to make a
short prototype and see if it works at all ;-)

nlogmusic (TempoRubato - NLogSynth)

Oct 19, 2011, 10:09:07 PM
to Open Music App Collaboration
Oh, and BTW we can piggyback OSC via virtual Core MIDI too,
to overcome the stupid WiFi latency.


Rob Fielding

Oct 19, 2011, 10:10:44 PM
to open-music-app...@googlegroups.com
Yeah. Unpacking and repacking everything just to transport it over
MIDI because we want a byte stream is wildly idiotic, but it would
work for sure!
:-)

--
http://rrr00bb.blogspot.com

Support (One Red Dog)

Oct 19, 2011, 10:14:18 PM
to open-music-app...@googlegroups.com
GCD is integrated with XPC in Lion, probably not yet in iOS

In the kernel, inter-process communication is carried out with Mach ports. The user-land API for this is CFMessagePort. So BSD sockets etc. will simply be a stack on top of that. At the basic level a task can hold a reference to a port. The port can either send or receive a queue of messages. A message is a data structure.

On OS X CFMessagePort works inter-process as well as inter-thread. The CFMessagePort is attached to a run-loop and you get a callback on receive. I'm now using CFMessagePort to do inter-thread communication in Arctic rather than the higher level NSNotification and performSelectorOnMainThread. I've yet to submit this to the App Store but it passes Xcode's Validation step.

In theory all you'll need is some way to advertise what processes are available and a memory buffer. Then you can copy the audio data directly between processes. Actually what happens is you create a CFDataRef and send it to the port. You then get a receive callback which copies the data into a new buffer.

Check the sample code for BackgroundExporter. If you can do this in iOS then you can open remote ports and talk between processes. It's very likely that this is how CoreMIDI actually implements the virtual ports. They simply have a dictionary that everyone registers against, each port has a UID and from there at the lower levels it knows what (mach) port to send to. Though this is likely implemented in kernel space in the CoreAudio driver because of tight timing, the IPC is still likely implemented with mach ports.
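
To make the CFMessagePort pattern above concrete, a minimal sketch (the port name and message ID are made up; whether a sandboxed iOS app can actually reach another app's port this way is exactly the open question):

#include <CoreFoundation/CoreFoundation.h>

// Receiver side: create a local named port and attach it to the run loop.
static CFDataRef PortCallback(CFMessagePortRef port, SInt32 msgid,
                              CFDataRef data, void *info) {
    // 'data' arrives as a copy; push it into your own buffer here.
    return NULL; // no reply
}

void SetUpReceiver(void) {
    CFMessagePortRef local =
        CFMessagePortCreateLocal(NULL, CFSTR("com.example.audio.rx"),
                                 PortCallback, NULL, NULL);
    CFRunLoopSourceRef source =
        CFMessagePortCreateRunLoopSource(NULL, local, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
}

// Sender side: look up the remote port by name and send a buffer.
void SendBuffer(const UInt8 *bytes, CFIndex length) {
    CFMessagePortRef remote =
        CFMessagePortCreateRemote(NULL, CFSTR("com.example.audio.rx"));
    if (remote) {
        CFDataRef payload = CFDataCreate(NULL, bytes, length);
        CFMessagePortSendRequest(remote, 1 /* msgid */, payload,
                                 1.0, 0.0, NULL, NULL);
        CFRelease(payload);
        CFRelease(remote);
    }
}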

nlogmusic (TempoRubato - NLogSynth)

Oct 20, 2011, 2:46:53 AM
to Open Music App Collaboration
GCD is actually in iOS since version 4, but again won't help much.

I think a socket/Mach port based approach would without question give
the required throughput. There are a couple of tasks to be
implemented:

- some init stuff to set up the basic infrastructure & ports, like
MIDIClientCreate, MIDIInputPortCreate, MIDIOutputPortCreate,
MIDIDestinationCreate, MIDISourceCreate
- a way for running & compatible apps to get acquainted with each
other, i.e. something like MIDIGetNumberOfSources/MIDIGetSource &
MIDIGetNumberOfDestinations/MIDIGetDestination
- a way to initiate & break a connection, like MIDIPortConnectSource,
MIDIPortDisconnectSource
- a way to send data, like MIDISend (BTW I wouldn't trust
MIDISendSysex here because it sends asynchronously; this needs to be
researched, but there is no problem sending SysEx with MIDISend
itself)
- a way to register a callback to receive data, as in
MIDIDestinationCreate & MIDIInputPortCreate

OK, you see where I'm heading: it's all already done for us in Core
MIDI. All we need to do is find out whether it is reusable, i.e.
whether it scales up for audio.
Apart from Core MIDI throughput, there is also the question of the
quality of its error handling and timing. In the best case, most of
Core MIDI runs at kernel level, where we are never allowed to go,
i.e. we won't beat Core MIDI anyhow. Or, if they are just using
user-callable APIs, we can do the same.
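
To make the list above concrete, a rough sketch of the setup and discovery sequence (error handling omitted; client and port names are placeholders):

#include <CoreMIDI/CoreMIDI.h>

static MIDIClientRef   client;
static MIDIPortRef     inPort, outPort;
static MIDIEndpointRef virtualDest;

static void ReadProc(const MIDIPacketList *pktlist, void *readProcRefCon,
                     void *srcConnRefCon) {
    // Incoming packets (including SysEx) arrive here.
}

void SetUpMIDI(void) {
    MIDIClientCreate(CFSTR("ExampleClient"), NULL, NULL, &client);
    MIDIInputPortCreate(client, CFSTR("In"), ReadProc, NULL, &inPort);
    MIDIOutputPortCreate(client, CFSTR("Out"), &outPort);
    MIDIDestinationCreate(client, CFSTR("Example Dest"), ReadProc, NULL,
                          &virtualDest);

    // Discovery: enumerate other apps' virtual sources and connect to them.
    ItemCount n = MIDIGetNumberOfSources();
    for (ItemCount i = 0; i < n; i++) {
        MIDIEndpointRef source = MIDIGetSource(i);
        MIDIPortConnectSource(inPort, source, NULL);
    }
}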

If somebody has time, we could do an abstraction and then different
implementations: one with sockets, one directly with Mach ports resp.
via CFMessagePort, and one with Core MIDI piggybacking.

If any Apple engineer is reading this, please comment. Maybe privately
by email ;-)



Michael (A Tasty Pixel - Loopy)

Nov 10, 2011, 6:21:48 AM
to Open Music App Collaboration
Just wondering, has any progress been made on this?

I'm very interested in the idea of (technical limitations
notwithstanding) developing a scheme to transport audio in real time
between apps (an idea I thought was excitingly original, till I found
you guys had beaten me to the punch ;-)). I love the idea of using
apps as effects filters, for example, or, say, recording the output
from one of the guitar amp apps into a live-looper app.

It seems CoreMIDI is quite well placed to transport PCM audio via
SysEx messages, given that the registration and transport systems are
already all there. Its use might be particularly beneficial when the
apps in question are already communicating via Core MIDI (for example,
an app could be sending audio with a delay filter, in time with the
app receiving the audio, via clock messages).

I'm thinking about putting together a couple of test apps to see how
it works, but I thought I'd check first to see if it's been done
already.


Rob Fielding

Nov 10, 2011, 7:03:35 AM
to open-music-app...@googlegroups.com, Open Music App Collaboration

I can't hide my disdain for in-app record and audio copy/paste, and it would be exciting news to see a proof of concept with this.

IMHO, this would be the only way to actually meet the real requirement of being able to bounce audio tracks correctly. It fits with background MIDI conceptually as well. My instrument is geared towards long improvisations, and recording little snippets misses the point completely.

I am currently balking at requests for ACP because that is a lot of work to get only part of what I really want, especially because the iOS memory model is to crash you when you are overbooked for memory. I also think it ridiculous that controllers not only embed a synth, but a mini track recorder as well, rather than communicating with apps that have this as their core competence.

Sent from my iPhone, which is why everything is misspelled. http://rfieldin.appspot.com
http://rrr00bb.blogspot.com

Michael Tyson

Nov 14, 2011, 2:03:13 PM
to open-music-app...@googlegroups.com
Righto, it sounds like it's not been done yet, then =)

I'm on it!


--
Michael Tyson | atastypixel.com
A Tasty Pixel: Artisan apps

aim: mikerusselltyson
twitter: MichaelTyson

nlogmusic (TempoRubato - NLogSynth)

Nov 15, 2011, 3:42:21 AM
to Open Music App Collaboration
Hi

there are two use cases I think we discussed:

1. A controller, DAW or sequencer app sends a synth or drum app MIDI
notes plus start/stop record. The synth app does the recording and
pushes the wave data to the pasteboard.

For this use case I have a beta which I have already given to two
other apps. If anyone is interested, just email me. I would really
like to see this going, and it could be released quickly.

2. Realtime audio streaming packets transported via Core MIDI SysEx

I have started some prototyping. If anyone else is doing the same or
a similar thing, I would be more than happy to share thoughts, first
experiences, etc.

Best
Rolf


Michael Tyson

Nov 15, 2011, 5:42:13 AM
to open-music-app...@googlegroups.com
Hi Rolf,

I'd definitely be interested to hear about your impressions thus far, regarding the realtime SysEx transport - how far have you gotten to date? Does it look like it might be viable?

If it does, I'd love to put together a library and invite app developers to start supporting it - I think it could be quite a significant move!

Cheers,
Michael


--
Michael Tyson | atastypixel.com
A Tasty Pixel: Artisan apps

aim: mikerusselltyson
twitter: MichaelTyson

Michael Tyson

Nov 26, 2011, 5:30:20 PM
to open-music-app...@googlegroups.com
I've just put together a library (which I'm calling iOS Audio Pipeline, at present) that implements realtime streaming audio transport over CoreMIDI virtual ports, and I'm seeing quite promising results!

There are a couple tricky bits to overcome, with respect to live audio, but despite that I'm quite confident the system's going to be a big step forward in iOS music app interoperability.

I've set up 3 sample apps that link against the library:

 - One which records from the mic, and sends the audio
 - One which receives the audio and plays it
 - And one which receives audio and passes it straight on, simulating a filter app

I've created a "live mode" for the receiver, which, when detecting more audio in the input queue than it needs to fill the output buffer, skips right to the most recent samples in the buffer. This keeps the latency to a minimum, but introduces occasional artefacts when the receiver lags behind the sender. More on that in a sec.

To profile the system, I first connected up the recorder, sending audio to the receiver.

Latency of the actual audio pathway with a 0.005s buffer length, measured by comparing the original audio timestamp as received within the render callback with the current time at the time of playback in the receiver, is about 0.014s: 0.005s input buffer duration + 0.004s overhead + 0.005s output buffer duration.

Next, I connected the recorder to the filter, and the filter to the receiver. The latency, measured the same way, was exactly the same for the entire 3-app path - 0.014s.

The audio timestamp is sent along with the audio itself, so that that latency can be entirely omitted if the audio destination (Loopy, for instance) is recording live audio being transmitted, which means you could play a synth app, record it from Loopy, and have everything perfectly in time.

That's the good news (and as far as I'm concerned, that's *seriously* good news!)!

The bad news (but I'm nowhere near done exploring yet) is that, in "live mode" (where old samples are dropped to keep latency down), there's a lot of stuttering when doing almost anything on the device, even innocuous things like scrolling in a table view. I have no idea how backgrounding is implemented in iOS, but I suspect that background apps are given a low execution priority, presumably except for the high-priority core audio thread.  This isn't a problem when not playing the audio at the end of the pipe live (for example, when recording in one app a performance from another instrument app, where the audio timestamp is sufficient to record the audio at the right time - I'm very optimistic about this scenario), but for live audio, a solution will need to be found before it's viable.

A side note, not of major importance but of interest: the audio pipeline only appears to operate correctly when sending to an app's virtual destination (that is, selecting the destination app from the source app, and not the other way around).  If connected the other way around (that is, selecting the source app from the destination app), a burst of about 1s of audio is transmitted every 15s or so, give or take, and that's all. Perhaps this is a limitation with the MIDIReceived mechanism?  (Or perhaps I messed something up, I dunno.)  I haven't seen any issues like this when implementing Loopy's MIDI clock sync, so perhaps there's some odd throughput limitation there.

So, I've got more testing to do and a few issues to address, but once it's finished, I'm considering making it available under similar terms to Sonoma's ACP API, then seeking other music app developers to launch with me, if anyone's actually interested - I'm rather excited about the tech, personally speaking!

This is what the interface looks like so far, using the PGMidi wrapper with my additions to make things easier:

@interface APAudioSender : NSObject
- (id)initWithDestinations:(NSArray*)destinations;
- (void)addDestination:(PGMidiDestination*)destination;
- (void)removeDestination:(PGMidiDestination*)destination;
- (BOOL)sendAudio:(const SInt16*)audio length:(NSUInteger)lengthInSamples numberOfChannels:(NSUInteger)channels forTime:(SInt64)hosttime;
@property (nonatomic, readonly) NSArray *destinations;
@end

@interface APAudioReceiver : NSObject <PGMidiSourceDelegate>
- (id)initWithSource:(PGMidiSource*)source;

// Asynchronously receive audio, e.g. from an input callback; 'liveAudio' flag keeps latency down by dropping samples if necessary
- (void)receiveAudio:(SInt16*)audio length:(NSUInteger*)ioLengthInSamples timestamp:(uint64_t*)timestamp;
- (void)receiveAudio:(SInt16*)audio length:(NSUInteger*)ioLengthInSamples timestamp:(uint64_t*)timestamp isLive:(BOOL)liveAudio;

@property (nonatomic, retain) PGMidiSource* source;
@property (nonatomic, retain) id<APAudioReceiverDelegate> delegate; // Set delegate to receive audio synchronously
@property (nonatomic, readonly) NSUInteger numberOfIncomingChannels;
@end

@protocol APAudioReceiverDelegate <NSObject>
- (void)audioReceiver:(APAudioReceiver*)receiver didReceiveAudio:(SInt16*)audio length:(NSUInteger)lengthInSamples atTime:(SInt64)hosttime;
@end
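
And, to sketch how it might be driven (hypothetical glue, not part of the library itself - just the declared methods called from Remote IO render callbacks):

#import <AudioToolbox/AudioToolbox.h>

static APAudioSender *sender;      // assume created elsewhere with the chosen destinations
static APAudioReceiver *receiver;  // assume created elsewhere with the chosen source

// Sender app: called from its render callback after producing 'samples'.
static void PushRenderedAudio(const SInt16 *samples, NSUInteger frames,
                              const AudioTimeStamp *ts) {
    [sender sendAudio:samples length:frames numberOfChannels:2
              forTime:(SInt64)ts->mHostTime];
}

// Receiver app: called from its render callback to fill 'out' with live audio.
static void PullLiveAudio(SInt16 *out, NSUInteger frames) {
    uint64_t timestamp = 0;
    [receiver receiveAudio:out length:&frames timestamp:&timestamp isLive:YES];
}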










-- 
Michael Tyson | atastypixel.com
A Tasty Pixel: Artisan apps

Loopy HD has been released! Savvy, tactile live looping on the iPad.

Find us on Facebook, and Twitter
Subscribe to our newsletter

aim: mikerusselltyson
twitter: MichaelTyson

Support (One Red Dog)

Nov 27, 2011, 7:46:12 AM
to open-music-app...@googlegroups.com

>
> That's the good news (and as far as I'm concerned, that's *seriously* good news!)!
>

Very cool!

> The bad news (but I'm nowhere near done exploring yet) is that, in "live mode" (where old samples are dropped to keep latency down), there's a lot of stuttering when doing almost anything on the device, even innocuous things like scrolling in a table view. I have no idea how backgrounding is implemented in iOS, but I suspect that background apps are given a low execution priority, presumably except for the high-priority core audio thread. This isn't a problem when not playing the audio at the end of the pipe live (for example, when recording in one app a performance from another instrument app, where the audio timestamp is sufficient to record the audio at the right time - I'm very optimistic about this scenario), but for live audio, a solution will need to be found before it's viable.
>

IMHO virtual MIDI ports are implemented on top of mach_ports (I use these to pass messages between the audio thread and UI main thread) and the memory buffers are malloc'ed. I think you'll have to buffer the audio at the receiving thread (something like CARingBuffer) and then pass it off to the audio render proc. If you're copying the audio from the MIDI receive callback straight into the audio render proc, I think you'll have dropped audio. I'm also not sure what would happen if you overly delay the MIDIReadProc. Of course if you're already doing this then please disregard my thread hijack.
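
Something along these lines, say (a bare-bones single-producer/single-consumer ring buffer sketch; a real one, CARingBuffer or otherwise, would add proper memory barriers):

#include <stdint.h>

#define RING_CAPACITY 32768  /* frames, power of two; size is illustrative */

typedef struct {
    int16_t  buffer[RING_CAPACITY];
    volatile uint32_t head;  /* written by producer (MIDI read proc) */
    volatile uint32_t tail;  /* written by consumer (render proc) */
} AudioRing;

/* Producer: called from the MIDI receive callback with decoded samples. */
static void RingWrite(AudioRing *r, const int16_t *src, uint32_t frames) {
    for (uint32_t i = 0; i < frames; i++) {
        uint32_t next = (r->head + 1) % RING_CAPACITY;
        if (next == r->tail) break;        /* full: drop instead of blocking */
        r->buffer[r->head] = src[i];
        r->head = next;
    }
}

/* Consumer: called from the render proc; returns the number of frames read. */
static uint32_t RingRead(AudioRing *r, int16_t *dst, uint32_t frames) {
    uint32_t n = 0;
    while (n < frames && r->tail != r->head) {
        dst[n++] = r->buffer[r->tail];
        r->tail = (r->tail + 1) % RING_CAPACITY;
    }
    return n;
}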

AUHAL and aggregate devices would be ideal

cheers
peter

Michael Tyson

Nov 27, 2011, 9:25:32 AM
to open-music-app...@googlegroups.com
Cheers, Peter =)

Actually, I already am using a ring buffer to store the incoming audio, which is then drained by the render thread. It occurs to me that my prior theory is totally wrong, though - the holdup isn't at the sender's end, it's on the receiver, as it has to skip buffers to keep the latency low. Turning live mode off to avoid skipping buffers prevents the glitching, but causes big latency issues.

I'm just tweaking the receiver now, trying to figure out where the bottleneck is; I was using a GCD queue to do the processing of the incoming MIDI packets, which might be lagging behind when things get busy. I'm getting rid of that double-handling and moving the processing straight into the buffer drain routine, which may help.

Michael Tyson

Nov 27, 2011, 1:39:31 PM
to open-music-app...@googlegroups.com
Lovely - that's totally fixed it!  Live audio now coming through smoothly and with nice low latency.

Support (One Red Dog)

Nov 27, 2011, 3:36:46 PM
to open-music-app...@googlegroups.com


Awesome. Yeah GCD isn't real-time, so a thread context switch, lock, malloc, or some other system call to manage the queue may cause the audio render proc to miss the deadline.

Another option may be to use <sys/shm.h> and shmget() 
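
Roughly this call sequence, for reference (whether a sandboxed iOS app is allowed to do this is an open question; the key value is arbitrary):

#include <sys/ipc.h>
#include <sys/shm.h>
#include <stddef.h>

/* Create (or find) a System V shared memory segment and map it in. */
void *MapSharedAudioBuffer(size_t bytes) {
    key_t key = 0x4155444F;                        /* arbitrary shared key */
    int id = shmget(key, bytes, IPC_CREAT | 0666); /* create or look up segment */
    if (id == -1) return NULL;
    void *mem = shmat(id, NULL, 0);                /* attach it to this process */
    return (mem == (void *)-1) ? NULL : mem;
}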

nlogmusic (TempoRubato - NLogSynth)

Nov 28, 2011, 3:45:00 AM
to Open Music App Collaboration
That's great news!

If there is anything we can test, just let us know!

Also, it would be interesting to check with Apple's review process
whether they have anything against it. So, before doing too much
functional development on it, it might be wise to put something into
review early to find out. You can even hold back the App Store
release.

Anyway, cheers to this great stuff!

Christopher Randall

Nov 28, 2011, 9:58:12 AM
to open-music-app...@googlegroups.com
I was going to suggest that very thing. This is just the sort of thing that Apple would get their panties in a twist about. Best to find out sooner rather than later if that's gonna be the case.

Chris Randall
Audio Damage, Inc.
http://www.audiodamage.com

Ben [Camel Audio]

Nov 28, 2011, 12:38:39 PM
to Open Music App Collaboration
Hello everyone - I've just joined the list. A big thank you to those
of you involved with developing virtual MIDI (which we support in
Alchemy Mobile). Thanks and congrats to Michael for his work on
getting a live audio streaming proof of concept working! If Apple are
OK with doing live audio streaming in this way (and I agree with Rolf
and Chris that submitting a trial app to the App Store is a good way
to test), we'd love to support this in a future Alchemy update.

Cheers
Ben

Michael Tyson

Nov 28, 2011, 1:44:37 PM
to open-music-app...@googlegroups.com
A very good idea! I'll see what I can do.

Does anyone know if there's a way to actually directly ask the review team questions (like, "is this okay"), or do we truly have to go through the entire app submission pantomime to test acceptability? If so, it seems rather inefficient =)


--
Michael Tyson | atastypixel.com
A Tasty Pixel: Artisan apps

aim: mikerusselltyson
twitter: MichaelTyson

Christopher Randall

Nov 28, 2011, 1:48:40 PM
to open-music-app...@googlegroups.com
I would speak to Sebastian @ Audanika. He has more experience with getting gummed up in the App Review process than any of us, I would wager. ;-) I personally am 7 for 7 with initial submissions getting accepted (knock on wood), so I've never had reason to ask anything, and thus have no contact info.

Chris Randall
Audio Damage, Inc.
http://www.audiodamage.com

Jesse Chappell

Nov 29, 2011, 9:14:53 AM
to open-music-app...@googlegroups.com
I wouldn't attempt to ask permission; this sounds like the kind of
gray area that some of their technical staff would probably say is
against the rules. I'm guessing that even our use of virtual MIDI for
MIDI would throw up some flags, if they really paid attention to it.
Better to preemptively win by community acceptance than to ask
permission.

It would be great if you put the code up into a git repository somewhere!

jlc

Sebastian Dittmann

Nov 29, 2011, 9:51:40 AM
to open-music-app...@googlegroups.com
I have to admit that I do not fully grasp the technical hurdles that have to be overcome for this, but I am willing to contact my totally secret sources within Apple to figure out what chance this has of going through.

Is the way this is going to be accomplished going to involve any closed-source third-party libraries? How is the behavior of apps running in the background going to be changed? These are the questions I'd probably have to address.

My skype nick is sebastian.dittmann - if Michael (or anyone else) wants to contact me about this.

Best,

Sebastian

Michael Tyson

Nov 29, 2011, 10:42:08 AM
to open-music-app...@googlegroups.com
Hey Jesse =)

Do you really think so? It all seems eminently innocuous to me - but it probably can't hurt to be a little cautious.

I'll most definitely git this baby up, in a couple days (or possibly sooner) once I've ironed out some more kinks.

Michael Tyson

Nov 29, 2011, 10:42:12 AM
to open-music-app...@googlegroups.com
That's a very kind offer, Sebastian! If they're the friendly, discretion-having kind of secret sources, that would be most helpful.

There'll be no closed source third party things - it's going to be an open library (hosted on GitHub), which will build as a static lib that can be included in the host project. It includes PGMidi (with a few of my own improvements), and a couple of classes (APAudioSender and APAudioReceiver), which make calls to the PGMidi interface (which just uses the standard Core MIDI API, nothing extra).

As far as background behaviour goes, it won't be any different to the way that apps with MIDI sync (like MoDrum, Bassline, soon-to-be Loopy, etc.) work. The only potentially funny part would be apps that *only* act as an audio filter for other apps (and don't actually create or play back audio themselves), as they need to (arguably spuriously) request background audio and keep an active audio session in order to continue to run in the background. This *may* be problematic.
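
For reference, the background-audio plumbing such a filter-only app would need is roughly this (standard public API; the playback category is an assumption, whatever category fits the app would do):

#import <AVFoundation/AVFoundation.h>

// Info.plist also needs the "audio" background mode:
//   <key>UIBackgroundModes</key>
//   <array><string>audio</string></array>

void KeepAudioSessionActive(void) {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:NULL];
    [session setActive:YES error:NULL];
}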

If you don't mind waiting a couple of days for me to pop the source up on GitHub along with a couple of sample apps, then it could be scrutinised directly for kosher-ness.

I really don't think there'll be any issues with it, for the most part - I'm not doing anything at all unusual, or outside the public API, and as far as transporting audio data over SysEx messages goes - that's what SysEx was *designed* for, among other things =)

Christopher Randall

Nov 29, 2011, 10:57:07 AM
to open-music-app...@googlegroups.com
My upcoming sequencer app doesn't generate any audio at all; it is only a MIDI sequencer. However, I use a CoreAudio record/playback loop for timing since this is the only way to get a rock solid clock on an iOS device that doesn't get superseded by UI events, best I can tell. Will my app affect this system at all? Or will it affect my app?

I do know that the CoreAudio implementation makes my app something of a CPU pig, but I can still drive two synths on an iPad 1, as long as one of them isn't AniMoog. Being a Pro Synth really takes a lot of CPU, I guess. (NLog is fine if the effects are off, as one might expect.)

Also, Michael, that link you gave me the other day for your altered PGMidi thing, it unzips into the dreaded cpgz loop, and none of the normal tricks can get it out of the loop. (I even tried a cpgz utility on my PC, and it says the archive is corrupted.)

Chris Randall
Audio Damage, Inc.
http://www.audiodamage.com

Michael Tyson

Nov 29, 2011, 12:33:45 PM
to open-music-app...@googlegroups.com
Oh, that's interesting - if Apple's okay with that, they're probably going to be okay with a filter app doing the same thing.

No, no impact on your app unless you use the library =)

Whoops - sorry about that. My dodgy wifi connection died while I was uploading it and I couldn't get it back. Try again now: http://resources.atastypixel.com/PGMidi+TPAdditions.zip

Rob Fielding

Nov 29, 2011, 1:31:57 PM
to open-music-app...@googlegroups.com
When this idea was first being kicked around, it seemed like it was
just a crazy abuse of available resources. Now I have come around to
the opposite idea, which Apple would approve of if they viewed it this
way:

When describing it to Apple, it's probably best to portray this as
simply audio sharing with *very* excellent metadata about the audio.
If you are going to shoot audio between apps, then it's of negligible
cost to also embed a MIDI transcript of what the audio is.

I'm currently working on a very portable (i.e. no external references
at all) high-level fretless MIDI API that is essentially a MIDI stream
generator. It would actually make a whole lot of sense if it worked by
submitting audio buffers interleaved with MIDI messages. You can
dispense with attempts to analyze the signal in a lot of cases if you
are simply given a high-level description along with it.

--
http://rrr00bb.blogspot.com

Dave [WhiteNoiseAudio]

Nov 29, 2011, 4:04:15 PM
to Open Music App Collaboration
Chris, I would recommend separating your MIDI stuff to run in a
separate thread. I think I read somewhere that doing MIDI stuff in
the audio thread is a no-no.


Christopher Randall

Nov 29, 2011, 4:06:58 PM
to open-music-app...@googlegroups.com
Why on earth would that be the case? Obviously, you don't want to make Core MIDI calls directly in an audio thread. That'd be stupid. But there's no particular reason not to tell another process to send notes or something.

Chris Randall
Audio Damage, Inc.
http://www.audiodamage.com

Sebastian Dittmann

Nov 30, 2011, 9:23:00 AM
to open-music-app...@googlegroups.com
Just an update:

I should know in about a week what Apple says about this audio pipeline library. I can't 100% guarantee that App Review is going to approve it, because nobody knows what App Review is doing these days, but at least we'll have some sort of clarification from within the company as to whether they like it or not. :)

Best,

Sebastian

Sebastian Dittmann

Dec 2, 2011, 5:30:04 AM
to open-music-app...@googlegroups.com
I've gotten a message back from my sources within Apple and it looks like this could actually work and be approved. No guarantee though but definitely not a clear NO. :)

nlogmusic (TempoRubato - NLogSynth)

Dec 2, 2011, 5:52:52 AM
to Open Music App Collaboration
Cheers! That's getting big!

Michael Tyson

Dec 2, 2011, 6:34:46 AM
to open-music-app...@googlegroups.com
Nice, cheers, Sebastian =)