OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI

OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI nlogmusic (TempoRubato - NLogSynth) 11/3/11 12:47 AM
Hi

I'd like to sum up here what iOS DAWs need to do to have fun with synth
apps via virtual Core MIDI. FL Studio & MusicStudio have already
implemented some part of it. I am not sure whether this was
intentional or a by-product, since their intention was probably more
about real external hardware control, as their UI naming suggests, but
it also works for virtual Core MIDI synth apps running in the
background.
This video shows it all:

http://www.youtube.com/watch?v=IRxfLOdS1HE

Ok, here comes the list.

MUST HAVE FEATURES:

1. Implement a MIDI Out option for your instrument tracks
2. Provide a way to mute or at least silence your own virtual instrument
3. Provide a Core MIDI control panel to activate MIDI Out
4. Dynamically update when new Core MIDI devices are found
5. Provide a way to map different tracks sending MIDI Out to different MIDI devices, either by MIDI channel selection or MIDI device assignment per track
6. Send MIDI Note Off messages when your DAW transport stops
7. Implement background audio for your DAW so that it keeps running in the background when switching to the synth apps
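Point 6 deserves special care, since a background synth has no other way to recover from hanging notes. As a rough sketch (not from any of the apps mentioned), here is one way to build the raw MIDI bytes a DAW could flush on transport stop — CC 123 ("All Notes Off") per channel is the cheap version, explicit Note Off per sounding note the robust one:

```python
def all_notes_off(channel: int) -> bytes:
    """Control Change 123 with value 0 on the given channel (0-15)."""
    return bytes([0xB0 | (channel & 0x0F), 123, 0])

def note_off(channel: int, note: int, velocity: int = 0) -> bytes:
    """Explicit Note Off (status 0x8n) for one sounding note."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# On transport stop, flush All Notes Off on every channel:
stop_flush = b"".join(all_notes_off(ch) for ch in range(16))
```

Some synths ignore CC 123, which is why tracking sounding notes and sending explicit Note Offs is the safer fallback.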

GOING FURTHER AND BECOMING BEST IN CLASS:

8. Implement a pure MIDI track type (instead of silencing your own instruments; saves CPU)
9. Send MIDI transport messages Start, Stop & Continue
10. Send MIDI Song Position Pointer messages
11. Have an option to send MIDI clock messages for synced devices
12. Send MIDI Vol & Pan CCs when the user operates Vol & Pan controls in your UI
13. Implement further automation which can be mapped to MIDI CCs and sent to the synth apps for filter sweeps etc.
14. Have a little MIDI Program select panel per track, save the setting in your project file and send it to the synth apps
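For points 9-11, the wire format is tiny: single-byte realtime messages plus Song Position Pointer. A minimal sketch of the bytes involved (standard MIDI 1.0, nothing app-specific): clock runs at 24 pulses per quarter note, and SPP counts "MIDI beats" (sixteenth notes) as a 14-bit value split into two 7-bit bytes, LSB first.

```python
# System realtime status bytes (each is a complete one-byte message):
MIDI_CLOCK, MIDI_START, MIDI_CONTINUE, MIDI_STOP = 0xF8, 0xFA, 0xFB, 0xFC

def song_position(sixteenths: int) -> bytes:
    """0xF2 followed by the 14-bit position in sixteenth notes, LSB first."""
    return bytes([0xF2, sixteenths & 0x7F, (sixteenths >> 7) & 0x7F])

def clock_interval_seconds(bpm: float) -> float:
    """Seconds between 0xF8 clock ticks at the given tempo (24 PPQN)."""
    return 60.0 / (bpm * 24.0)
```

So at 120 BPM a clock master sends a 0xF8 roughly every 20.8 ms; the timing accuracy of those ticks matters far more than the bytes themselves.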

BE INNOVATIVE AND BUILD THE INTERSTELLAR iOS DAW:

15. Implement automatic audio copy & paste for track freezing and final audio mix-down:

This is described in detail further down in the discussion here:
http://groups.google.com/group/open-music-app-collaboration/browse_thread/thread/280a88fd25c17e6c

16. Engage in the OMAC group here, bring your own ideas, have fun!


Cheers
Rolf
Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI Paul Slocum (SOFTOFT TECHECH) 11/3/11 7:35 AM
Which ones currently support all or some of these features?

-paul

Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI Paul Slocum (SOFTOFT TECHECH) 11/3/11 10:15 AM
Anyone tried virtual MIDI with Sample Lab or Genome Sequencer? I've
tried Nanostudio and Little MIDI Machine, and neither of them appears
to support virtual MIDI.

-paul

Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI Aaron Pride (Sample Lab) 11/3/11 10:26 AM

Sample Lab supports most of what Rolf mentioned. Open the config window to toggle virtual MIDI output, and channels and octaves for each track. MIDI was added after the initial design, so it does not have all the MIDI stuff I would like (CCs, chords etc.), but it sends clock sync and note on/off/velocity... just the basics. Devs that want a promo code for testing, shoot me an email at sup...@samplelabapp.com.

Sample Lab only does virtual MIDI output, so we do not explicitly create a port, just connect to other apps' open ports. We wanted to avoid making the user connect everything app-by-app by hand... just picking MIDI channels seems cleaner and more consistent with traditional MIDI hardware use.

@Rolf: what MIDI signals are you expecting for
- start recording (MMC Start?)
- stop recording (MMC Stop?)
- copy recording to clipboard
We have another update to release and it should be pretty easy to support auto copy & paste bounce.

Also, is this in your most recent update? Appreciate the channel selection option... most people are not supporting this yet.
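The channel-per-track idea is easy to picture in code. A rough sketch (assumed names, not Sample Lab's actual implementation): each track owns a MIDI channel, and the track's outgoing events get the low nibble of their status byte rewritten accordingly before being sent to whichever app listens on that channel.

```python
def retarget_channel(msg: bytes, channel: int) -> bytes:
    """Rewrite the channel nibble of a channel-voice message (0x8n-0xEn)."""
    status = msg[0]
    if 0x80 <= status <= 0xEF:
        return bytes([(status & 0xF0) | (channel & 0x0F)]) + msg[1:]
    return msg  # system messages (0xFn) carry no channel

track_channels = {"drums": 9, "bass": 1}   # hypothetical per-track mapping
note_on = bytes([0x90, 60, 100])           # Note On, channel 0
routed = retarget_channel(note_on, track_channels["drums"])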

Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI Amos 11/3/11 10:49 AM
Genome has a very rich MIDI setup section and transmits MIDI to/from other apps on the same iPad rather nicely.  The word 'virtual' doesn't appear, but I think it's doing what you're asking...
Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI Nic G (Audeonic Apps) 11/3/11 11:03 AM
<WARNING: utterly_shameless_plug>

If an app supports a CoreMIDI network port then it can also be used
with virtual MIDI:

http://j.mp/rVKpfh

(link goes to iDesignSound.com - see comments section)

Regards, Nic.

Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI nlogmusic (TempoRubato - NLogSynth) 11/3/11 11:27 AM
@Aaron, in regards to the record & pasteboard stuff:

It's not in the public NLog release, but I have a beta I can provide
for testing. It would be great if you could put it in your next
update! I am planning to do this too, still this year, for release.

In terms of protocol, I actually haven't thought about using MMC, but
implemented my own sysex scheme. However, I will look into MMC to see
if it does the job as well. Good hint!

The current sysex scheme is described here:
http://groups.google.com/group/open-music-app-collaboration/browse_thread/thread/280a88fd25c17e6c

If you are interested in the beta, just email me your device IDs. I am
currently already preparing one for Hamilton from MultiTrack DAW.
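For readers who haven't followed the linked thread: Rolf's actual sysex scheme is defined there, not here. Purely as an illustration of the general shape such a scheme takes, here is a hypothetical framing — a SysEx message (0xF0 ... 0xF7) under the non-commercial/prototyping manufacturer ID 0x7D, with made-up command bytes per transport action:

```python
# Hypothetical command bytes -- NOT Rolf's real scheme:
CMD_START_RECORD, CMD_STOP_RECORD, CMD_COPY_TO_PASTEBOARD = 0x01, 0x02, 0x03

def sysex_command(command: int) -> bytes:
    """Frame a one-byte command as SysEx under the prototyping ID 0x7D."""
    return bytes([0xF0, 0x7D, command & 0x7F, 0xF7])
```

MMC would replace the made-up command bytes with the standardized MMC sub-IDs, which is presumably why Aaron suggested it.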


Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI Dave [WhiteNoiseAudio] 11/4/11 11:04 AM
Genome is adding support for listening to Song Position / Song
Continue messages in the next version, along with program change
messages. It already implements a lot of what you mention; more stuff
like Pan / Volume in a mixer-type interface is something I'd like to
add in the future. All this is nice, but right now there aren't a
whole lot of apps that support all these messages yet, so I'm not sure
I would call them 'must have' features.

-Dave


Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI Dave [WhiteNoiseAudio] 11/4/11 11:11 AM
Genome does support 'virtual MIDI', however there is an issue with
Genome's output port in the current version (it's not sending MIDI).
So Genome can communicate with any app that has its own port, but
Genome's port will appear to do nothing. As of now, there are only a
couple of apps that don't have their own virtual ports. This is fixed
in the next version, which I am finishing up in the next couple of days.

Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI denis woods 11/6/11 1:52 AM
As with all the above suggestions, supporting MIDI clock in with song
position as well as MIDI clock out is a must-have too. Genome will
have this, which is great.

Also, some kind of universal standard to render internal VIs in iOS
DAWs to audio and export them would be on my wish list.
Dropbox is a great option for this. Many pro composers use iPads and
need a simple way to export audio and MIDI out to Pro Tools/Logic/
Cubase. Apart from making my life easier, it should also be a revenue
source for developers.

My 2 cents

Denis

Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI Gabriel Gatzsche (Audanika) 11/7/11 3:50 AM
An iOS DAW should also support the following configuration:

* A synthesizer like NLog runs in the background
* The DAW runs in the background and records MIDI
* A MIDI controller app like SoundPrism or Geo Synth runs in the foreground. It generates MIDI events that are sent to the synthesizer (audio generation) as well as to the DAW (MIDI recording)

Gabriel
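The configuration above is essentially a fan-out: one controller, two receivers. A minimal sketch (assumed names, not from any of the apps mentioned) of what that looks like on the sending side:

```python
from typing import Callable, List

MidiSink = Callable[[bytes], None]

def make_fanout(sinks: List[MidiSink]) -> MidiSink:
    """Return a sender that forwards each MIDI message to every sink."""
    def send(msg: bytes) -> None:
        for sink in sinks:
            sink(msg)
    return send

# Stand-ins for the synth's and the DAW's virtual MIDI input ports:
synth_log, daw_log = [], []
controller_out = make_fanout([synth_log.append, daw_log.append])
controller_out(bytes([0x90, 64, 90]))   # one touch, two receivers
```

With Core MIDI the same effect falls out of connecting the controller's output to both apps' virtual destinations; the point is that neither receiver needs to know about the other.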


Re: OMAC 2.0: What iOS DAWs need to implement to have fun with synth apps via virtual Core MIDI nlogmusic (TempoRubato - NLogSynth) 11/8/11 5:32 AM
Point 15 is about VI audio rendering into the iOS pasteboard. We are
currently working on some betas here. For external export there are
many options in addition to Dropbox, like SoundCloud etc.

Point 11 was meant with MIDI clock in & out in mind. There are a
couple of apps already doing this, especially the drum boxes, but
DAW-style apps should do it too, as should any app with a notion of
'tempo'; NLog, for example, receives clock for arpeggio sync.
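On the receiving side, syncing an arpeggiator to incoming clock mostly means recovering the tempo from tick spacing. A rough sketch (not NLog's actual code) of how a synth might estimate BPM from the timestamps of incoming 0xF8 ticks, given that MIDI clock is 24 ticks per quarter note:

```python
def bpm_from_tick_times(tick_times: list) -> float:
    """Estimate BPM from timestamps (seconds) of consecutive 0xF8 ticks."""
    if len(tick_times) < 2:
        raise ValueError("need at least two ticks")
    avg = (tick_times[-1] - tick_times[0]) / (len(tick_times) - 1)
    return 60.0 / (avg * 24.0)

# 25 ticks spaced exactly as a 120 BPM master would send them:
ticks = [i * (60.0 / (120 * 24)) for i in range(25)]
```

In practice the incoming ticks jitter, so a real implementation would smooth the estimate over a window rather than trust individual intervals.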