Little Open Letter to MMA

nlogmusic (TempoRubato - NLogSynth)

Apr 18, 2012, 2:36:46 AM
to open-music-app...@googlegroups.com
Dear MMA,

this is a little open letter from somebody who loves the MIDI standard and cares about its future. Please excuse any provocative or rant-like statements; it's early morning here and I've only just had my morning coffee.

There was just a discussion about SysEx in this group, which I'd like to take as motivation to write down some thoughts about MIDI and the MMA. Ironically enough, this group, mostly centered on virtual MIDI functionality in iOS apps, was started by myself after I tried very naively to contact the MMA. My thought was - without having any experience with the MMA - that the MMA would be interested in and care about exciting future developments and usage of MIDI on iOS devices. So, I contacted somebody I knew from a hardware company which had some role in the MMA. I never got an answer, and that became a key reason to start this group and write the little manifesto. Happily enough, it worked out well and served its purpose.

Still, the question "What's the future of MIDI?" remains. Or, to put it a bit smaller but more provocatively: "Is there any real advancement of MIDI apart from defining ringtone standards?"

Look, ringtone standards meant nothing to MIDI. But if you look at the sales figures for MIDI interfaces across all kinds of platforms, it becomes very clear which platform is currently driving MIDI: iOS! I have no idea what the MMA is discussing internally, so I may be completely wrong. But keeping any discussion about the future of MIDI (if there is one) purely internal is simply the wrong approach, because innovation most of the time comes from outside.

I am very happy and honored that some people from the MMA have eventually joined and are following our small group here. However, still in my naivety, I believe the MMA should care about the future of MIDI. IMHO this future is not coming from traditional hardware manufacturers, and I am very aware that small-scale iOS developers are aliens in the world of the musical instrument industry. We are not even an industry. And guess what, I like this fact.

Take this little example: applying for a manufacturer ID from the MMA costs 200 USD - oh, not just once, every year! For a well-funded hardware manufacturer this is just the average cost of an internal one-hour business meeting with three people and an unclear result. For an indie iOS developer, 200 USD yearly means a lot. But for what value? IDs are needed, but even Apple does not charge you for applying for an Audio Unit vendor ID. It's just free.

I'm fine if the MMA decides to restrict its role to keeping the MIDI standard which we all love and respect for what it has been for many years. We will find - and already have found - other ways to innovate. But naive as I am, I'd like to reach out my hand to the MMA to find a way to discuss possible future advancements of MIDI more seriously and effectively. And I am sure that others from this group would be interested in contributing their input and help as well. We may be indie companies, but we are not hobbyists. We will soon reach the point (if we are not already there) where collectively more MIDI messages are sent on iOS devices than on all hardware gear. The 'industry' is changing.

Best regards and greetings from the iOS world
Rolf

Gabriel Gatzsche

Apr 18, 2012, 3:23:17 AM
to open-music-app...@googlegroups.com
Hi Rolf,
I agree with you on all points. From your mail I take away two main issues:

1.) The MMA standardization activities have to be reconsidered to address development in the iOS world.
2.) The process of manufacturer ID provisioning must be made easier, and iOS developers should not be charged for using the IDs.

Regarding the second point, I propose that the MMA provides one manufacturer ID that is used by all iOS apps.
Additionally, the SysEx standard is extended for iOS devices: besides the iOS SysEx manufacturer ID, a unique application ID is transmitted. The app ID is generated using the mechanisms provided by iOS.
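
A minimal sketch of what such a frame could look like, in C. The 0x7D manufacturer ID below is only the "non-commercial use" placeholder byte, not an ID the MMA has assigned for this purpose, and the app ID is assumed to be the app's bundle identifier in 7-bit ASCII:

#include <stdint.h>
#include <string.h>

/* Sketch: build F0 <mfr-id> <len> <bundle-id bytes> <payload> F7. */
size_t build_app_sysex(uint8_t *out, size_t cap, const char *bundle_id,
                       const uint8_t *payload, size_t payload_len)
{
    size_t n = strlen(bundle_id);
    if (n > 127 || 4 + n + payload_len > cap) return 0;
    size_t i = 0;
    out[i++] = 0xF0;                 /* start of SysEx */
    out[i++] = 0x7D;                 /* shared manufacturer ID (placeholder) */
    out[i++] = (uint8_t)n;           /* Pascal-style length of the app ID */
    while (*bundle_id)
        out[i++] = (uint8_t)(*bundle_id++ & 0x7F);  /* 7-bit ASCII only */
    for (size_t k = 0; k < payload_len; k++)
        out[i++] = payload[k] & 0x7F;               /* data must be 7-bit */
    out[i++] = 0xF7;                 /* end of SysEx */
    return i;
}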

Best,
Gabriel

sup...@onereddog.com.au

Apr 18, 2012, 4:47:58 AM
to open-music-app...@googlegroups.com
Hi all

> 1.) The MMA standardization activities have to be reconsidered to address development in the iOS world.

I'd really like to know what the MMA standardisation activities are. I mean, the MIDI spec hasn't changed for years; are there new things in the pipeline? Would you have to somehow join the MMA to actually get access to this information? In what way can we influence the MIDI spec towards iOS, if indeed that needs to be the case?

> 2.) The process of manufacturer ID provisioning must be made easier, and iOS developers should not be charged for using the IDs.
>
> Regarding the second point, I propose that the MMA provides one manufacturer ID that is used by all iOS apps.
> Additionally, the SysEx standard is extended for iOS devices: besides the iOS SysEx manufacturer ID, a unique application ID is transmitted. The app ID is generated using the mechanisms provided by iOS.

Why not use the Emagic manufacturer ID, which I think is 0x00, 0x20, 0x31, or possibly the Apple manufacturer ID? If you then simply add the app Bundle ID as the first string, in Pascal format (i.e. length + characters), you can determine that the SysEx is for your app. You still have to test the first few bytes, but at least you don't have to examine the full data: if it's not the Emagic or "iOS" ID you can skip straight to the 0xF7. If it is, and you're handling SysEx, you still have to extract the string and do a compare (true 7-bit ASCII is your friend). Even with TCP/IP (OSC is typically implemented on top of TCP), something somewhere in the stack has to do a test. That's fine on reception, but the opposite direction is not: if you then send your SysEx to another Emagic/Apple/iOS device, be it external or inter-app, you are assuming the receiver is following the same rules. With other protocols like TCP/IP that is implied by the stack, but in MIDI it is not. Hence the need for a unique manufacturer ID.
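
On the receive side, the test could look like this minimal C sketch (same placeholder scheme as in the earlier sketch: one shared ID byte, then the bundle ID as a Pascal-format string):

#include <stdbool.h>
#include <stdint.h>
#include <string.h>

/* Returns true if the SysEx buffer (starting at 0xF0) is addressed to
   bundle_id; a receiver that gets false can just skip to the trailing
   0xF7. 0x7D is again only a stand-in manufacturer ID. */
bool sysex_is_for_app(const uint8_t *msg, size_t len, const char *bundle_id)
{
    if (len < 4 || msg[0] != 0xF0 || msg[1] != 0x7D) return false;
    size_t n = msg[2];                /* Pascal-style length byte */
    if (3 + n >= len) return false;   /* must leave room for the 0xF7 */
    return n == strlen(bundle_id) &&
           memcmp(msg + 3, bundle_id, n) == 0;
}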

cheers
peter

Rob.fielding

Apr 18, 2012, 9:32:08 AM
to open-music-app...@googlegroups.com, open-music-app...@googlegroups.com
i second the idea that we are critical because we care about it. i would like to add these general thoughts:

1) the bend, note-tie, and channeling hacks required for correct pitch handling were not a big deal when everything was a discrete-key hardware keyboard. the problem is critical on a touchscreen.

2) in keeping with the opengl analogy, the synthesizer end of things will end up with a shader language so that patch exchange between apps can work well. that language will probably be Pd or perhaps CSound. if there were only a few engines that took libpd/csound patches and understood non-trivial midi, then they would free up controller developers to focus on what they do best.

3) it is crazy that every developer of every app on ios writes: controller, timbre generator, signal processor, mini-daw. midi is supposed to make it feasible that i can make a great midi controller app, and that things are so easy to set up for the end user that writing an internal sound engine is not required without a great reason. i had to write an internal engine because i had to do so many hacks over the midi protocol to get correct pitch/poly/legato handling that only a few synths can be made to "just work". we get comments in our ratings about how our app is broken, because common omni behavior is to simply substitute zero over all channels (which breaks pitch bending). setting bend width and making a span of identical patches to accommodate it is much like vcr programming. i can't tell users how to do it, because it is different on every device, and they can't figure it out from their own docs. (i have a korg karma, on which the setup is doable, but ridiculous).

4) something needs to be done about this setup pain. each side needs to announce what it is capable of, and exchange info on what they decided to do. this is why exchanging specific vendor and device ids is a bad idea.

it sounds like a totally new protocol. but being a fork of midi is better than abandoning it entirely, assuming that it degrades gracefully (at least the pitches come out correct, etc).

Tom White (MMA)

Apr 18, 2012, 3:54:14 PM
to open-music-app...@googlegroups.com
Hi Rolf,

I'm responding to your Open Letter out-of-sequence because
I didn't actually get the original for some reason (maybe
it went to my Junk folder <g>) and it took me a while to get
through all the emails I did get and come back to yours...

> I'd like to take as motivation to write down some thoughts
> about MIDI and the MMA.

I welcome that...

> Ironically enough, this group, mostly
> centered on virtual MIDI functionality in iOS apps, was
> started by myself after I tried very naively to contact the
> MMA. My thought was - without having any experience with the
> MMA - that the MMA would be interested in and care about
> exciting future developments and usage of MIDI on iOS
> devices.

We do...

> So, I contacted somebody I knew from a hardware
> company which had some role in the MMA.
> I never got an answer, and that became a key reason to start
> this group and write the little manifesto. Happily enough,
> it worked out well and served its purpose.

Interesting... I was not aware that you contacted someone and
did not get an answer.

Of course, whether you were correct to expect an answer from
that person depends on who it was; and why you didn't get an
answer could be for any number of reasons other than "they
didn't care" (which I'm inferring is what you concluded).

In fact, the reason I joined this group is that I was told
about it by MMA members who thought it might have valuable
input for MMA...

So I apologize for you having trouble reaching MMA.

> "Is there any real advancement of MIDI apart
> from defining ringtone standards?"

Our mission is to keep MIDI viable by helping companies
determine how to be interoperable and by supporting new
applications and markets. Those two objectives can become
conflicting when new markets require using MIDI differently
than before, which makes our job challenging. I think we
have done a good job advancing MIDI while maintaining backward
compatibility.

Regarding ringtones, they were never expected to be the
"future of MIDI"... we had members who wanted to do business
in that market, and everyone felt it would be more efficient
if that industry didn't create something that was incompatible
with MIDI. The ubiquity of MIDI is the number one benefit of
MIDI, and maintaining that is MMA's job.

> it becomes very clear which platform is
> currently driving MIDI: iOS! I have no idea what
> the MMA is discussing internally, so I may be completely
> wrong.

Many people (companies, actually, but represented by people) in
MMA believe iOS is an important platform, but I think most of
them also think it already works correctly, which means there's
no reason for any discussion of iOS in MMA. However, MMA members
do realize that a lot of the iOS MIDI usage is from non-members,
which is why I joined this group (and others)...

> keeping any discussion about the future of
> MIDI (if there is one) purely internal is simply the wrong
> approach, because innovation most of the time comes from
> outside.

Such discussions could occur outside of MMA, but most companies
are not interested in discussing such topics in public, where
they are highly exposed, and so will limit their comments or not
participate at all. MMA provides a better forum for discussion
of sensitive things like product ideas and visions, at least
for large companies with a lot to lose.

Also, the details of what the companies will recommend should
not be revealed until the agreements are confirmed, otherwise
there can be premature and incorrect usage, misunderstandings,
bad press, etc. ... which is bad for all MIDI users. So this
is avoided by keeping discussions internal to MMA.

> I am very aware
> that small-scale iOS developers are aliens in the world of
> the musical instrument industry. We are not even an industry.
> And guess what, I like this fact.

From MMA's perspective, all users (implementers) of MIDI are part
of the "MIDI industry", which is not the same as the "musical
instrument industry"... many of our members are in the cellphone
or stage or other industries.



> Take this little example: applying for a manufacturer ID
> from the MMA costs 200 USD - oh, not just once, every year!
> For a well-funded hardware manufacturer this is just the
> average cost of an internal one-hour business meeting with
> three people and an unclear result. For an indie iOS
> developer, 200 USD yearly means a lot.

We have considered lowering the price for iOS developers. With
membership dues the charge is based on revenue... Maybe we can
do the same for IDs.

> But for what value? IDs are needed, but even
> Apple does not charge you for applying for an Audio Unit
> vendor ID. It's just free.

You can't compare us to Apple... Apple will make money from
your use of iOS whether you pay them or not, because your
customers have to buy hardware from Apple. And then Apple is
taking a cut of every sale you make anyway. MMA doesn't get any
revenue when MIDI products are sold, or from you when your
products are sold. The only way we get any money from your use
of MIDI is if you pay for an ID or join MMA. So that's why we
charge for those two things.

Frankly, I'd like to see all costs for using MIDI and belonging
to MMA be eliminated, but to do that requires some alternative
funding source, which I am not seeing.

> I'm fine if the MMA decides to restrict its role to keeping
> the MIDI standard which we all love and respect for what it
> has been for many years. We will find - and already have
> found - other ways to innovate. But naive as I am, I'd like
> to reach out my hand to the MMA to find a way to discuss
> possible future advancements of MIDI more seriously and
> effectively. And I am sure that others from this group would
> be interested in contributing their input and help as well.
> We may be indie companies, but we are not hobbyists. We will
> soon reach the point (if we are not already there) where
> collectively more MIDI messages are sent on iOS devices than
> on all hardware gear. The 'industry' is changing.

I would like to encourage you (and anyone else in this group) to
share any ideas you have for evolving MIDI.

- TW

Tom White (MMA)

Apr 18, 2012, 3:54:14 PM
to open-music-app...@googlegroups.com
Hi Rob,

> we get comments in our ratings about how our app is broken,
> because common omni behavior is to simply substitute zero
> over all channels (which breaks pitch bending). setting bend
> width and making a span of identical patches to accommodate
> it is much like vcr programming. i can't tell users how to do
> it, because it is different on every device, and they can't
> figure it out from their own docs. (i have a korg karma, on
> which the setup is doable, but ridiculous).

Yes, softsynths often don't implement MIDI Channels properly,
and I don't know for sure why that is.

Besides your case (where you work around the typical bend
limits in synths by transitioning the note to a new Channel)
having multiple MIDI Channels also makes it possible to
better emulate behavior of instruments that typically have
different bends per note (such as stringed instruments)...
each MIDI Channel has its own bend message as well as LFO,
timbre, etc. Steinberg recently created a new message just
for VSTs called "note expression" to achieve this result,
but it has no MIDI counterpart and thus can't be used with
external devices or saved in an SMF... and they wouldn't have
had to do that if VST plugins just implemented MIDI Channel
numbers.

In your case, I think there may be an additional issue about
how to reconfigure a synth that actually supports them, and
I'm not sure how to solve that... but maybe there is a setup
message that could be defined.

- TW

Tom White (MMA)

Apr 18, 2012, 3:54:14 PM
to open-music-app...@googlegroups.com
> I'd really like to know what the MMA standardisation
> activities are. I mean, the MIDI spec hasn't changed for
> years; are there new things in the pipeline? Would you have
> to somehow join the MMA to actually get access to this
> information? In what way can we influence the MIDI spec
> towards iOS, if indeed that needs to be the case?

I touched on this a moment ago in another email... but maybe I
should expand on that...

re: "hasn't changed"

The original MIDI 1.0 spec was only 8 pages. That document
is now 50 pages and has been joined by 40-50 more, all the
result of negotiations among members of MMA for more definitions
and recommended practices. So, the syntax and message space
have not "changed", but what is "MIDI" changes continually.

re: "new things"

Current topics in MMA include:

- MIDI support in HTML5 (might not be happening if MMA hadn't
helped push for it)
- New electrical spec (updating circuit with modern designs
including 3.3v support, to help avoid potential problems
from mis-matched designs)
- MIDI payload for Ethernet AVB (so MIDI is not left out of
new audio networking products supporting AVB)
- HD Protocol (new "MIDI-like" protocol for applications
needing capabilities MIDI does not have, or has poorly)

re: joining

Yes, you have to join to get the information, because if the
information (details) were public then there would be nothing
stopping over-eager people from trying to go to market early
and accidentally creating incompatible products, which would
damage not just their market but the perception of MIDI as a
whole. If a license was required to use MIDI, then we might
not need to control access at the development stage... but
this is what we have to work with <g>.

re: influencing

Joining MMA is the best way to influence what MMA does (and
by extension, how MIDI works). However, I joined this list
so that I could learn what your community wants from MIDI, and
share that information with MMA members as needed, so even
without any of you joining, you will have *some* influence.

> > I propose that the MMA provides
> > one manufacturer ID that is used by all iOS apps.
> > Additionally, the SysEx standard is extended for
> > iOS devices: besides the iOS SysEx manufacturer ID, a unique
> > application ID is transmitted. The app ID is generated using
> > the mechanisms provided by iOS.

This can be done, I suspect. I would need to bring it up and
see what MMA members say...

And yes, Apple could instead make their ID available for use in
this way, if they wanted to, since it is an Apple OS feature.

I think MMA would prefer Apple not make the Emagic ID available
in this way, since iOS is not an Emagic product.

- TW

Tom White (MMA)

Apr 18, 2012, 4:15:14 PM
to open-music-app...@googlegroups.com
Clarification:

I wrote...

> The original MIDI 1.0 spec was only 8 pages. That document
> is now 50 pages and has been joined by 40-50 more

I meant 40-50 more *documents*... hundreds and hundreds of pages.

- TW

hans

Apr 18, 2012, 8:20:37 PM
to open-music-app...@googlegroups.com
I foresee the pitch bend issue becoming a critical area for improvement to the standard, especially in iOS apps.

Perhaps this group is a good place to discuss an iOS-specific fork of the MIDI protocol. Could we do it in an intelligent way that would neither break the system for devices that don't understand it, nor prevent iOS devices using the fork protocol from working with possible official fixes to the real standard in the future?

Hans Anderson
Blue Mangoo Software

sup...@onereddog.com.au

Apr 18, 2012, 10:10:32 PM
to open-music-app...@googlegroups.com
>
> Yes, softsynths often don't implement MIDI Channels properly,
> and I don't know for sure why that is.
>

In my limited experience I would say there's a general misunderstanding of MIDI Channels vs. Omni vs. Multi-Timbral. It wasn't until I implemented this properly in Arctic with Rob's help that I understood it. So perhaps there's a general lack of specific MIDI knowledge. Of course I am speculating.


> Besides your case (where you work around the typical bend
> limits in synths by transitioning the note to a new Channel)
> having multiple MIDI Channels also makes it possible to
> better emulate behavior of instruments that typically have
> different bends per note (such as stringed instruments)...
> each MIDI Channel has its own bend message as well as LFO,
> timbre, etc. Steinberg recently created a new message just
> for VSTs called "note expression" to achieve this result,
> but it has no MIDI counterpart and thus can't be used with
> external devices or saved in an SMF... and they wouldn't have
> had to do that if VST plugins just implemented MIDI Channel
> numbers.
>

That's interesting. That sounds like Rob's note-tie NRPN. Perhaps these are the sort of things that need to find their way into the MIDI spec, rather than possibly multiple implementations. Also, in Arctic I implemented what I called poly-pitch bend by using an internal NRPN which let you bend each note, basically 14-bit bend associated with a note number.

Though for stringed instruments, what did Roland do with their MIDI guitar synths?

cheers
peter


Rob Fielding

Apr 18, 2012, 10:21:49 PM
to open-music-app...@googlegroups.com
> That's interesting. That sounds like Rob's note-tie NRPN. Perhaps these are the sort of things that need to find their way into the MIDI spec, rather than possibly multiple implementations. Also, in Arctic I implemented what I called poly-pitch bend by using an internal NRPN which let you bend each note, basically 14-bit bend associated with a note number.
>
> Though for stringed instruments, what did Roland do with their MIDI guitar synths?
>
> cheers
> peter
>
>

i tried to get around using the standard note+bend model of MIDI, but
gave up and decided that there isn't a better way without dismissing
MIDI entirely. of all the things that could go wrong, i wanted to
ensure that even the most incompatible pair of devices at least gave
the right pitch (even if there were breaks in the bend, etc).

i don't know if the Omni behavior commonly implemented is technically
'wrong', though the spec itself doesn't really outlaw just rewriting
all channels to '0', which is what seems to happen in almost every
Omni implementation. with the existence of the same note in different
parts of the instrument, this can actually require notes to be in a
different order to satisfy omni. e.g.:

ch1 on 33 127
ch2 on 33 127
ch1 off 33 127

Most omni implementations are silent at the end of this sequence,
whereas what is intended is that two independent notes were playing at
the second step, and one note is still playing at the end. Emulating
picking technique requires this. So this is one of the severely
complicated passages in my MIDI code: for the omni (or 1-channel
case... same thing) i have to reorder things, count note-ons vs
note-offs for the same note, and only send the final note-off at the
end. And of course the MIDI spec explicitly leaves this result up to
the implementer, so if the controller and synth assume differently
then you can get a stuck note. O_o
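
A minimal sketch of that counting in C, assuming the receiver keeps one active-instance count per note number (one reasonable reading of the approach described above):

#include <stdint.h>

static int active[128];                 /* note-on count per note number */

/* ch1-on / ch2-on / ch1-off then leaves one voice sounding, not none. */
void omni_note_on(uint8_t note)  { active[note]++; }

int omni_note_off(uint8_t note)         /* returns instances still sounding */
{
    if (active[note] > 0) active[note]--;
    return active[note];                /* only silence when this reaches 0 */
}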


--
http://rrr00bb.blogspot.com

sup...@onereddog.com.au

Apr 18, 2012, 10:46:37 PM
to open-music-app...@googlegroups.com


Isn't it Mode 4 which a lot of the manufacturers implement differently for multi-timbral? So rather than being mono they are in fact treating it as poly. Either that or they are doing what I originally wrote and simply matching note numbers while ignoring the channel, oops, which I imagine a lot of soft synths do - it's easy, quick, the vast majority of examples do that, and people just copy what's there already.

Rob Fielding

Apr 19, 2012, 12:34:23 AM
to open-music-app...@googlegroups.com
The unfortunate truth of the matter is that I write an app and test it
against all the devices I have, and otherwise follow the standard for
stuff that I don't have, to maximize what works. If I can't make what I
need out of the core stuff that everything understands, then it's
irrelevant anyway (i.e.: 0x90, 0x80, 0xE0). So I build it all out of
abusing channel and bend quirks, and make sure that things still kind
of come out right when devices don't recognize any NRPN I needed to
get full fidelity with the gestures.

> Isn't it Mode 4 which a lot of the manufacturers implement differently for multi-timbral? So rather than being mono they are in fact treating it as poly. Either that or they are doing what I originally wrote and simply matching note numbers while ignoring the channel, oops, which I imagine a lot of soft synths do - it's easy, quick, the vast majority of examples do that, and people just copy what's there already.
>

Modes are another case of trying to enumerate all the possibilities
explicitly. I would not use weird modes even if they were defined. I
abuse the primitives that I know exist everywhere, and let the
controller take on that complexity to keep it out of the protocol and
the synth. For example... I turn the notes on and off myself rather
than relying on solo/poly mode, and do the legato manually from the
controller because i have to hop across channels anyway (because
note-overlaps are per channel... but in my case... every note is on a
unique channel). As a benefit, this is how string instruments
behave... you can legato around on one string while ringing chords and
diving a whammy bar all at the same time. Just for microtonal scales,
you need to leave the dang pitch wheel alone for any note that is on
or releasing (i.e.: turned off but still releasing); which prevents me
from simply equating polyphony groups (i.e.: a string) with channels.
You can't get around the 1-note-1-channel requirement.


--
http://rrr00bb.blogspot.com

Tom White (MMA)

Apr 19, 2012, 1:05:46 AM
to open-music-app...@googlegroups.com

It is possible to implement per-note pitch bend in MIDI, assuming
you don't mind using NRPN or SysEx messages.

It would not need to be an "iOS" specific message... even if mostly
only iOS apps end up using it.

There is only one MIDI, though there are different recommended
practices for using it in different markets.

- TW

Tom White (MMA)

Apr 19, 2012, 1:05:46 AM
to open-music-app...@googlegroups.com
> ch1 on 33 127
> ch2 on 33 127
> ch1 off 33 127
>
> Most omni implementations are silent at the end of this sequence,
> whereas what is intended is that two independent notes were playing
> at the second step, and one note is still playing at the end.

I expect a receiver in OMNI Mode to make no distinction between
Channels, in which case this sequence is the same as if the device
received the events on Channel 1... and most synths would see that as
an invalid sequence, because it's not a sequence that would ever be
generated (normally) by (traditional) MIDI controllers. There is no
standard for response to that sequence, but I'd expect most synths to
stop both notes upon receipt of the Note Off (just as you observed)
because they assume the second Note On was a mistake.

If you found some synths that do not stop both notes, it is probably
because they are respecting the MIDI Channel, which in OMNI mode is
a little weird if you ask me. But honestly, what is "normal" for OMNI
Mode is not something I've ever seen discussed in MMA, so I've got
only my own opinion to go on.

What's clear is that any synth that respects the MIDI Channel will
play your sequence correctly. That said, if a designer chooses not to
respect MIDI Channels, that is their choice, but I'm not sure they
understand the consequences, nor do their customers.

> Emulating picking technique requires this.

I don't understand that statement...
It seems to me when picking one string, you would be retriggering
each time, so there is no need for the first Note On to keep playing.

- TW

Tom White (MMA)

Apr 19, 2012, 1:05:46 AM
to open-music-app...@googlegroups.com
> in Arctic I implemented what I called
> poly-pitch bend by using an internal NRPN which let you bend
> each note, basically 14-bit bend associated with a note number.

Yes, synth makers have used NRPN for per-note controllers in the
past (mostly things like individual drum panning in drum sets)...
but there is no MMA standard for NRPN-based per-note controllers.
I think the reason is that the complexity and confusion from
using NRPNs overrides the desire for per-note controllers in most
cases <g>.

For General MIDI 2, we did create Universal SysEx messages for
per-note control, but other than in GM2 devices, I'm not aware
of anyone using those messages. There again, I think the hassle
of using SysEx overrides the desire to have per-note control <g>.

> Though for stringed instruments, what did Roland do with
> their MIDI guitar synths?

Each string is on a different MIDI Channel: i.e. one note per Channel.
Then Channel pitch bend, vibrato, etc. only affect one note at a time.
It's the simplest way, as I was saying, to get per-note control, as
long as you don't need more than 16 different notes at once. And it
allows using the guitar with almost any synth (since they are almost
all multitimbral these days, except some software ones).

- TW

Rob Fielding

Apr 19, 2012, 1:23:17 AM
to open-music-app...@googlegroups.com
http://www.youtube.com/watch?v=pnKXLfWgSCk&feature=plcp&context=C4b189c7VDvjVQa1PpcFOt09fEFLsejTVq3UjpOuF_DOxY4AOYnt8%3D

1) every note down goes to a new channel, even if on same string.
this is because if the patch had a long release we want to be sure
that the channel is quiet before we come around to needing to steal it
(and mod its pitch wheel).

2) in this scale, every single note happens to have a different bend
value. nothing is a multiple of 100 cents apart other than octaves,
especially since everything bends a little bit due to tiny finger
movements.

3) 2:19 - i am messing with the polyphony modes. in this single-note
mode, i am hammering on/off the same note from different locations
(note attack on both finger up and down in that case... fast picking).
in string poly mode, there is only a note attack on the first
note-down, and it legatoes everything else. but even when legatoing,
it still has to hop channels, due both to the release issue and to
nothing being a multiple of 100 cents apart.

that was thumbjam, whose "Omni" mode maintains a separate pitch wheel
per channel - a highly unusual thing for a synth to do (correctly, I
believe). to do this with the Korg Karma, I set it to run across
channels 1-8 and set pitch bend to +/-12 semis, and everything is ok
as long as I don't manage to bend farther than an octave. if you
exceed the bend width, it will stop the note and retrigger it (again,
on another channel... because it might be a long-release patch).

all of this is kind of allowed by MIDI, but it's totally random on
which synths do it well. this is a guy using it with a breath
controller versus a Nord G2:

http://soundcloud.com/rrr00bb/alephone-g2-string-and-breath

He told me that the setup was really a major pain. That's what we
live with for hardware, but it won't fly on iOS.

--
http://rrr00bb.blogspot.com

sup...@onereddog.com.au

Apr 19, 2012, 1:50:41 AM
to open-music-app...@googlegroups.com
On 19/04/2012, at 3:23 PM, Rob Fielding wrote:

all of this is kind of allowed by MIDI, but it's totally random on
which synths do it well

That's the point: not every synth will implement it as expected. Some people simply interpret the MIDI spec differently, frustrating as that is. As you know, I got this wrong until I saw the light. If I had more time I'd spend it implementing more MIDI.


He told me that the setup was really a major pain.  That's what we
live with for hardware, but it won't fly on iOS.

He should get the Lead 2X :-)
Set the channel span to 4, put the same patch on the Nord into each of the A, B, C, D slots, set the MIDI channels to 1, 2, 3, 4, done. Doesn't get any easier :-)
On iOS you should build the synth into the app; then the user has nothing more to do.

nlogmusic (TempoRubato - NLogSynth)

Apr 19, 2012, 2:59:29 AM
to open-music-app...@googlegroups.com
Hi Tom,

GREAT thanks for answering my little morning rant <g>!

I just did two posts about the idea of sharing an OMAC ID.
Only one little remark here: asking Apple to reuse their ID
is not going to work, because Apple does not respond to anything
we ask. In terms of developer communications you feel like
you are in a Kafka novel like "Das Schloss".

Another valid point is that, in terms of secrecy, OMAC and the MMA
are not very compatible. This group here is completely public
and open. Whatever you write here becomes more or less prior art.
I think this works very well for iOS software devs,
since they do not have the long pre-production times and
investments that you have in the hardware business and want
to protect. We are small here; we are not Steinberg, Ableton,
Propellerhead etc., which need to act like little Apples with
all that secrecy. The idea of OMAC is that secrecy hinders more
than it helps, and we gain more community value by exchanging
ideas, experiences etc. Sure, this does not mean that we have no
trade secrets. But this group is not the place to post them ;-)

Given this, we may see upcoming things as proto-standards
under the umbrella of an OMAC SysEx ID. However, there is no reason
why these could not be the base for a new formal MMA standard.

Having you listening and participating here is of course a great help.
And again, if you'd like people from here to support the process
from an OMAC proto-standard to a formal one, I think helping
hands will be found here.

Best
Rolf

Tom White (MMA)

Apr 19, 2012, 1:35:15 PM
to open-music-app...@googlegroups.com
Rolf,

> I just did two posts about the idea of sharing an OMAC ID.
> Only one little remark here: asking Apple to reuse their ID
> is not going to work, because Apple does not respond to anything
> we ask.

I understand, but Apple is an MMA member, and they respond to
us... and since this is their technology (not MIDI, but CoreMIDI and
iOS), we think they should have the opportunity to consider how they
want this accomplished, if they even care... keep in mind that since
they control the platform they could also someday decide they do
not want this to work and so create problems, so we need to make
sure they are on board, for your own benefit <g>...
> Sure, this does not mean that we have no trade
> secrets. But this group is not the place to post them ;-)
>
> Given this, we may see upcoming things as proto-standards
> under the umbrella of an OMAC SysEx ID. However, there is no
> reason why these could not be the base for a new formal MMA
> standard.
MMA does not need to create every standard internally or in secret.
We can also adopt something created outside.
 
We work with numerous organizations to help them decide how to
use MIDI, and sometimes they publish the documents and other
times we do. Consider RTP-MIDI, for example.
 
We are primarily interested in making sure MIDI is done right. You
might be surprised how often people think they know what they are
doing with MIDI, and don't really...
 
But we are also interested in making sure that the right solution
can be promoted fully... OMAC has been very successful, but
because it is an ad-hoc group, it might not always exist (and, for
example, Apple does not participate), so we certainly think it may
be good for MMA to formally adopt whatever OMAC comes up with.
 
- TW

Tom White (MMA)

Apr 19, 2012, 2:15:01 PM
to open-music-app...@googlegroups.com
rob,

> 1) every note down goes to a new channel, even if on same string.
> this is because if the patch had a long release we want to be sure
> that the channel is quiet before we come around to needing to steal it
> (and mod its pitch wheel).

You seem to be saying you need every note-down to go to a new
Channel so that you do not shorten the previous release... but
my sense is that with a traditional instrument the act of picking
essentially (but not exactly) silences the previous strike (and
even if it doesn't exactly silence it, there is very little
evidence that letting the prior note continue adds any real value),
so I'm not sure why you feel it is necessary to do that in your
case. But on the other hand, I understand that you need to change
the Channel number anyway (for bending), so I guess maybe it makes
more sense to just do it all the time rather than to try and figure
out when you might actually need to do it.

Anyway, I think your idea to put every note on a different Channel
makes sense, and would work fine if all synths respected Channel
numbers. I think the ones that ignore Channel numbers do so because
they thought the only reason for them was to be multitimbral (play
more than one sound at once, basically being up to 16 different
synths), which was not their intent. But in fact, respecting Channel
numbers is also needed for guitar synthesis (or any synthesis that
requires per-note expression). So this is probably an unfortunate
misunderstanding by the software synth community.

> that was thumbjam, whose "Omni" mode maintains a separate pitch
> wheel per channel - a highly unusual thing for a synth to do
> (correctly I believe).

Just curious, when you refer to "Omni" mode here, do you mean
"MIDI OMNI Mode"? How are you selecting Omni Mode? I'm just
wondering whether the synth makers are using the term to mean
something other than MIDI OMNI Mode.

As far as I know, in MIDI, OMNI Mode means "Listen on all MIDI
Channels". I don't know that the MIDI Spec ever said that means
to just ignore the MIDI Channel number, but I also don't know
that the MIDI Spec ever said you couldn't... I suspect this is
one of those issues where the Spec is vague, and companies just
do what they want. (Sorry about that <g>).

- TW

Rob Fielding

Apr 19, 2012, 3:01:42 PM
to open-music-app...@googlegroups.com
Hi Tom, for the benefit of others who haven't heard the details, I
will restate the corner cases we encountered. The issue is that notes
are not dead when they are turned off. They go into a releasing phase
that takes more time to complete. You could even have all channels
still consumed even though all notes are 'off', because they haven't
had time to release. I.e.:

ch1 on 33 127 #note A on
...
ch1 bend +25% #bend it up 50 cents (+/-wholetone bend is 100%)
...
ch1 on 33 0 #turn it off ... release phase for 1 more second
...
ch1 bend 0% #if 1 second has not passed, we messed up 33's release pitch
ch1 on 35 127

Kontakt4 specifically is where we found this problem, and it makes
sense. If you channel cycle through 16 channels that all have
independently bent notes, then this actually imposes a speed limit at
which you can render pitches correctly. If each note took 1 second to
release due to the patch, then you can end up stealing channel and
changing its pitch wheel before the release is done. So you at least
try to steal the channel that has been off for the longest time to
minimize the problem.
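
A minimal C sketch of that allocation rule; the timestamps are assumed to come from the host, e.g. a sample counter:

#include <stdint.h>

#define NUM_CHANNELS 16

typedef struct {
    int      notes_down;     /* notes currently held on this channel */
    uint64_t last_off_time;  /* when the channel last went silent */
} ChannelState;

/* Steal the channel whose last note-off is oldest, so a long release
   has the best chance of finishing before its pitch wheel moves. */
int pick_channel(const ChannelState ch[NUM_CHANNELS])
{
    int best = -1;
    uint64_t oldest = UINT64_MAX;
    for (int i = 0; i < NUM_CHANNELS; i++) {
        if (ch[i].notes_down > 0) continue;   /* never steal a held note */
        if (ch[i].last_off_time < oldest) {
            oldest = ch[i].last_off_time;
            best   = i;
        }
    }
    return best;  /* -1 means every channel still has a note down */
}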

And like before, if you take this MIDI sequence:

ch1 on 33 127 #1 voice
ch2 bend +25%
ch2 on 33 127 #2 voices same note name, one quartertone sharp
ch1 on 33 0 #1 voice, sharp pitch is still sounding
ch2 on 33 0 #silence

And do what almost every Omni mode does to it, rewrite channels:

ch0 on 33 127 #1 voice
ch0 bend +25% #voice goes quartertone sharp
ch0 on 33 127 #1 re-attack same note
ch0 on 33 0 #early silence - this is wrong! it's a dropped note.
ch0 on 33 0 #silence

So, omni gets the rendition comically wrong. Multiple voices on the
same note happen all the time on string instruments, and can even be
used to generate a manual chorusing effect at the MIDI level without
post-processing. And here's a gem that will produce a stuck note on
some instruments:

ch0 on 33 127 #1 voice
...
ch0 on 33 127 #1 re-attack same note
ch0 on 33 0

If the synth interprets vol 0 as setting the state of the channel,
then the note is off here. If the synth interprets it as two instances
of note 33, then it turns one off and leaves a stuck note. You don't
even know which one of the instances is turned off (different phases
of the same note). I think the only interpretation that should be
allowed is the first one, where it's setting the volume *state* of the
note. If you need to stack multiple instances of the same note name,
then that's what we have to do with channels anyhow.

And there's this gem:

ch0 on 33 127
ch1 on 37 127
ch2 on 40 127
//divebomb a whammy bar...
ch0 bend -10%
ch1 bend -10%
ch2 bend -10%
ch0 bend -20%
ch1 bend -20%
ch2 bend -20%
ch0 bend -30%
ch1 bend -30%
ch2 bend -30%
...

You have to duplicate the bends for every active note because you had
to put everything on separate channels. This means that when you set
the pitch bend size, you have to set it individually for each channel
in the span. You have to duplicate any overall channel pressure
messages to each channel, etc.
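
A minimal C sketch of that fan-out; send_midi3() is a stand-in for whatever output call the app actually uses:

#include <stdint.h>

extern void send_midi3(uint8_t status, uint8_t d1, uint8_t d2);

/* One whammy gesture becomes one pitch-bend message per active
   channel; amount runs -1..+1 across the receiver's bend range. */
void bend_all(const int *channels, int count, double amount)
{
    int v = (int)((amount + 1.0) * 8192.0);   /* 14-bit, 0x2000 = center */
    if (v > 0x3FFF) v = 0x3FFF;
    if (v < 0)      v = 0;
    for (int i = 0; i < count; i++)
        send_midi3(0xE0 | channels[i], v & 0x7F, (v >> 7) & 0x7F);
}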

All this stuff conspires to make the MIDI implementation maddeningly
complex. I had to rewrite Geo's engine 5 or 6 times from scratch
before I could not find a device that gave me stuck notes (and
still... some Omnisphere and Maschine users managed to send in
rare reports of it happening anyway). I love the *idea* of OSC,
because this junk is dealt with upfront, and this part is easier to
look at and say that it's fixed.

And the note-ties fill in the last major crack in the spec: there
isn't an equivalent of the note-tie notation in music, to
continue a note under a different name, either because we go beyond
100% bend width in either direction, or just because we are legatoing
two notes together. So there's this:

ch0 bend 0%
ch0 on 33 127
ch0 bend +25%
ch0 bend +50%
ch0 bend +75%
ch0 bend +100% #up a whole tone
ch1 note-tie #next off/on pair are an atomic note tie
ch0 on 33 0
ch1 on 35 127
..
ch1 on 35 0 #turn it off now

So, synths that have no idea what a note-tie is will not get stuck
notes or wrong pitches. But they will get a re-attack in the note
when it reaches the whole tone. Synths that recognize the note-tie
(SampleWiz, ThumbJam, Arctic, AlephOne) will just render this as a
smooth bend up two semitones. The note started on ch0 and ended up on
ch1. The channels aren't distinct instruments, just places to
associate bend and timbre. Note that to be perfectly smooth, we have
to *also* transfer cc effects from the old channel to the new channel.
All this causes the simple requirement of "just render correct
pitches" to turn the messaging into a mess on the client end.
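
For illustration, the tie could be emitted like this C sketch. The NRPN numbers are placeholders (the actual note-tie NRPN values used by these apps may differ), and send_midi3() is the same stand-in as above:

#include <stdint.h>

extern void send_midi3(uint8_t status, uint8_t d1, uint8_t d2);

/* Placeholder NRPN announcing "the next off/on pair is one note". */
enum { NOTE_TIE_MSB = 0x7E, NOTE_TIE_LSB = 0x7E };

void send_note_tie(int old_ch, uint8_t old_note, int new_ch, uint8_t new_note)
{
    send_midi3(0xB0 | new_ch, 99, NOTE_TIE_MSB);  /* CC99: NRPN MSB */
    send_midi3(0xB0 | new_ch, 98, NOTE_TIE_LSB);  /* CC98: NRPN LSB */
    send_midi3(0x90 | old_ch, old_note, 0);       /* old name off (vel 0)... */
    send_midi3(0x90 | new_ch, new_note, 127);     /* ...new name on: no
                                                     re-attack on synths that
                                                     understand the tie */
}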

Also note that if you do this, and you are turning an untuned guitar's
audio output into MIDI, the MIDI messaging comes out in tune with
the audio (!!). I.e.: tune a guitar by ear 1/4 tone sharp, and run it
through an audio-to-MIDI interface, and the audio and MIDI are
actually still in tune with each other, even though all the notes now
fall half-way in between standard tuning. This is a huge issue that
cripples the usability of MIDI for almost every use beyond a keyed
instrument.

--
http://rrr00bb.blogspot.com

Tom White (MMA)

Apr 19, 2012, 4:33:23 PM
to open-music-app...@googlegroups.com
> Hi Tom, for the benefit of others who haven't heard the details, I
> will restate the corner cases we encountered. The issue is that notes
> are not dead when they are turned off. They go into a releasing phase
> that takes more time to complete.

Oh, sure, I get it now, I think...
The problem (for my understanding) is that I am used to the typical
behavior of a synth using a single Channel, and fail to think through
how behavior will naturally be different when using multiple Channels
for the same "synth"...

In a typical MIDI synth, if you retrigger a note on a Channel, the
synth most likely will *not* allocate a new voice to the note, because
it assumes there is only one instance of each note per Channel. As a
result, that first voice is freed-up when playing the same note again.
But when you move the second instance to a different Channel, that same
synth won't realize you are trying to play the first note again, and
it will let the first note complete the release phase. So this is most
certainly a problem you will need to work around.

> Multiple voices on the same note happen all the time on string instruments

Multiple instances of the same note can be played on string instruments
because they have multiple strings, a paradigm that, in MIDI, requires
just one MIDI Channel per string... what you are using is multiple MIDI
channels "per string", so you are doing something completely different.

I think it is amazing that you have found ways to make MIDI do stuff
no one considered doing...

> And the note-ties fill in the last major crack in the spec: there
> isn't an equivalent of the note-tie notation in music, to
> continue a note under a different name, either because we go beyond
> 100% bend width in either direction, or just because we are legatoing
> two notes together.

I believe the Legato Controller CC accomplishes ties just fine, but of
course only on a single Channel, which doesn't work in your scenario.

> tune a guitar by ear 1/4 tone sharp, and run it
> through an audio-to-MIDI interface, and the audio and MIDI are
> actually still in tune with each other, even though all the notes
> now fall half-way in between standard tuning. This is a huge issue
> that cripples the usability of MIDI for almost every use beyond a
> keyed instrument.

Sorry, you lost me again <g>... please explain that, if you don't mind.
Especially how MIDI isn't usable for non-keyed instruments.

- TW

Rob Fielding

Apr 19, 2012, 5:04:02 PM
to open-music-app...@googlegroups.com
>> tune a guitar by ear 1/4 tone sharp, and run it
>> through an audio-to-MIDI interface, and the audio and MIDI are
>> actually still in tune with each other, even though all the notes
>> now fall half-way in between standard tuning. This is a huge issue
>> that cripples the usability of MIDI for almost every use beyond a
>> keyed instrument.
>
> Sorry, you lost me again <g>...  please explain that, if you don't mind.
> Especially how MIDI isn't usable for non-keyed instruments.
>
> - TW
>

When you combine pitch and bend, you get a frequency. Notes don't
actually exist. Everything needs to be thought of in terms of creating
the correct frequencies. (I am trying to make MIDI take on a
frequency orientation, because I can't use OSC for other reasons.)
So, say that you have an FFT listening to signals per guitar string,
and c0 is the frequency of MIDI note zero:

440hz - fine, that's MIDI note 69. Forgive me if I get any of the
math wrong, as I'm doing this off the top of my head. The frequency is
symbolically:

c0 * 2^(69)

But what if the guitar is tuned 50 cents sharp overall? There is no
reason that everything should break because of this. It should just
work. If it doesn't, then you can't mix the original audio with the
MIDI and get a meaningful result. So "A" is the frequency:

c0 * 2^(69.5) = (440hz/(2^69)) * 2^(69.5)

So, systematically, the whole guitar adds 0.5 to the exponent for
creating the pitch. Everything should just sound right. The MIDI
would transcribe weirdly, because nothing really falls on a proper
"note", but notes don't really exist anyway. If you don't do this
basic thing right, then how would you create a microphone that creates
proper MIDI signals for a singer's voice? You would either have to
auto-tune the voice, or produce the singer's pitches. They must match.
A singer who knows what she is doing would demand that you produce the
proper pitch.

So, just intonation in actual pitch ratios:

1/1, 16/15, ..., 13/12, ..., 9/8, 6/5, 5/4, 4/3, ..., 3/2, ..., 2*8/9,
2*15/16, 2

These are spots where there are standing waves that strongly resonate,
and they are the phenomenon responsible for the emergence of scales
that are more or less similar among cultures. But fretted instruments
using equidistant fretting and octave equivalence will insist upon
these values:

2^(n/fretsPerOctave)

Common values are 7, 12, 19, 31, 53, etc., because they provide close
approximations to the integer ratios that happen physically (which the
harmonic overtones follow exactly). As you may know, even acoustic
pianos don't follow 2^(n/12) strictly. They stretch the octave to
provide a better balance in matching overtones in the harmonic series.
If they don't do this, the piano doesn't resonate correctly. The
laws of physics are in direct conflict with the desire to have exactly
equidistant pitches.

Then there is the issue of "portamento", which causes any discrete-key
instrument to produce highly unnatural pitch bending. If I press note
33 at time 0:00 and note 35 at time 1:00, with portamento, at time
1:00 the pitch is still at pitch 33 and starts ascending to pitch 35
at some speed. It's the wrong pitch at 1:00. On a fretless
instrument, at time 0:00 it's the right pitch, and when time 1:00
comes it's also on the right pitch, and in the intervening time from
0:01 to 0:59 the finger was bending towards its destination. So,
note-overlaps to handle solo mode and legato and portamento are just
not correct. All change of pitch, turning notes on and off, and
setting whether a note re-attacks needs to be handled in the controller
explicitly. The controller knows: 1) where the finger "is" (ie:
33.1534), 2) where it's auto-tuning to (ie: 33.0), 3) and where the
pitch is between the finger location and the auto-tune destination:
(ie: 33.0001). The synth really only needs to know 3). In MIDI, it's
upside down from that perspective. All of this has to do with the
fact that the controller has a lot more brain-power, more input
subtlety, and a much more comprehensive display for real-time
information.

So, the point of this is that fretlessness is not a weird corner-case
for people doing experimental music. You need it for every instrument
that isn't made up of a fixed set of pitches per octave. This is a
major reason keyboard-based instruments can't cross the uncanny valley
into doing what violin, voice, guitar, sax, etc. can do.


--
http://rrr00bb.blogspot.com

Rob Fielding

Apr 19, 2012, 5:21:11 PM
to open-music-app...@googlegroups.com
errr.... i need to divide by 12 in those exponents, because chromatic scale:

c0 * 2^(69/12)

and so on.
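
Going the other way with the corrected formula, any frequency can be split into the nearest chromatic MIDI note plus a 14-bit bend. A minimal C sketch, assuming the receiver's bend range is set to +/-2 semitones:

#include <math.h>
#include <stdint.h>

/* Split a frequency into the nearest MIDI note and the 14-bit bend
   value that corrects the remainder (0x2000 = center, no bend). */
void freq_to_midi(double hz, int *note, uint16_t *bend)
{
    double n = 69.0 + 12.0 * log2(hz / 440.0);   /* fractional note number */
    *note = (int)floor(n + 0.5);                 /* nearest chromatic note */
    double semis = n - *note;                    /* residual in semitones */
    *bend = (uint16_t)(8192.0 + semis * 4096.0); /* 4096 per semitone at a
                                                    +/-2 semitone range */
}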

--
http://rrr00bb.blogspot.com

Tom White (MMA)

Apr 19, 2012, 7:12:51 PM
to open-music-app...@googlegroups.com
> When you combine pitch and bend, you get a frequency. Notes don't
> actually exist. Everything needs to be thought of in terms of creating
> the correct frequencies.

Ahh.
Yes, I am aware of that perspective.
And that MIDI doesn't really support that perspective.

But to be clear, I don't think there is a right or wrong perspective,
because MIDI's success proves it is the "right" perspective, unless
you happen to have an application that needs to treat notes as
frequencies, in which case MIDI is no longer right <g>.

We address this in the new HD Protocol by having both models (i.e.
Notes can have default pitches, like MIDI, or assigned frequencies).

> But what if the guitar is tuned 50 cents sharp overall? There is no
> reason that everything should break because of this. It should just
> work. If it doesn't, then you can't mix the original audio with the
> MIDI and get a meaningful result.

I'm not sure what part doesn't "work" in your example. I am pretty sure
you can tell any guitar-MIDI sensor what frequency your reference is,
and it will figure out the correct MIDI Notes just fine.

> how would you create a microphone that creates proper MIDI
> signals for a singer's voice? You would either have to auto-tune the
> voice, or produce the singer's pitches. They must match. A singer
> who knows what she is doing would demand that you produce the proper
> pitch.

Again, I think establishing a reference pitch accomplishes this.
People who want to perform together or transcribe their performances
also need a reference pitch... they don't just randomly sing/play any
frequency they want.

I understand you are pursuing a different paradigm than what other
people have done, and I applaud that.

I'm just trying to temper your frustration by pointing out that these
"problems" you are solving are because you've decided to do something
that is very unconventional, not because everything that has been done
so far is just wrong <g>.

After you've figured out a system that supports what you want to do,
I am sure most people will still want to keep doing things the way
they've always done them, so their way won't ever be "wrong" <g>.

> Then there is the issue of "portamento", which causes any discrete-key
> instrument to produce highly unnatural pitch bending.

Yeah, you can't really make a synth that does "portamento" unless you
can manually (in real time) bend one note to any other, which is not
what "Portamento" on a synth does.

> The controller knows: 1) where the finger "is" (ie: 33.1534),
> 2) where it's auto-tuning to (ie: 33.0), 3) and where the pitch is
> between the finger location and the auto-tune destination:
> (ie: 33.0001). The synth really only needs to know 3). In MIDI,
> it's upside down from that perspective.

Yes, MIDI was conceived as a means to document the actions being made
on a controller (the performance), not to document the desired output
of a synth (the results). MIDI's approach has been highly successful
because until MIDI there was no good way to do that, but both approaches
have their value.

What I believe is that MIDI needs to keep working the way it does, and
something else should work the other way (for those who need that). If
it is possible to make MIDI do both, that's great, but I don't see it
yet. Maybe HD, though <g>.

> So, the point of this is that fretlessness is not a weird corner-case
> for people doing experimental music. You need it for every instrument
> that isn't made up of a fixed set of pitches per octave. This is a
> major reason keyboard-based instruments can't cross the uncanny valley
> into doing what violin, voice, guitar, sax, etc. can do.

Nah, the reason keyboard-instruments can't do what those other
instruments can is keyboard players <g>. I've seen two people play the
same keyboard and sound entirely different (and I don't mean they
selected different patches). Likewise, I've seen people play realistic
guitar or wind passages on a synth (not many times, but a few).

But that said, sliding all the way up and down like a trombone (or over
the full range of any stringed or wind instrument) is not something to
expect from MIDI, and the fact that you are making it work is fantastic.

- TW

Rob Fielding

Apr 19, 2012, 8:20:03 PM
to open-music-app...@googlegroups.com, <open-music-app-collaboration@googlegroups.com>
Yeah, i agree with this. I want to go the route of pseudo-compatibility so i can get off the ground, and buck the standard with no regrets where i must do so.

What changed, though? Things used to be driven by the keyboard controller industry. Now it will be dominated by glass surfaces.

Sent from my iPhone, which is why everything is misspelled. http://rfieldin.appspot.com
http://rrr00bb.blogspot.com

sup...@onereddog.com.au

Apr 19, 2012, 8:44:35 PM
to open-music-app...@googlegroups.com
What would be a good start is for the MMA or MIDI spec writers to take your note-tie NRPN and add it as an RPN. IMHO it would be much better as an RPN, alongside Bend Range & Master Tuning. Also, any vagueness in the spec re: channel notes should be cleared up, but obviously that's not going to help the thousands of synths that have already been written with incompatible omni modes. So in that sense you need a synth that has been designed with non-keyboard-based controllers in mind.

Rob Fielding

Apr 19, 2012, 9:30:12 PM
to open-music-app...@googlegroups.com, open-music-app...@googlegroups.com
What i mean by "just work" is that fundamental things require no setup.  If I sing a pitch of 425.54hz into a microphone, the default setup will emit a midi note at that frequency, and it doesnt matter what midi note is used to render it.  Doing anything else is adding to the setup nightmare that it already is.  

If all that midi did was render correct pitches, about 1/3 of all the controls in geo disappear.  No more special modes to deal with the reality that many synths can only manage chromatic notes.  No control to tune the midi versus the internal engine.  No bend width setting.  No channel span setting.  If i controlled note phase (noteties) as just a part of the protocol then enumerating solo mode cs poly mode vs string mode is of no concern outside my controller; i turn notes on/off and sett pick attack myself, an protocol and synth stay simple.

Tom White (MMA)

unread,
Apr 19, 2012, 9:42:41 PM4/19/12
to open-music-app...@googlegroups.com
> What would be a good start is for the MMA or MIDI Spec
> writers to take your note-tie NPRN and add that as a RPN.

First, is there any reason not to just use the existing
Legato Mode Message for that?

- TW

Rob Fielding

unread,
Apr 19, 2012, 10:24:14 PM4/19/12
to open-music-app...@googlegroups.com
The note tie crosses channels on every note on, so I need actual control of note phase. Every time I turn a note off, I maximize the off time for the channel. This provides compatibility for synths that can only do multitimbral operation and bend width setting.

Legato is not a mode.
Legato is how you start a particular note (separate from polyphony).
This is just like how polyphony is not a mode.

I have a notion of polyphony groups where notes in a poly group are grouped for hammer-on/off, and even here, every new note on is a new channel. Bell-like patches with long release, arbitrarily bent, work properly because of this.

Sent from my iPhone, which is why everything is misspelled. http://rfieldin.appspot.com
http://rrr00bb.blogspot.com

Tom White (MMA)

unread,
Apr 20, 2012, 10:54:29 AM4/20/12
to open-music-app...@googlegroups.com
I don't see the difference between using an NRPN and using a CC.

MMA could certainly assign a message to signal when one note is
supposed to tie to another, but I believe the Legato Mode Message
already does that, from reading our description.

Before I go to the members and ask them to adopt a Note Tie
message I need to be able to explain why the existing Legato Mode
message cannot be used for that same purpose.

Forget the name "Legato Mode" and just read the description of
what it is for... then please let me know how your message is
different.

Thanks,

- TW

Rob Fielding

unread,
Apr 20, 2012, 11:35:46 AM4/20/12
to open-music-app...@googlegroups.com, <open-music-app-collaboration@googlegroups.com>
The note tie must exist to exceed max bend width (by renaming the note and even moving it to a new channel). It just happens to also function as a legato that can cross channels as well.

Crossing channels is necessary so that synths that don't understand the tie can release the note at the proper pitch. All of this is about the pitch wheel being tied to a channel.

Sent from my iPhone, which is why everything is misspelled. http://rfieldin.appspot.com
http://rrr00bb.blogspot.com

Rob Fielding

unread,
Apr 20, 2012, 12:02:29 PM4/20/12
to open-music-app...@googlegroups.com
Tom, can you build iOS apps? (Do you have Xcode installed, an iOS device, etc.?)

http://www.github.com/rfielding/DSPCompiler

is the public parts of the code, but AlephOne is the full app (which I
do not want random people posting copies of to the store before I
myself ship AlephOne). Its MIDI is essentially what Geo Synthesizer
does. I rewrote this MIDI engine a half dozen times, with different
ways of trying to work around the pitch problem. This really does
solve it, while still being highly compatible with existing synths.

You can't get around this problem without losing the most basic compatibility:

1) when you turn a note on, you must set its pitch wheel just before
you turn it on:

ch1 bend +5%
ch1 on 33 127

2) when you turn a note off, you cannot touch its pitch wheel until
the note has released. You have no idea how long release takes, so you
should pick the channel that has had all notes off for the longest
time:

ch1 off 33 0
# ch1 bend should not be called until release is done, but we
# don't know how long
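A sketch of that channel choice (names invented for illustration):

/* Pick the channel whose most recent note-off is oldest, so its pitch
   wheel is least likely to disturb a voice still in release.
   lastOffTime[ch]: timestamp of the latest note-off on that channel;
   activeNotes[ch]: count of notes currently held down there. */
int pickChannel(const double lastOffTime[16], const int activeNotes[16])
{
    int best = -1;
    for (int ch = 0; ch < 16; ch++) {
        if (activeNotes[ch] > 0)
            continue;  /* still sounding: its wheel is off-limits */
        if (best < 0 || lastOffTime[ch] < lastOffTime[best])
            best = ch;
    }
    return best;  /* -1 means every channel is busy */
}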

This is why you have to cross channels when you tie notes. Almost no
synths understand note ties, and they may have a long release. So if I do:

ch1 bend +5%
ch1 on 33 127
...
ch1 bend +50%
ch2 notetie
ch1 on 33 0
ch2 bend 0%
ch2 on 32 127

It's a consequence of (note+bend) determining pitch, the fact that you
have to set the right pitch before you turn the note on, and the fact
that you must leave the wheel alone until it's released. Therefore, you
can't do solo mode or legato in the same channel.

On a fretless instrument, you may be playing along with an instrument
that's doing its own tuning (i.e., a didgeridoo, Sade Adu, Najwa Karam,
or whatever). If you play the correct pitches, the messages generate
exactly those pitches. If the controller wants to round everything off
to 12ET and standard tuning, then that's easy to do. But it's the job
of the controller, not the protocol, to do this.

--
http://rrr00bb.blogspot.com

Tom White (MMA)

unread,
Apr 20, 2012, 4:49:09 PM4/20/12
to open-music-app...@googlegroups.com
Rob,

> The note tie must exist to exceed max bend width (by renaming
> note and even moving it to a new channel). It just happens
> to also function as a legato that can cross channels as well.

I get that (I think)...
Your message is not just a legato message, it is more.

But the message itself doesn't contain relevant information
for the renderer, it just indicates that "the Next Note is
not to be played as a new voice but instead provides a new
Note Number and Channel Number for the Previous Note"...
right?

What I was thinking is there may already be a message that
indicates that. If so, it would have been intended for a
different purpose, but if the two purposes are complementary
then MMA might prefer to explain how to use the existing
message for your purpose rather than having two separate
messages. (Part of our process of creating new MIDI Messages
is checking for existing ones that can do the job, to avoid
confusion from redundancy, and to avoid complicating device
interoperability.)

The message I was thinking of I called Legato Mode, but I
went looking for that and it doesn't exist. I found Legato
Footswitch, but that has to be turned on and off, whereas
you want an event that just sits between the two notes it
will affect.

I also found a Portamento Control message, and it's similar
to your usage in that it sits before a Note to modify its
meaning, but unfortunately its attributes are specific to
Portamento, and making it do double-duty will probably create
more work for you and increase the potential for compatibility
problems for everyone...

So I think it might make more sense to just create a new CC
or RPN that defines a "Note Tie Event", but the MMA members
need to discuss that and see which way they prefer.

- TW

Rob Fielding

unread,
Apr 20, 2012, 5:04:29 PM4/20/12
to open-music-app...@googlegroups.com
Tom,

Thanks for really considering the concept!

First: If you want a copy of the full AlephOne app source that
actually does this, or if you have an iOS device UDID so you can
install it and just look at the messaging in a black-box manner, let
me know. Whatever other problems my approach may have, it is
implemented and battle-tested to some degree (SampleWiz, Arctic,
ThumbJam, Geo Synthesizer, AlephOne) with an open source
implementation (the public DSPCompiler project on GitHub) that has no
dependencies on any other code. You can check Geo as well (though it
doesn't use the note-tie for legato, because it has problems with the
order in which it does things that required me to rewrite the MIDI
engine for AlephOne to get the legato to work; Geo only uses the
note-tie to solve the max-bend-width-exceeded problem).

Second: It is true that what I am doing is complicated and messy at
the level of MIDI bytes, and is not ideal from that point of view. It
actually takes some kind of open source project hiding MIDI byte
generation behind a simpler API to make it stomachable. But when I
heard what the plan with MIDI HD was, I could not envision how
backwards compatibility would be handled any other way (i.e., old
synths at least play correct pitches and ignore unknown messages).

--
http://rrr00bb.blogspot.com

hans

unread,
Apr 20, 2012, 8:48:18 PM4/20/12
to open-music-app...@googlegroups.com
Rob,

I'm following a few steps behind you on a very similar MIDI implementation in iFretless Bass. Our internal synth engine starts a note with the following information:

struct NoteStart {
    int noteIDNumber;        // identifies the note so it can independently
                             // receive CC messages; any assignment scheme
                             // works, as long as two simultaneous notes
                             // never share a number
    float samplesPerPeriod;  // wavelength of the note at start time, in
                             // samples at 44.1 kHz
    float velocity;
    bool legato;
};

A bend message:

struct NoteBend {
    int noteIDNumber;
    float newSamplesPerPeriod;
};

A stop message:

struct NoteStop {
    int noteIDNumber;
};

For us, a protocol that just had those things would be wonderful. It looks like AlephOne and Geo would also work well with something like this system. Perhaps we would want to add more CC messages for controlling volume and tone of each note after it's already started.
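(A quick worked example of the wavelength convention above: at a 44.1 kHz engine rate, a 440 Hz note arrives as samplesPerPeriod = 44100.0 / 440.0 ≈ 100.23.)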

Hans

Rob Fielding

unread,
Apr 20, 2012, 9:47:22 PM4/20/12
to open-music-app...@googlegroups.com
I'm doing something way more primitive internally. My entire synth
engine is driven by a single small function, which keeps the interface
dumb simple:

void rawEngine(int midiChannel, int doNoteAttack, float pitch,
               float volVal, int midiExprParm, int midiExpr)

If invoked with doNoteAttack set to 1, it expects the next note off to
be tied to the next note on. All that it does is ensure that the note
off's state is transferred to the next note on. The gestures that
generate the MIDI are just floating point numbers. They are like:

if (isMoving)
{
    Fretless_move(fretlessp, finger1, note, v, polyGroup1);
    Fretless_express(fretlessp, finger1, 11, expr);
}
else
{
    Fretless_beginDown(fretlessp, finger1);
    Fretless_express(fretlessp, finger1, 11, expr);
    Fretless_endDown(fretlessp, finger1, note, polyGroup1, v, legato);
}

note is a floating-point value. You have to get rid of the notion of
bends at the front of the interface because it screws up everything
else when you are trying to keep the UI code simple. The note+bend
pair is only generated when we translate the floating-point MIDI note.
Anyways, Fretless.* and DeMIDI.* are here:

http://www.github.com/rfielding/DSPCompiler
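The translation at the heart of it is basically this (a simplified sketch of the idea, not the actual Fretless source; it assumes a prior bend-range RPN has set bendRangeSemis on the channel):

#include <math.h>

/* Split a floating-point MIDI note into the nearest integer note number
   plus a 14-bit pitch bend word, given the synth's +/- bend range in
   semitones (e.g. 2.0 for the common default). 8192 is bend center. */
void floatNoteToNoteAndBend(float note, float bendRangeSemis,
                            int *noteNum, int *bend14)
{
    *noteNum = (int)lroundf(note);
    float semisAway = note - (float)*noteNum;  /* in [-0.5, 0.5] */
    *bend14 = 8192 + (int)lroundf((semisAway / bendRangeSemis) * 8191.0f);
}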

My motivation for making it open source is precisely that MIDI is
completely useless to me if all that happens is that users get a bad
result when they plug my controller into their synth. I have been
trying to fork the MIDI standard on iOS to handle exactly what we are
trying to do; I hadn't really taken seriously the idea of influencing
MIDI HD... but we can try that too ;-). This kind of messaging is
basically how the Haken Continuum works, by the way. People pay $5k
for those things.

--
http://rrr00bb.blogspot.com

Rob Fielding

unread,
Apr 20, 2012, 10:04:05 PM4/20/12
to open-music-app...@googlegroups.com
> The message I was thinking of I called Legato Mode, but I
> went looking for that and it doesn't exist. I found Legato
> Footswitch, but that has to be turned on and off, whereas
> you want an event that just sits between the two notes it
> will effect.

Not between them, of course, but before the on and after the off (two
notes in different channels), because the off part has to be
inaudible. It is assumed that if you send a note tie, then the off
note comes, maybe some bends and expression, then the on note. It's
all one atomic sequence that happens at the same time. But yeah, you
totally get the picture.

--
http://rrr00bb.blogspot.com

sup...@onereddog.com.au

unread,
Apr 21, 2012, 4:15:35 AM4/21/12
to open-music-app...@googlegroups.com
Hi

Hans, what you described there looks very similar to what I did with Arctic and the Pitch mode, allowing you to bend individual notes from the on-screen keyboard. Nice for adding vibrato per note. It's sort of an expanded polyphonic aftertouch. If the original MIDI spec had included one extra byte in the Pitch Bend message to include note number along with channel, this wouldn't be required.

The note-tie is different. Here's the way I implemented it to work with Rob's Geo and AlephOne; it's basically a state machine.

For regular notes and CCs:
- In the voice allocator when a note-on is received, save the channel, note number and note frequency - move to the voice attack state
- When the note-off is received, match the channel and note number to the active voice - move to the voice release state
- For all CC messages, match the voice to the channel number of the CC - adjust the voice accordingly

For Bends:
- For all Pitch Bend messages, save this value per channel - apply bend to any active voice(s) on this channel
- When a note-on occurs for a channel, we set the voice's pitch bend to the current channel pitch bend, i.e. the note-on frequency is adjusted according to the prior bend message.

Note Tie:
- After the 4 CCs that make up the NRPN are decoded, we simply set a flag to change the operation of the next note-on. Save the tied note number, channel and note frequency. The note number and channel are in the NRPN.

- If the note-tie flag is set and we receive a note-on, then find the voice that matches the note-tie channel and note number. This voice is then reassigned the channel, note number and MIDI note frequency of the new note-on. In other words, the note-on stays on and we move it to the new channel and adjust its frequency. Reset the note-tie flag.
This is the important step, because with Rob's system, rather than simply setting the note-on to the MIDI note number's frequency, we need to bump up the current bent note, which might now be bent beyond the original note plus the 14-bit MIDI bend range. Or it's fretless and the pitch is not on a typical chromatic frequency. This gives you continuous pitch bend, a la trombone or fretless guitar.

- If the note-tie flag is set and we receive a note-off, then simply ignore it

Finally, we have to ensure that the bend range of the synth matches the bend range of the controller. This can also be done by responding to the Bend Range RPN which AlephOne sends.
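In rough C, the note-tie branch looks something like this (a simplified sketch, not the actual Arctic source; names invented, and chanBendRatio is the frequency ratio implied by the new channel's current bend):

#include <math.h>

typedef struct {
    int inUse;
    int channel, noteNum;
    float freq;            /* current sounding frequency */
} Voice;

static int tiePending = 0;          /* set when the note-tie NRPN decodes */
static int tieChannel, tieNoteNum;  /* old channel/note named by the NRPN */

static float midiNoteFreq(int n)
{
    return 440.0f * powf(2.0f, (n - 69) / 12.0f);
}

void onNoteOff(Voice *v, int nVoices, int ch, int note)
{
    if (tiePending && ch == tieChannel && note == tieNoteNum)
        return;  /* tied note: keep sounding; the next note-on takes over */
    /* ... normal release path ... */
}

void onNoteOn(Voice *v, int nVoices, int ch, int note, float chanBendRatio)
{
    if (tiePending) {
        for (int i = 0; i < nVoices; i++) {
            if (v[i].inUse && v[i].channel == tieChannel
                           && v[i].noteNum == tieNoteNum) {
                v[i].channel = ch;          /* move the voice: no re-attack */
                v[i].noteNum = note;
                v[i].freq    = midiNoteFreq(note) * chanBendRatio;
                break;
            }
        }
        tiePending = 0;
        return;
    }
    /* ... normal attack path: allocate a voice, apply channel bend ... */
}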

cheers
peter

Rob.fielding

unread,
Apr 21, 2012, 9:20:36 AM4/21/12
to open-music-app...@googlegroups.com, open-music-app...@googlegroups.com
A lot of detail! The DSPCompiler hides almost all of it. On the front, you specify it in terms of fingers down and floating-point pitch. Channels and bends are hidden completely:

onbegin f
expression f ccA valA
expression f ccB valB
onend f pitch, polygroup, vol

move f pitch vol
expression f ccA valA

off f


So that is the lifecycle of a finger as it goes down and moves around. Note that the on is split in half so that CC values can be sent before the MIDI note on. When the note-on happens, bend is sent on that channel just before. If the bend width is exceeded, then note ties are sent to smooth it over automatically.

If there are multiple notes in a polygroup, which is like a string, the notes stack and get turned on and off for a solo mode. Pick attack is allowed on the first note in a polygroup, and a note tie is sent for all other transitions by default, but you can override per note to add pick attack on any note. We can't use any of MIDI's built-in facilities here because they don't anticipate the channel hopping. Anyway, the polygroups let you trill up or trill down, which is an extension of actual string behavior that fits in with playing mad-fast legato passages while chording at the same time.

Note that this legato style doesn't overlap MIDI notes. Just turn on full poly, mash it all down to 1 channel, and let MIDI's solo mode do that (it will be monophonic, but that is the price to pay).

Tom White (MMA)

unread,
Apr 23, 2012, 4:47:26 PM4/23/12
to open-music-app...@googlegroups.com
Rob,

I'd just like to make sure I fully understand your method
before I talk to MMA members about it. Whenever you have
some time to look at this... no rush.

- TW

===================================================================
You provided this example sequence:

> ch1 bend +5%
> ch1 on 33 127
> ...
> ch1 bend +50%
> ch2 notetie
> ch1 on 33 0
> ch2 bend 0%
> ch2 on 32 127

I'm trying to understand what the user is doing that would
generate that sequence. So I have prefaced each event with
what I think was happening ("[description]") but the events
don't seem to match my description... so would you please
explain?

[User placed finger on screen, position is 5% higher than
MIDI Note 33, so you send BendAmt + NoteNum as a pair:]

> ch1 bend +5%
> ch1 on 33 127

[User is moving the finger, you send more and more Pitch
Bend, until Pitch Bend reaches 50%.]

> ...
> ch1 bend +50%

[Now you need to switch to a different Note Number, so
you send a Notetie + OldNoteNum pair...]

> ch2 notetie
> ch1 on 33 0

--> I assume these two events go together, as a pair, so
the synth knows which Note is going to be transferred...
or is your system monophonic, in which case the Channel
numbers don't seem to really matter (as long as the new
one is different from the old one)?

--> Also, you are shutting down the old Note here, before
you've started the new one... is that what you meant to
do?

[...followed by a NewBendAmt + NewNoteNum pair:]

> ch2 bend 0%
> ch2 on 32 127

Wait... shouldn't that NewNoteNum have been 34, since you
were bending up??

> you can't do solo mode or legatoes in the same channel.

Yes, in MIDI there is a Mono Mode command which tells a
polyphonic synth to behave like a monophonic synth (on
one or more designated Channels) and play legato-style
anytime two Notes are played simultaneously. But that only
works for simultaneous Notes on the same Channel, which
doesn't help you because you have to move Notes between
Channels to extend the synth's bend range... I get that.

So, you made up an NRPN which you use to instruct a synth
whenever a new note is actually meant to be a continuation
of a previous note... but I'm still not clear what the
structure of the complete MIDI instruction is. Is it:

--------------------------------------------------------
NRPN (where ChNum = OldNoteNumChannel + 1)
OldNoteNum Off (where ChNum = OldNoteNumChannel)
BendAmt + NewNoteNum On (where ChNum = OldNoteNumChannel + 1)
--------------------------------------------------------

or is it:

-------------------------------------------------------------
NRPN (where ChNum = OldNoteNumChannel + {value})
OldNoteNum Off (where ChNum = OldNoteNumChannel)
BendAmt + NewNoteNum On (where ChNum = OldNoteNumChannel + {value})

where {value} is determined somehow (I don't know how)
-------------------------------------------------------------

I also assume you are sending the NRPN Number with Value=0?

==================================================================

Rob Fielding

unread,
Apr 23, 2012, 9:06:03 PM4/23/12
to open-music-app...@googlegroups.com
1) put one finger on the lowest note and start dragging it to the right
2) put another finger on the highest note and start dragging it to the
left (at the same time)
3) just keep dragging your fingers left and right around the screen
without picking them up
4) now imagine that our bend width is only set to +/-2 semitones
max(!), but the screen can go from a pitch of MIDI note 0 on the left
to MIDI note 127 on the right.

That's the scenario that has to work to be viable on a touch screen.
The details are at the link below, and I will keep adding detail as
much as possible. For now, just think of this really weird scenario,
which is what every 2-year-old does with Bebot when handed it with
scales turned off. Any note down can generate any arbitrary pitch
(depending on where you touched the screen and what the fretting rules
are). So, MIDI bend is set before the note is turned on. On a fretless
surface, just about every single note would start and end with a
slightly different value for pitch bend (and move around during its
lifetime).

http://rrr00bb.blogspot.com/2012/04/ideal-midi-compatible-protocol.html

--
http://rrr00bb.blogspot.com

Tom White (MMA)

unread,
Apr 25, 2012, 12:23:15 AM4/25/12
to open-music-app...@googlegroups.com
Hi Rob,

> 1) put one finger on the lowest note and start dragging it to
> the right
> 2) put another finger on the highest note and start dragging it to the
> left (at the same time)
> 3) just keep dragging your fingers around left and right the screen
> without picking them up
> 4) now imagine that our bend width is only set to +/- 2 semitones
> max(!), but the screen can go from pitch of midi 0 on the left to midi
> 127 on the right.

If +/- 2 semitones is the wrong bend width for your application, why
not just change the Bend range to whatever works?

You should just set the Bend range to 127 semitones; then any synth that
supports that range should work just fine. I don't know which synths
support such a huge range (certainly not wavetable-based ones) but if
they want to work with your app, that's all they'd have to do, and
you'd not have to do anything special at all, as far as I can tell.

> On a fretless surface,
> just about every single note would start and end with a slightly
> different value for pitch bend (and move around during its lifetime).

Setting the Bend range really wide and then assigning a different
Channel to each finger should work just fine. Sure, your way works,
too, but yours seems a lot harder, to me. So I'm wondering whether
there is something wrong with my approach, or if you just felt it was
better to take a different approach.

- TW

Rob Fielding

unread,
Apr 25, 2012, 12:47:28 AM4/25/12
to open-music-app...@googlegroups.com
There are multiple reasons:

1) No bend width actually works, because there are only 14 bits of
resolution. If you smear those bits across 48 semitones, then there
is a lot of frequency stepping. When doing microtonality, this pitch
stepping is much more problematic than usual. The difference between
two just whole tones (9/8 * 9/8) and a just third (6/5) is very
noticeable in chords. It's a Pythagorean-style major third versus a
just major third. You will hear obvious beating in the fifths and
thirds if it's off. If the seconds are off, then they aren't playable
as chords (just as with piano, where they are so out of tune that they
don't get used as proper chords).
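To put rough numbers on the stepping (my arithmetic; note also that plenty of synths honor only the 7-bit MSB of the bend value, which is where the audible stepping really bites):

#include <stdio.h>

/* Step size of one pitch-bend increment, in cents, for a given
   +/- bend range: full 14-bit resolution vs. MSB-only synths. */
int main(void)
{
    float range[] = { 2.0f, 12.0f, 24.0f, 48.0f };
    for (int i = 0; i < 4; i++) {
        float totalCents = 2.0f * range[i] * 100.0f;
        printf("+/-%2.0f semitones: %.3f cents/step (14-bit), "
               "%.1f cents/step (MSB only)\n",
               range[i], totalCents / 16384.0f, totalCents / 128.0f);
    }
    return 0;
}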

2) If you broadcast the same MIDI stream to a bunch of synths
simultaneously (which drove a lot of my design), you want the synths
to all do the most reasonable thing they can for the sake of backwards
compatibility. Otherwise, you have to generate a COMPLETELY different
MIDI rendition for every instrument that will hear it (more
configuration headaches and unnecessary setup). Say the controller is
doing a violin-like thing in the foreground and broadcasting the MIDI
to lesser synths. All these synths listening have slightly different
problems:
   a) pianos that can't bend will just play the note without bends,
which is what you want with a piano in most cases.
   b) a whole gaggle of crappy synths *can't* change bend width (and
are therefore stuck at 2), like GarageBand's electric guitars.
   c) a multitimbral synth will bend and get pitches right, and can do
independent bending, but will get note breaks at max bend width.
   d) the foreground synth understands note ties and just bends all
over the place without any breaks in the tone.

3) Anything that can happen on a touch screen must be representable,
or the protocol is just broken. If I slowly move my finger up from
MIDI note 0 a few cents per second to 127, or play an entire song
without ever picking my finger up, then that's a perfectly valid thing
to do. Doing this with many fingers on one instrument is perfectly
valid.

When doing backwards compatibility with older synths (like my Korg
Karma), I do exactly what you suggest, using +/- 12 semitones. When
using that backwards compatible synth, I am then missing the ability
to chord and legato at the same time (perfectly normal string
instrument behavior).

--
http://rrr00bb.blogspot.com

Rob Fielding

unread,
Apr 25, 2012, 12:52:37 AM4/25/12
to open-music-app...@googlegroups.com
err... (9/8 * 9/8) versus 5/4 :-) ... anyway, the point is:

1) I guess we have to allow changing bend width for backwards compatibility
2) but changing bend width is a horribly bad thing. The default
should have been 1 semitone, with note ties added instead. That way a
piano hearing the MIDI stream can do the chromatic rendition, while
fully capable voices do all of the exact pitches, etc.

--
http://rrr00bb.blogspot.com

Rob Fielding

unread,
Apr 25, 2012, 1:25:55 AM4/25/12
to open-music-app...@googlegroups.com
As a thought experiment... use MIDI note 0 as the only note :-) and
set bend width to +/-127 semitones (can you even represent that? :-)).
So that's 8192 steps for the full range: roughly 64 divisions per
semitone, or about 774 steps per octave.
(Note that it doesn't fall exactly on octaves or fifths, but it's a
reasonably high resolution.)

Eh... maybe. 665 tones per octave is for all intents and purposes a
perfect just circle of fifths...

The main issue is still the scenario where you want to create one MIDI
stream and broadcast it to many devices of varying quality. Some of
those will be stuck at +/-2 semitones, which is why I started with
note ties. (And if you try to make it anything else, then you get
into the issue of having to set up all of those synths, as full
polyphonic bending doesn't work by default.)

--
http://rrr00bb.blogspot.com

Rob Fielding

unread,
Apr 25, 2012, 1:35:55 AM4/25/12
to open-music-app...@googlegroups.com
Another thought experiment that says you still want note ties:

1) represent a finger simply bending up
2) but you know that multiple devices will hear the MIDI stream.
a) one is a synth that just plays exact pitches
b) the other is a piano

you render MIDI notes that are in key for the piano (ie: some half
steps, some whole steps) and use note ties to represent the exact
bends. the piano is playing a "sheet music" version of the bendy
stuff that it can't follow exactly.

--
http://rrr00bb.blogspot.com

Tom White (MMA)

unread,
Apr 25, 2012, 2:53:56 AM4/25/12
to open-music-app...@googlegroups.com
Hi Rob,

Thanks for continuing to be patient with me...

> 1) No bend width actually works because there is only 14 bits of
> resolution. If you smear those bits across 48 semitones, then there
> is a lot of frequency stepping.

But isn't that still less stepping than the controller generates? At
2048 pixels (on the new iPad), playing all 128 MIDI notes will only
leave you 16 physical steps between notes on the touchscreen, whereas
14 bits of Pitch Bend gives you 8192 bend values (in one direction),
which applied to 128 notes leaves you 64 steps between notes... 4 times
better. So I'm not seeing why you need more Pitch Bend resolution than
you can generate.

> When doing microtonality, this pitch stepping is much more problematic
> than usual. The difference between two just whole tones ( 9/8 * 9/8 )
> and a Just third ( 6/5 ) is very noticeable in chords. It's a
> Pythagorean style Major third versus a Just Major third. You will hear
> obvious beating in the fifths and thirds if it's off.

Ohhh. I see the problem now. You also want to create alternate tunings.
Pitch Bend isn't accurate enough to create alternate tunings.

MIDI does have Tuning messages though, and as long as you aren't trying
to change the instrument tuning on the fly, they're a relatively easy
solution to implement. What's wrong with that approach?

> 2) If you broadcast the same MIDI stream to a bunch of synths
> simultaneously (what drove a lot of my design), you want the synths to
> all do the most reasonable thing they can do for the sake of backwards
> compatibility. Otherwise, you have to generate a COMPLETELY different
> MIDI rendition for every instrument that will hear it (more
> configuration headaches and unnecessary setup).

I agree... ideally you would like acceptable degradation - the things
that can't play right at least play something acceptable...

> Say that the controller is doing a violin-like thing in the foreground
> and broadcasting the MIDI to lesser synths. So all these synths
> listening have slightly different problems:
> a) pianos that can't bend will just play the note without bends.
> What you want with a piano in most cases.

I would be surprised if there is any MIDI Piano that doesn't Pitch Bend,
so I'm not sure you can include them...



> b) a whole gaggle of crappy synths *cant* change bend width ( and
> are therefore stuck at 2) - like GarageBand's Electric guitars.

Okay, I see that you can include them with your method, so if that's
your objective, I understand going that way... but to be clear, what
I heard you say before is that your method was necessary because of
*MIDI's* lack of sufficient Pitch Bend, whereas this workaround is
for *Crappy Synths*...

> c) a multi-timbral that will bend and get pitches right, and can
> do independent bending, but will get note-breaks at max bend width

I'm sorry to keep coming up with more questions, but please explain
that to me... By "note breaks at max bend width" do you just mean it
doesn't support the full Bend Range you need?

> 3) Anything that can happen on a touch screen must be representable,
> or the protocol is just broken. If I slowly move my finger up from
> MIDI note 0 a few cents per second to 127, or play an entire song
> without ever picking my finger up, then that's a perfectly valid thing
> to do. Doing this with many fingers on one instrument is perfectly
> valid.

Okay... I agree with the theory... when touchscreens can play every
MIDI Note from 0 to 127 a few cents at a time, there won't be enough
Pitch Bend to render that, and we'll need a solution. We envision HD
Protocol for that situation, not the system you've created... which
is one reason why I want to better understand your system.

> When doing backwards compatibility with older synths (like my Korg
> Karma), I do exactly what you suggest, using +/- 12 semitones. When
> using that backwards compatible synth, I am then missing the ability
> to chord and legato at the same time (perfectly normal string
> instrument behavior).

Using Pitch Bend on a Poly Mode synth, you should be able to get both
legato (by bending) and chords (by putting each Note on its own MIDI
Channel). If you can only get +/- 12 semitones of Bend, then you can't
touch just anywhere and move just anywhere without also retriggering,
but I don't see any way to fix that: the Karma isn't going to support
your NoteTie event, either.

- TW

Tom White (MMA)

unread,
Apr 25, 2012, 3:15:03 AM4/25/12
to open-music-app...@googlegroups.com
> set bend width to +/-127 semitones. (can you even represent that? :-)

I didn't check... possibly not :-} ...

> So that's 8192 steps for the full range: roughly 64 divisions per
> semitone, or about 774 steps per octave.
> (Note that it doesn't fall exactly on octaves or fifths, but it's a
> reasonably high resolution).
>
> eh... maybe. 665 tones is for all intents and purposes a perfect
> Just circle of fifths...

Yeah, but I keep coming back to you not being able to generate more
than 2048 frequencies anyway, at least not with current devices...
So I think there's enough resolution for you in MIDI Pitch Bend, if
synths can support that.



> The main issue is still the scenario where you want to create 1 MIDI
> stream and broadcast it to many devices of varying quality. Some of
> those will be stuck at +/-2 semitones, which is why I started with
> note ties. (And if you try to make it anything else, then you get
> into the issue of having to get setup all of those synths, as full
> polypyhonic bending doesn't work by default)

Yes, if the requirement is to make it work with synths that only bend
+/- 2 semitones, then my proposal won't work. I might argue that the
requirement should be abandoned, however :-)...

- TW

Tom White (MMA)

unread,
Apr 25, 2012, 3:15:03 AM4/25/12
to open-music-app...@googlegroups.com
> changing bend width is a horribly bad thing. the default
> should have been 1 semitone and note-ties added instead.

Changing is best, if every synth does it. Since they don't, you can't
use every synth, so you don't like it. I understand completely.

Making every synth inflexibly identical would work, for you, but
then it might not work for someone else.

In MMA we try to only standardize what has to be, and let everyone
do everything else however they want, so there is always room for
new applications and products and perspectives. As a result, you
can't count on every synth being the same... but we think that is
better for everyone, in the long run.

> that way a
> piano hearing the MIDI stream can do the chromatic rendition, while
> fully capable voices do all of the exact pitches, etc.

That's an interesting thought... but as I said in the last email, I'm
not sure that will ever happen, as I think all MIDI Pianos still respond
to Pitch Bend (even though you might expect they wouldn't, since it's not
a natural control for Piano).

I think in HD Protocol, there are two pitch models, one for chromatic
instruments (a la MIDI) and one where you specify the frequency; and
the frequency message still contains MIDI Note Numbers, so if a device
doesn't do frequency (e.g. Piano) then it can play chromatically...
so I think HD has a good mechanism for doing what you want, although
the backwards compatibility part still needs to be solved (namely, what
MIDI message would correspond to the HD Frequency message?).

- TW


Tom White (MMA)

unread,
Apr 25, 2012, 3:25:07 AM4/25/12
to open-music-app...@googlegroups.com
> 1) represent a finger simply bending up
> 2) but you know that multiple devices will hear the MIDI stream.
> a) one is a synth that just plays exact pitches
> b) the other is a piano
>
> you render MIDI notes that are in key for the piano (ie: some half
> steps, some whole steps) and use note ties to represent the exact
> bends. the piano is playing a "sheet music" version of the bendy
> stuff that it can't follow exactly.

Except you are still using a Pitch Bend message (along with the NoteTie)
so unless the Piano is an Acoustic Piano (with MIDI) it probably will Bend.

But yes, if it is an Acoustic (like a Yamaha Disklavier), it will play
the closest note to the intended pitch, without bending, which may be
preferable to playing only one note all the time (my way). But I think
I'd like to hear that before I decide your way is actually better :-).

- TW

Rob Fielding

unread,
Apr 25, 2012, 10:50:49 AM4/25/12
to open-music-app...@googlegroups.com
Tom,

I think I see it clearly now. The first thing to note is that this
whole mess is created by trying to deal with various levels of
compliance, especially when sending 1 MIDI stream to multiple devices
at the same time. So:

1) Ideally, every voice is in its own channel. If this is not
possible, then multiple voices get jammed into the same channel, and
the pitch bend value applies to the last voice added to the channel.

2) Ideally, the bend width is high enough that we NEVER exceed the
maximum bend width. The note-tie mechanism exists only to handle the
exceptional condition of exceeding the bend width. That happens all
the time at +/-2 semitones, for instance; it rarely happens at +/-24
semitones (what a lot of synths support).

So, given that those two things are there to handle exceptional cases
caused by non-ideal synthesizers, if we just never violate rule 1 or
rule 2, then the implementation is simple. We just don't support more
than one voice per channel and just use giant bend widths on ideal
instruments, which means it's less backwards compatible. (I.e., note
ties only happen when we exceed bend width limits, so if you never
exceed them, note ties don't need to be implemented.)

A string instrument that never needs to exceed bend width can just
assign a fixed string per channel and use bends for all legato. A
hammered instrument in which every note potentially has its own bend
would still need to hop around many more channels (chording and
releasing voices). But this is all determined at the controller (not
the synth). The current complexity is caused by my attempt to
broadcast to all of these devices simultaneously and have them all
produce pitches that are as close to in-tune with each other as they
can be.
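For concreteness, here is what the ideal-case stream looks like under rules 1 and 2, in the notation of the earlier examples (a made-up trace, not output from my code): two fingers, each owning its own channel with a wide bend range, so no note ties are ever needed:

ch1 bend +0.2%
ch1 on 60 127        # finger 1 lands slightly off MIDI 60
ch2 bend -0.4%
ch2 on 67 127        # finger 2, its own channel
ch1 bend +30%        # both fingers slide freely; the wide range
ch2 bend -12%        # is never exceeded, so no note ties
ch1 on 60 0          # note off (velocity 0); channel 1 is free again
ch2 on 67 0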

3) If anything new is to go into MIDI, it must be some kind of
capability negotiation mechanism to automate setup. Ex:
   a) Positive ack that a bend width change will be understood
   b) Positive ack that we can use note+bend to get exact pitches
   c) If no answer, then we must be talking to something old, and
expect degraded service. You can start a negotiation at any time.
   d) Capabilities imply special sets of NRPNs and SysEx messages
that will be recognized
   e) On a tablet device, the synth is in the background, which makes
its controls physically inaccessible (unlike hardware). One of the
capabilities will need to be a way to offer the critical real-time
controls with their names, current settings, and rendering hints
(knob/slider/xy).
   f) Looking for devices and vendors doesn't scale.

Maybe a capability can be thought of as a user-created submodule of the standard:

1) "I understand" versus "I speak" is specified
2) A grammar of messages (a subprotocol) that will include NRPNs, SysEx, and CCs
3) Documented expectations for behavior in response to the protocol

So, if I ship an instrument that does little more than respond to
0x90, then I can respond that this is my only capability when I am
asked. That way I won't send anything that creates bizarre results in
the synth.
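A purely hypothetical sketch of what that could look like on the wire (nothing like this exists in the MIDI spec today; the names, bits, and bytes are all invented):

/* Hypothetical capability bits -- invented for illustration only */
enum {
    CAP_BEND_RANGE_RPN   = 1 << 0,  /* honors bend-range changes        */
    CAP_EXACT_PITCH      = 1 << 1,  /* note+bend yields an exact pitch  */
    CAP_NOTE_TIE         = 1 << 2,  /* understands the note-tie message */
    CAP_PER_NOTE_CHANNEL = 1 << 3,  /* copes with one voice per channel */
};

/* F0 <id> 'C' 'A' 'P' '?' F7          controller asks                  */
/* F0 <id> 'C' 'A' 'P' '=' <bits> F7   synth answers; silence means an  */
/*                                     old device: degrade gracefully   */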

As an example of my frustration while working on AlephOne: Geo can be
set up to work pretty well with my Korg Karma as a violin with 4
fretless strings. AlephOne is based on its MIDI. Once I started
putting in channel pressure to update volume in response to
finger-area changes on the surface of the tablet, I started getting
bizarre results. AlephOne is wonderful against ThumbJam, and I have
total control over my internal synth.

The inconsistency everywhere drives me nuts. If you think about the
fact that every iOS instrument writes both a controller and a synth,
then take it as a signal that something is DEEPLY wrong. If MIDI
worked really well for this scenario, then most people would just
write a controller or a synth and not waste time duplicating effort.
But we end up duplicating everything (an obscenely expensive effort in
most cases) to avoid setup and inconsistency hassles.

I think the "up to the implementer" design is the main cause of this
headache. We get that instead of well-defined semantics (knowing what
frequency will be emitted in response to a note+bend pair), a
well-defined path to degraded functionality (i.e., capabilities), and
user-defined extension negotiation. In programming languages,
everything is well defined so that it is reliable, and things are
combined to produce results that nobody envisioned. We don't, say,
define the semantics of the language very loosely so that all the
different implementations can provide a variety of results.

--
http://rrr00bb.blogspot.com

Tom White (MMA)

unread,
May 7, 2012, 6:24:06 PM5/7/12
to open-music-app...@googlegroups.com
FYI...
 
I mentioned to Apple's MMA representative the idea of MMA
assigning a SysEx ID for iOS apps to use to query features
and was told that would probably not be Apple's preferred
approach, and that you all should "have a look at this year's
WWDC".
 
- TW
 


From: open-music-app...@googlegroups.com [mailto:open-music-app...@googlegroups.com] On Behalf Of Tom White (MMA)
Sent: Thursday, April 19, 2012 10:35 AM

Rolf,

> I just did two posts about the idea of sharing an OMAC ID.
> Only one little remark here: Asking Apple to reuse their ID
> is not working, because Apple does not respond to anything
> we ask.
I understand, but Apple is an MMA member, and they respond to
us... and since this is their technology (not MIDI, but CoreMIDI and
iOS) we think they should have the opportunity to consider how they
want this accomplished, if they even care... keep in mind that since
they control the platform they also could someday decide they do
not want this to work and so create problems, so we need to make
sure they are on board, for your own benefit <g>...
> Sure, this does not mean that we have trade
> secrets. But this group is not the place to post them ;-)

> From this fact, we may see upcoming things as proto-standards
> under the umbrella of an OMAC SysEx ID. However, there is no
> reason why these could not be a base for a new formal MMA
> standard.
MMA does not need to create every standard internally or in secret.
We can also adopt something created outside.
 
We work with numerous organizations to help them decide how to
use MIDI, and sometimes they publish the documents and other
times we do. Consider RTP-MIDI, for example.
 
We are primarily interested in making sure MIDI is done right. You
might be surprised how often people think they know what they are
doing with MIDI, and don't really...
 
But we are also interested in making sure that the right solution is
able to be promoted fully... OMAC has been very successful, but
because it is an ad hoc group, it might not always exist (and, for
example, Apple does not participate) so certainly we think it may
be good for MMA to formally adopt whatever OMAC comes up with.
 
- TW

nlogmusic (TempoRubato - NLogSynth)

unread,
May 7, 2012, 8:02:23 PM5/7/12
to open-music-app...@googlegroups.com
Yeah, that's what I expected. They won't share anything.

But the hint about WWDC is interesting, although I was unable
to get a ticket. If anyone who attends can post something
here, that would be great (unless they put all WWDC
attendees under NDA ;-)


 

Rob Fielding

unread,
May 7, 2012, 9:12:19 PM5/7/12
to open-music-app...@googlegroups.com
If it doesn't run over the MIDI byte stream... (it is just a byte
pipe, and the hardware is on the other end of it)...

/me scratches his head
--
http://rrr00bb.blogspot.com

Aaron Pride

unread,
May 7, 2012, 10:39:00 PM5/7/12
to open-music-app...@googlegroups.com

Hoping for the capability to host AUs and wrap apps as AUs in iOS 6!
