What did I expect to happen with "<key code="65534" output="◊"/>"?
In the Chrysalis software for the keyboard, I configured a key, the "B" key in the 3rd layer, to send a "Custom key code" of 65534, or at least that's what I think I've done. (From what I can tell, the Atreus keyboard is essentially an Arduino computer that can be configured to do just about anything.)
Then I manually added that line to a keylayout file.
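For context, here is a sketch of what a key entry looks like inside a Ukelele-generated keylayout file (the surrounding `keyboard` and `modifierMap` elements are omitted). As far as I understand, the `code` attribute is a macOS virtual key code in the 0–127 range, which is why I suspect 65534 is outside what the format expects:

```xml
<!-- Hypothetical keyMap fragment; not the full keylayout file. -->
<keyMap index="0">
    <!-- macOS virtual key code 11 is the physical "B" key on ANSI layouts -->
    <key code="11" output="◊"/>
    <!-- the entry I actually tried; 65534 falls outside the usual 0-127 range -->
    <key code="65534" output="◊"/>
</keyMap>
```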
As far as I can tell, that freaked out System Preferences and nearly crashed it.
I would have been surprised if it worked, but I was hoping I would get a "◊" when I pressed the "B" key after activating the 3rd layer on the keyboard.
[Aside: I have my keyboard configured to activate the 3rd layer by holding the Fn key, pressing the Esc key, then hitting the Fn key again. Hitting the Esc key returns the keyboard to the base (0) layer. I'm sure this works, though there is probably a much better way to set this sort of thing up.]
The real question?
What I'm ultimately hoping to accomplish is to enter glyphs for the APL and BQN programming languages quickly and efficiently from my Atreus keyboard. The BQN project provides a keylayout file that looks like it was generated by Ukelele, which is how I found out about Ukelele and keylayout files. I think the BQN-provided file is intended for a traditional keyboard used in a GUI editor. I mostly work in a terminal and disable Apple's fancy keyboard features in order to have access to things like the Meta (Alt) key for Emacs and readline.
As a little test, I set the layer-3 keys A, S, and D to X, Y, and Z, and B to "Custom key code" 65534. Going by Ukelele's key code info, A, S, and D worked as expected, producing X, Y, and Z, but B didn't register anything (the "Key Code" utility reported the same results). I also tried setting B to send 511, but that behaved the same as 65534. Maybe that is because I don't have anything entered in a keylayout file?