Hello,
I'm curious whether there's a way to use ONYX-VJ in a multi-screen,
multi-server environment. I currently have a few server/projector nodes
that I'd like to control via a remote interface. I'm wondering whether
an interface to ONYX-VJ could, in essence, split the video processing
and the UI/UX into client/server pairs, much the same way the MIDI
interface works today, but at the network level and in a more
Flash-native format (netcat? XMLSocket?). From poking around the
googlecode repository, it seems like most of the UI controller messages
could be detached, routed over the network, and reattached to another
AIR instance of ONYX-VJ?
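To sketch what I mean (and this is only a rough sketch on my end: the
class names, host/port, and XML message format below are all made up,
not anything that exists in Onyx today), the controller side could push
UI events over an XMLSocket, and an AIR playback node could listen with
a ServerSocket and re-apply the messages locally:

// ---- controller (UI) side: forward UI events over an XMLSocket ----
// OnyxRemoteSender, the host/port, and the <ui/> message are invented.
package {
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.net.XMLSocket;

    public class OnyxRemoteSender extends Sprite {
        private var sock:XMLSocket = new XMLSocket();

        public function OnyxRemoteSender() {
            sock.addEventListener(Event.CONNECT, onConnect);
            sock.connect("192.168.1.20", 7000); // playback node's address
        }

        private function onConnect(e:Event):void {
            // e.g. a knob moved in the UI, sent as a small XML message
            sock.send('<ui target="layer1" property="alpha" value="0.8"/>');
        }
    }
}

// ---- playback node side (AIR): receive messages, re-dispatch locally ----
package {
    import flash.display.Sprite;
    import flash.events.ProgressEvent;
    import flash.events.ServerSocketConnectEvent;
    import flash.net.ServerSocket;
    import flash.net.Socket;

    public class OnyxRemoteReceiver extends Sprite {
        private var server:ServerSocket = new ServerSocket();
        private var clients:Array = []; // keep references to connected sockets

        public function OnyxRemoteReceiver() {
            server.addEventListener(ServerSocketConnectEvent.CONNECT, onClient);
            server.bind(7000);
            server.listen();
        }

        private function onClient(e:ServerSocketConnectEvent):void {
            clients.push(e.socket);
            e.socket.addEventListener(ProgressEvent.SOCKET_DATA, onData);
        }

        private function onData(e:ProgressEvent):void {
            var client:Socket = Socket(e.target);
            var raw:String = client.readUTFBytes(client.bytesAvailable);
            // XMLSocket frames each message with a trailing null byte
            for each (var frame:String in raw.split("\u0000")) {
                if (frame.length == 0) continue;
                var msg:XML = new XML(frame);
                // here the local ONYX-VJ instance would apply the change,
                // e.g. set layer1.alpha = 0.8
                trace("apply", msg.@property, "=", msg.@value, "on", msg.@target);
            }
        }
    }
}

That's just to show the shape of it; presumably the real work is in
mapping whatever event/message types the Onyx UI controllers already
use onto something like this.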
I've also considered and explored the possibility of running a Red5
server and sending streams (HTTP, RTP) from ONYX-VJ, but that
introduces too many impediments to be a viable option; I'd really like
the content to be played back from a local/server source.
P.S. Re: "but you can support future versions of Onyx by donating
and/or let us know your feature requests. you have all been so 'stingy'
that onyx-vj developers wont get involved in bringing you anything in
the future, so bye bye, get yourself prepared to buy VJ software in the
future"
Please don't delete your code.google.com project! And please set up a
PayPal donate link on onyx-vj.com!
--
edwardsharp.net