Hello Magenta team and community! I'm introducing Clip Evolver, a new Max for Live device built on Magenta.js's MusicRNN and MusicVAE models. It's a performance tool that regularly updates a looping MIDI clip by interpolating between continued sequences, and it features a pitch/octave filter and velocity/probability sliders. You can download it here:
https://github.com/FlexCouncil/Clip-Evolver

Would it be possible to integrate the trained models within the .amxd file itself, eliminating the need for an Internet connection during performance? Sometimes I get a “socket hang up” error when trying to fetch the checkpoint. I downloaded the models to my local machine, but I couldn’t figure out how to call them from the code. Also, is there a JavaScript implementation of Music Transformer? It might give better results than MusicRNN + MusicVAE.
To run the device, drag the Clip-Evolver.amxd file into your Live track and click the “Power” button. If the device hasn’t started working after about a minute, click the “Restart” button. One caveat: unfreezing the device in the Max editor breaks the reference to the Node script, but you can still edit the unfrozen version:

https://github.com/FlexCouncil/Clip-Evolver/blob/main/Clip-Evolver-Unrozen.zip
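For anyone curious what “interpolating between continued sequences” means in practice, the core evolve step is roughly the following (a simplified sketch, not the device's actual code; the function and variable names are illustrative, and it assumes already-initialized MusicRNN and MusicVAE instances from @magenta/music):

```javascript
// Simplified sketch of one evolve step: continue the looping clip with
// MusicRNN, then interpolate between the clip and its continuation with
// MusicVAE, taking the midpoint as the next version of the loop.
// `rnn` and `vae` are assumed to be initialized model instances.
async function evolveClip(rnn, vae, clip, steps) {
  // Continue the quantized clip by `steps` steps at temperature 1.0.
  const continuation = await rnn.continueSequence(clip, steps, 1.0);
  // Produce 5 sequences morphing from the original clip to its continuation.
  const morphs = await vae.interpolate([clip, continuation], 5);
  return morphs[2]; // the midpoint becomes the new loop
}
```

Running this on a timer is what makes the clip drift gradually rather than jumping straight to the RNN's continuation.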
I’ve also been working on an update to the 48kHz/stereo version of DDSP. Sorry this update took so long—it took a while to get the details right. Here’s the gin file:
https://github.com/FlexCouncil/gin/blob/main/solo_instrument_noz.gin

And the Colab notebook:
https://colab.research.google.com/github/FlexCouncil/DDSP-48kHz-Stereo/blob/master/ddsp/colab/ddsp_48kHz_stereo1.ipynb

Anyway, I can’t wait for the DDSP plug-in. Thanks for all the great code!
Josh