I am currently transcribing some audio files with Onsets and Frames. I noticed that the JavaScript app and the Colab notebook produce different MIDI results for the same input. Do they use different checkpoints, or is the difference due to some parameters?
--
Magenta project: magenta.tensorflow.org
To post to this group, send email to magenta...@tensorflow.org
To unsubscribe from this group, send email to magenta-discu...@tensorflow.org