Ableton Live Lite Getting Started

Fidelia Boldul

Aug 3, 2024, 1:26:59 PM

In my opinion, all the theory in the world is useless without being able to HEAR the results. So my suggestion is to start your reverb development path by setting up one or more development environments. Ideally, you want a place where you can hear your work in near-real-time, and run the audio of your choice through your algorithms.

Great! My recommendation is to download the JUCE SDK. JUCE is the framework used by many plugin developers to create their plugins. It handles both the audio and visual parts of a plugin, has build targets for all of the popular plugin formats and platforms (AU, VST2, VST3, AAX; Mac, Windows, Linux, iOS, Android), and is free and open-source to get started with.

Once you have a plugin you want to release commercially, JUCE offers several paid licensing options for closed-source plugins. I would highly recommend installing the JUCE SDK, compiling the example plugins, and modifying them as the starting point for your own.

My suggestion is to work with a computer music language/environment, so you can start learning the fundamentals of digital signal processing and algorithms without having to understand low-level code. When I started programming reverbs in 1998, I was using Csound. It was an old-school language even then, but it had all the building blocks I needed to make reverbs.

Today, you have many real-time options to experiment with these fundamental computer music building blocks. Most modern music DSP environments use a visual environment to patch signal processing modules together in a similar way to an analog modular synthesizer.

Getting up and running in a visual DSP language can be much faster than in a text-based one. It is much quicker to prototype a simple reverb in Max/MSP or Max for Live than to build an entire DSP and GUI code base in C++.
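To ground the vocabulary, here is a minimal sketch in plain Python (not tied to any of the environments above) of a feedback comb filter, the classic Schroeder reverb building block: a delay line whose attenuated output is fed back into itself.

```python
# Minimal feedback comb filter, the core unit of a Schroeder reverb:
# y[n] = x[n] + feedback * y[n - delay]

def comb_filter(samples, delay, feedback):
    buf = [0.0] * delay               # circular delay line
    out = []
    idx = 0
    for x in samples:
        y = x + feedback * buf[idx]   # add the delayed, attenuated signal
        buf[idx] = y                  # feed the result back into the line
        idx = (idx + 1) % delay
        out.append(y)
    return out

# An impulse turns into an exponentially decaying train of echoes:
impulse = [1.0] + [0.0] * 8
echoes = comb_filter(impulse, delay=4, feedback=0.5)
# echoes: 1.0 now, 0.5 four samples later, 0.25 four after that...
```

A usable reverb runs several combs with mutually prime delay lengths in parallel, followed by a couple of allpass stages in series; the environments above just make patching those together fast.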

This is incredible. I had been thinking about this topic a lot, and downloaded the VST SDK years ago, but was scared away by the sheer complexity of the VST 3 standard. iPlug2 or JUCE were in the pot, and the DSP side was the scariest part. Thanks a lot for sharing your knowledge!

I appreciate your comment on the pile of tangled necklaces. I have hundreds of hours into a Pd implementation of something like the Echoplex Digital Pro under my belt, and I can barely understand it, let alone explain it to someone else, after letting it sit for a few years. That said, being able to route sound through it and test ideas in near real time gave me a comfortable development environment where experimentation drove the development process.

I heard you on a couple of YouTube videos and podcasts; your enthusiasm for your work is incredibly contagious.
These blogs are awesome, and I hope you keep doing things like this. I come to them as a long-time effects enthusiast who has become more and more interested in DSP. I have recently read both volumes of Musimathics and am trying to learn DSP without any prior programming experience. Your materials, reference books, and ideas on how to approach learning are super valuable. Thanks!

TDAbleton is a tool for linking TouchDesigner tightly with Ableton Live. It offers access to almost everything going on in an Ableton set, both for viewing and setting. The TDAbleton system contains a number of components for two-way communication, and a framework for building custom components and new features.

TDAbleton operates through Ableton's MIDI Remote Scripts system and, when necessary, Max for Live (M4L) devices. Communication with TouchDesigner is via OSC (using UDP). It is fully network-capable, so TouchDesigner can be running on a separate machine from Ableton. The Python in TDAbleton extensively uses Ableton's Live Object Model. Much of the groundwork for this Python interface is based on research done by Julien Bayle's Structure Void.
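Since the transport is plain OSC over UDP, it can be inspected or emulated with a few lines of standard-library Python. The sketch below hand-encodes a minimal OSC message (address pattern, type tag string, one float32 argument); the `/song/tempo` address and port 9000 are made up for illustration and are not TDAbleton's actual addressing.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad to the 4-byte boundary OSC requires (at least one NUL)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message with a single float32 argument."""
    return (osc_pad(address.encode())    # address pattern
            + osc_pad(b",f")             # type tag string: one float
            + struct.pack(">f", value))  # big-endian float32

# A hypothetical message; real TDAbleton addresses and ports differ.
packet = osc_message("/song/tempo", 120.0)

# Sending it is a single UDP datagram, e.g.:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("127.0.0.1", 9000))
```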

This sets up Ableton Live Remote Scripts and User Folders for standard installs of Ableton Live. If you have customized your Live install locations, you may have to set up the MIDI Remote Script Folder and Preferences Folder parameters yourself. Do this on the tdAbleton master component or on the tdAbletonPackage (leave the parameters bound). Instructions for finding the correct locations can be found here for the remote script folder and here for the preference folder. If there are problems with the install, you should see popup dialogs that will give you instructions and a "Folder" button to open the necessary folders for manual copying/deleting.

These are instructions for updating a previously installed TDAbleton. You will have to make changes to both Ableton Live and your TouchDesigner project. All TDAbleton components have a version number on their TDAbleton parameter page which you can check against the version numbers in the palette package. The latest update is always available in the TouchDesigner palette.

When you install a new version of TDAbleton, you may have to replace TDA Max devices stored locally in your Live Set. This happens when you use the Collect All and Save feature in Live, because it creates local copies of the devices. To make sure you have the proper versions of all TDAbleton Max for Live devices, do the following in your Live Set:

The easiest way to learn the basics of TDAbleton is to explore the provided demo. To get started, run the TDAbletonDemo.toe file in the /Samples/TDAbleton/ folder (choose Browse Samples from TouchDesigner Help menu). In the same folder you'll find the TDADemo Live Set for Ableton Live. Inside that folder, open the TDADemo Set.als Ableton Set. Press play in Ableton Live and you should immediately see CHOP data moving in the TouchDesigner demo TDAbleton Components. If you don't, be sure you have properly set up Ableton Live (see Getting Started).

On the Master track of the TDADemo Set, you'll find the TDA Master Max Device. This device shows whether Live is connected to TouchDesigner and lets you apply some master settings for your Live Set. TouchDesigner has no way to access the file name loaded into Live, so the Song ID is provided as a numeric field you can use to identify your song. The name of your TDA Master device will be used as a text song name readable by TouchDesigner. The Ableton Port is the network port used by the TDAbleton remote script (running in Live) to receive messages from TouchDesigner. The Max In Port is the network port used by TDA Max devices to receive messages from TouchDesigner.

You will notice that the data shown in the viewer reflects the current state of the Ableton Live Set. You can see a number of channels reflecting time data (e.g. song/info/beats) and a few others reflecting various states (e.g. song/info/play). Data coming in from Live is generally CHOP data, and can be accessed by wiring from the TDAbleton Component, as has been done with the nullSong CHOP. Look in the parameters of the circle1 SOP to see one way of using this incoming data. Some data from Live comes in DAT format, and can be accessed similarly from outputs.

In certain cases, incoming data may also come via callback. Examples of this can be found in the abletonSong1_callbacks DAT. Python callbacks are beyond the scope of this guide, but if you know a little Python, looking in this DAT will reveal the examples which set the locatorByCallback and sceneByCallback textTOPs.
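The callback pattern itself is ordinary Python. The real thing only runs inside TouchDesigner, but a toy dispatcher shows the shape: the component fires a named callback when Live reports a change, and your function reacts. All names below are invented for illustration; the actual callback names and arguments are shown in the callbacks DAT.

```python
# Toy model of callback-driven updates (names are illustrative only;
# the real TDAbleton callbacks are documented in the callbacks DAT).

class ToyComponent:
    def __init__(self):
        self.callbacks = {}

    def register(self, name, func):
        self.callbacks[name] = func

    def fire(self, name, value):
        """Called by the component when Live reports a change."""
        if name in self.callbacks:
            self.callbacks[name](value)

latest = {}

def onLocator(value):
    # In TouchDesigner this might set a Text TOP's parameter;
    # here we just record the value.
    latest["locator"] = value

song = ToyComponent()
song.register("onLocator", onLocator)
song.fire("onLocator", "Chorus")   # simulate Live hitting a locator
```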

Data to be sent out to Ableton is usually sent via parameters. If you go to the Ableton Song parameter page of abletonSong1 you will see the Play, Loop, and Tempo parameters which set the corresponding values in the Live Set. The abletonChain1 component shows examples of using CHOP exports to automatically change an outgoing parameter and thereby change a value in Live.

These parameter values are kept up to date with incoming Live data only if the Auto Sync Pars To CHOP toggle is on. This option is provided because in certain cases Auto Sync can cause echoing changes between TouchDesigner and Ableton.
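The echo problem is easy to reproduce in miniature. This toy sketch (not TDAbleton code) models naive two-way sync: a single change on one side triggers a redundant write back from the other side, which is why suppressing the echo, or turning Auto Sync off, matters.

```python
# Toy model of two synced parameters that echo each other's writes.
messages = []  # log of sync messages sent across the "network"

def sync(src, dst, value, depth=0, max_depth=5):
    """Naive two-way sync: every write is propagated back."""
    if dst["value"] == value or depth >= max_depth:
        return                      # stop once both sides agree
    dst["value"] = value
    messages.append((src["name"], dst["name"], value))
    sync(dst, src, value, depth + 1, max_depth)

td = {"name": "td", "value": 100}
live = {"name": "live", "value": 100}
sync(td, live, 120)   # one real change produces an extra echo write
```

Here a single tempo-style change generates two messages: the real one and an echo back to the originator. With equality checks the loop terminates; without them it would bounce forever.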

The Ableton Live Object Model (or LOM) is an interface to all the aspects of an Ableton Live Set, including Tracks, Devices, Parameters, Scenes, etc. For detailed information about the Live Object Model, see:

In the previous section, abletonSong1 shows data for the entire Live Set. Most TDAbleton Components are made for observing particular parts of a Set. For example, abletonParameter is used to get and set the value of a single Ableton Device Parameter. As you can see in the Component's parameters, this one is set to work with Track: 1 Muugy, Device: Pitch, and Parameter: Pitch. In Ableton Live, navigate to that device and you will see that its Pitch value is being mirrored in TouchDesigner.

TDAbleton uses menu parameters to navigate the Live Object Model. For example, all available Tracks, including Returns and the Master, will be shown in the Track parameter. Once you have selected a Track, its available Devices will be shown in the Device parameter, and so on down. To see other examples of this, take a look at the abletonTrack1 Component, which observes a single Track, and the abletonChainParameter1 Component, which gives access to Ableton Device Parameters within sub-chains such as those in an Instrument Rack.
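Conceptually, those cascading menus walk a tree: each selection narrows the options for the next level. A stripped-down sketch, with a nested dict standing in for the Live Object Model (the track and device names mirror the demo set but are otherwise illustrative):

```python
# A toy stand-in for the Live Object Model: each menu level is
# the set of keys available under the previous selection.
lom = {
    "1 Muugy": {          # Track (name from the demo set)
        "Pitch": {        # Device
            "Pitch": 0,   # Parameter and its current value
        },
    },
}

def menu_options(model, *path):
    """Options for the next menu, given the selections so far."""
    node = model
    for choice in path:
        node = node[choice]
    return sorted(node) if isinstance(node, dict) else node

tracks = menu_options(lom)                      # top-level Track menu
devices = menu_options(lom, "1 Muugy")          # Devices on that Track
value = menu_options(lom, "1 Muugy", "Pitch", "Pitch")  # leaf value
```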

Note: You can change the Pitch value in Ableton by changing the Value Send parameter on abletonParameter1, but notice that this stops Ableton's automation of that parameter. This is another reason why Auto Sync is not always desirable.

The abletonMIDI1 Component is unique in that it requires a Max for Live device in order to get MIDI data out of Ableton Live. Each abletonMIDI component is connected to a specific TDA MIDI device in Live. If you look on the 1 Muugy Track in the Ableton Set, you will see the TDA MIDI device.

TDA MIDI devices in your Live set should be created from TouchDesigner by using the Add TDA MIDI Device pulse parameter on an abletonMIDI Component. Just select the Track to put it on and press that button.
