| Overview | Usage | Platform | Installation | Setup | References | Contribute | Acknowledgements | Licence |
PLEASE READ THE USAGE CAREFULLY, INCLUDING PLATFORM SUPPORT AND THE LICENCE, BEFORE DOWNLOADING, INSTALLING OR USING THIS TOOL!
The spatial performance tool Live 4 Life aims to simplify the real-time creation and control of masses of spatialised sound objects on various kinds of loudspeaker configurations (particularly stereo, quadraphonic or octophonic setups, as well as domes of 16, 24 or 32 loudspeakers...) with several controllers (currently, you can control it with the GUI and the keyboard, and you can also add an Akai APC Mini, 2 tablets running the Lemur app, up to 3 MIDI Fighter Twisters and a Sensel Morph).
I have been developing it in SuperCollider since 2011, "to play the place and the music at the same time". Although I hope to keep developing the tool with other spatial controls and algorithms for the rest of my life, rhythm and synthesis are now prioritised over spatial development, which is somewhat paused for several reasons (mainly the pandemic and the difficulty of finding residencies or performing spatial improvisations in concert halls or festivals for an appropriate fee). I am also starting a new setup based on controlling live coding, which can accompany this project (to be released soon, hopefully).
What makes it different from other spatialisation tools?
- It is not only a spatialisation tool, but a whole sound creation system for playing with sequences of parameters (rhythm, sound and space), integrating spatialisation at the heart of each sound synthesis process and each loudspeaker.
- Unlike most tools, which offer input- or track-based spatialisation, it includes layer- and event-based spatialisation (see section 1 of the Wiki page Spatial Objectives), where sequences of spatialised sound particles or choruses (copies or echoes of the same event with micro-delays or spectral/spatial variations) meet a multichannel effect system.
- It includes a library of different abstract and concrete spatialisation techniques and rendering algorithms (see section 1.2 of the ICMC 2018 paper or section 3 of the Wiki page Spatial Objectives for definitions, and figure 4 of the ICMC 2021 paper or the Wiki page Spatial Modules for details of the spatial structure), mixing channel- and object-based paradigms on every sound event in order to take advantage of the strengths of each approach. As a consequence, sequences that rely on channel-based techniques or on the multichannel effect system differ according to the number of loudspeakers available and cannot be reproduced in the same way on different loudspeaker configurations.
- It is primarily conceived for spatial performance, with several global, high-level (indirect) control strategies, e.g. switching among scenes of spatialised events or changing masses of parameters, particularly playback speeds, with different mappings for each controller (see section 3 of the 2021 article in Organised Sound).
The performance tool in context with all its controllers in 2021
One of the views of the GUI to choose among dozens of sequences and global parameters
Another view of the GUI to compose sequences of parameters of spatialised sound events
Please note that:
- ⚠️ Although the code is available here, the interface and the setup are relatively complex, since this tool includes a lot of extensions. It is not “Plug and Play” and is not meant to be a simple Graphical User Interface (GUI) for a casual, untrained SuperCollider user; it is rather focused on allowing the creation of a lot of combinations tailored to my creative dreams of mapping and improvising sound with the space of the speakers.
- Since I practically began learning SuperCollider with this experimental project and I am not a professional developer, the code is relatively raw, with lots of old comments, and I have developed my own coding strategies over time, which some great SuperCollider developers would not recommend and which might be old, bad or unoptimised. Even though there may be some bugs or errors, particularly during the setup process (sometimes requiring a reboot), the tool generally works well for me with my workflow during performances. But I cannot guarantee it will work for you the way you want.
- Drastically changing effect parameters can produce very loud sounds, so monitor the volume and change sliders and parameters slowly if you do not know the effect or the kind of parameter you are changing.
- Currently, the code can be incorrectly highlighted on GitHub due to a bug, but it displays correctly in SuperCollider.
Live 4 Life can run on all platforms, but is optimised for macOS. It may require a powerful, fast computer to boot correctly, since this tool tests the limits of the machine with SuperCollider. I have noticed that loading sound buffers is very slow on Windows and Linux, requiring more patience. (For this reason, I have prepared a lightweight folder of sound files that you can download at the end of this section.)
It has mainly been tested with macOS 10.14.6 Mojave on a MacBook Pro 15" and with macOS 12.6 Monterey on a MacBook Pro M1 16". It was originally designed for the specific screen size of the 15" Mac (1920×1200), but the GUI automatically adapts or scales to any screen size. It is recommended to select the biggest screen resolution you can, e.g. 2056×1329 on a 16" Mac screen. Shortcuts are currently conceived for an AZERTY keyboard (Français - Numérique), but other keyboard options will be added in the future.
On Linux, it plays, although I have noticed some graphic glitches. The reason I do not switch from Mac to Linux is that I often use Dante to send multiple channels via Ethernet in some concert halls. Since Dante virtual sound cards are not available for Linux, you need to buy specific, expensive sound cards to use Dante.
On Windows, you may have to specify your audio device in the file _0B_Init_Config.scd. To obtain a list of available input and output audio devices, evaluate both lines of code:
ServerOptions.inDevices.collect({|m| m.postln});
ServerOptions.outDevices.collect({|m| m.postln});
Use an appropriate driver like ASIO, ReaRoute, DirectSound or WASAPI, but not MME (see this post on the SuperCollider forum scsynth.org for more details).
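The project reads its audio configuration from _0B_Init_Config.scd; as a general, hedged SuperCollider illustration (not code taken from that file, and with a hypothetical device name that must be replaced by one of the names printed by the two lines above), selecting an ASIO device for the default server looks like this:
// Hypothetical example: point the default server to an ASIO device before booting.
Server.default.options.device = "ASIO : Your Audio Interface"; // replace with a name from ServerOptions.outDevices
Server.default.options.sampleRate = 48000;                     // optional: match your interface's sample rate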
Follow the steps below one after another:
1. Download and install SuperCollider 3.13 or above. (For beginners in SuperCollider: to evaluate a block of code enclosed in parentheses, particularly in the setup process, place the cursor inside the parentheses and press Ctrl (on Windows) / Cmd (on Mac) + Enter; for more details, see the Language menu. To evaluate a single line of code, place the cursor on that line and press either Ctrl (on Windows) / Cmd (on Mac) + Enter or, more simply, Shift + Enter. A short example of evaluating such a block is given after this list.)
2. Download sc3-plugins and put them in your SuperCollider Extensions folder. (Go to the menu File -> Open user support directory. Create a folder named Extensions, if it is not already there, and put your plugins folder into it. Rename the folder sc3-plugins to plugins.)
3. Download the latest release of this project to begin the setup of the tool in the following steps.
4. Install the Quarks listed below, which are extensions of the SuperCollider language, with one of the two methods below (if you are a new SuperCollider user, I recommend the second method, which is quicker):
- Slow, clean install: either download all the Quarks and put them in the user support directory, or install git (which reportedly does not work well on Windows, according to this post on the SuperCollider forum). Please check the procedure on the SuperCollider Quarks webpage. Then go to the menu Language -> Quarks, click on the button Check for updates and select each of the Quarks below by ticking its checkbox. (A code-based sketch of this method is given after this list.)
- Quick, raw install: if you do not succeed in installing (or do not want to install) git and the Quarks mentioned below the normal way above, especially on Windows, or if you simply want to install all the necessary Quark files quickly, unzip the file L4L_ExtensionsQuarks.zip within the project folder you have just downloaded, and then put the unzipped folder L4L_ExtensionsQuarks into your Extensions folder.
- adclib (for adcVerb),
- atk-sc3 (for ambisonic spatialisation: currently only FOA is used, but HOA-ATK is planned to be integrated in this project in the future. This Quark will also automatically install other Quarks, e.g. wslib for the GUI, MathLib or XML.),
- Automation (for saving and recalling actions on the main GUIs),
- Bjorklund (for the Euclidean algorithm),
- Connection (for MVC and NumericControlValue),
- crucialviews (for the GUI BoxMatrix),
- Ctk (for Sam Potter's extensions and chaotic envelopes),
- FPLib (for functional programming to get back to previous presets. This Quark will also automatically install JITLibExtensions and the Modality toolkit for some MIDI controllers: if you have the MIDI TouchBar controller on previous MacBook Pros or the UC-33 MIDI controller, put the files available within the folder Modality_desc_to_add into the folder MKtlDescriptions within the Quark Modality.),
- (KMeans),
- PopUpTreeMenu (for the GUI),
- (redSampler: not necessary; I only use it to play specific sound files.),
- SC-Grids (to play with a topographic drum sequencer from the Eurorack module Grids, originally written in C++ by Mutable Instruments and ported to SuperCollider by Dennis Scheiba; if you cannot find it in the Quarks directory, you can install it by evaluating the following code: Quarks.install("https://github.com/capital-G/sc-grids");),
- SpeakersCorner (for the GUI),
- TabbedView (deprecated, but necessary for the GUI of wslib's MasterEQ),
- TabbedView2 (for the GUI),
- TabbedView2_QT (for the GUI),
- Twister (you have to install it even if you do not have a MIDI Fighter Twister controller; since it is currently not available in the Quarks directory, you can install it by evaluating the following code: Quarks.install("https://github.com/scztt/Twister.quark");),
- Unit-Lib (for the 2D trajectory editor),
- WFSCollider-Class-Library (for the 2D trajectory editor),
- WarpExt (for warp synths),
- WindowHandleView (for the GUI),
- ZArchive (for saving and recalling presets).
5. Put the folder L4L_ExtensionsLang, which is within this project folder, into your Extensions folder (if you do not already have these classes) and recompile, by going to the menu Language -> Recompile Class Library.
6. Install the ATK dependencies by evaluating the corresponding code in the SuperCollider editor. First, evaluate the following code to open the ATK folder: Atk.openUserSupportDir; and, if the ATK folder is not present, create it by evaluating: Atk.createUserSupportDir;. If the automatic method of evaluating the ATK dependencies code does not work, download the Kernels and Matrices and put them in the ATK folder.
7. (Optional, for synthesis with some Mutable Instruments modules, like Plaits or Braids) Put a compiled version of the Mi UGens plugins, available in their releases, into your SuperCollider Extensions folder (for Mac Intel architectures, you have to choose v0.0.3, and Mac users have to download with a browser other than Safari). To go quicker, you can download a compiled version for your OS from the folder L4L_ExtensionsUGens of this project via browsers like Chrome or Safari. ⚠️ For new Mac architectures with arm64, you will have to compile the Mi UGens yourself, with the help of this repository from Mads Kjeldgaard or by following the building commands of the Mi UGens repository (which requires installing git), in order to be allowed to load unsigned plugins on macOS.
8. For some spatial configurations (quad, 8 loudspeakers, and the UQAM 32-loudspeaker setup), if you want to skip the creation of synthDefs in one of the setup steps (2.i) below and go quicker, I have already prepared, in the folder L4L_SynthDefs, two versions of the 6 folders (one for each of the 3 spatial configurations and for each of the two servers), which you can copy into your user support directory, accessible via the menu File -> Open user support directory. Depending on whether you have installed the Mi UGens within your Extensions folder (previous step 7 of the installation) and succeeded in booting the audio server correctly without errors, choose the versions with or without these UGens. Recompile again.
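For those comfortable with evaluating code (see step 1), here is a hedged, code-based sketch of the slow clean install from step 4. It assumes git is installed and that the listed names resolve in the Quarks directory, which is not guaranteed for every entry, so adjust or remove any entry that fails. Place the cursor inside the outer parentheses and press Ctrl/Cmd + Enter to evaluate the whole block:
(
// Assumption: these Quark names exist in the Quarks directory and git is installed.
[
	"adclib", "atk-sc3", "Automation", "Bjorklund", "Connection", "crucialviews",
	"Ctk", "FPLib", "KMeans", "PopUpTreeMenu", "SpeakersCorner", "TabbedView",
	"TabbedView2", "TabbedView2_QT", "Unit-Lib", "WFSCollider-Class-Library",
	"WarpExt", "WindowHandleView", "ZArchive"
].do({ |name| Quarks.install(name) });
// These two are not in the Quarks directory and are installed from their repositories:
Quarks.install("https://github.com/capital-G/sc-grids");
Quarks.install("https://github.com/scztt/Twister.quark");
// Then recompile the class library via the menu Language -> Recompile Class Library.
)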
The code does not take the form of a SuperCollider Quark (i.e. an external library) or of classes, since I would have been unable to build this tool if I had had to recompile the programme each time I changed the code. Due to this experimental nature based on trial and error, it consists of environment variables collecting arrays, dictionaries and functions, mainly spread over three big files within the folder L4L_Project: _1_Init_BuffersSynths_132.scd, _2_Init_GUI_225.scd and _3_Init_Pattern_185.scd.
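For readers unfamiliar with this style, here is a minimal, purely illustrative sketch of code organised in environment variables rather than classes (the names below are hypothetical and are not the project's actual variables); its advantage is that everything can be re-evaluated on the fly without recompiling the class library:
(
// Hypothetical illustration only (requires a booted server, s.boot, to make sound).
~mySounds = Dictionary.new;             // arrays and dictionaries collected in environment variables
~mySounds[\kick] = [60, 0.2];
~myPlayFunc = { |freq = 60, amp = 0.2|  // a function stored in an environment variable
	{ SinOsc.ar(freq, 0, amp) * EnvGen.kr(Env.perc, doneAction: 2) }.play;
};
~myPlayFunc.value(120, 0.1);            // any of this can be changed and re-evaluated at any time
)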
In order to launch the tool, open the file _0A_Init_Live4Life.scd in the folder L4L_Project and follow the steps below:
1. Evaluate line 2, which defines the default configuration parameters included in the file _0B_Init_Config.scd:
- First, before evaluating this line, you mainly have to ⚠️ set the absolute path of your sound folder. To speed up the setup and creation process, I have already prepared a structured sound folder to download (to expand and improve), including drum machine sounds specifically sorted for this tool. If you want to test quickly, or for Windows and Linux users, you can download this lightweight prepared folder. To know exactly which path you have to change, you can drag the folder named SoundFolder onto a blank line in your SuperCollider IDE.
- Secondly, you have to choose your spatial configuration and distribution of loudspeakers (stereo, quad, circles of 5, 7 or 8 loudspeakers, or domes of 16, 24 or 32 loudspeakers) or, if it is not available, define it by code with ~numChannelsConfig in the files _1_Init_BuffersSynths_132.scd and _2_Init_GUI_225.scd. (A hedged sketch of these configuration values is given after this list.)
2. Evaluate line 9 (to load one server) and line 13 (to load a second server). ⚠️ Each server-loading step may take a few minutes depending on your computer's performance. I recommend not doing too many other things with your computer during this process, since loading a server can be heavy. Each server is attributed to one CPU core, so by having two servers you can have twice as much CPU power. These two lines:
- initialise a collection of thousands of synthDefs, with a few dozen synthesis types for each envelope type and each spatial algorithm, and a library of trajectories for some algorithms. (For each new specific spatial configuration, two folders of synthDefs are created the first time in the SuperCollider user support directory, one for each audio server; the next times, the scsyndef files will be loaded more quickly. ⚠️ If you get an error the first time the synthDefs are created and built, which may or may not happen depending on your computer and spatial setup, delete the synthDef folders created for each of the 2 servers, reboot the server and start again from step 1.)
- initialise a collection of thousands of mono and stereo buffers of max. 2 GB, hierarchically organised by category in dozens of folders. (To play easily with sound files, prepare one folder gathering a collection of subfolders labelled e.g. DL 1Kick, DM 2Snare, DH 3Hat, EL Earth, EM Water, EH Fire, IL Bass, IM Gong, IH Piano..., each containing dozens of sound files. As explained in the readme of the sound folder project, the first two letters group the folder categories: the first letter, e.g. D for drums, E for sounds of the elements, I for instruments; and the second letter, L, M or H, for e.g. a specific colour or register. This process may take a few minutes depending on the size of the sound library and your computer. ⚠️ If you get an error when loading sound files, set the variable ~buffersLoading to "slow" in the file _0B_Init_Config.scd, then reboot the server and start again from step 1.)
3. Evaluate line 28 (this step may take a dozen seconds), which:
- opens a GUI with different windows, including tabs in the main window such as the Sequence view for composition and the Global view for performance (see figures above), as well as tab views for global multichannel and ambisonic effects, and
- evaluates a pattern function, which triggers sound events with sequences of parameters for each track, and a routine updating the GUI (the core of the programme, which links the control of synths with the GUI).
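As a hedged illustration of the configuration values mentioned in these steps (only ~numChannelsConfig and ~buffersLoading are documented here; how their values are interpreted should be checked directly in the .scd files, and the path below is hypothetical):
// In _0B_Init_Config.scd (and, for ~numChannelsConfig, in _1_Init_BuffersSynths_132.scd and _2_Init_GUI_225.scd):
~numChannelsConfig = 8;   // assumption: a value matching an octophonic setup; check how it is used in the files
~buffersLoading = "slow"; // only if loading sound files produces errors (see step 2 above)
// Tip: drag your SoundFolder onto a blank line in the IDE to obtain its absolute path, e.g.
// "/Users/yourName/Music/SoundFolder/" (hypothetical).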
You can now play the first track by clicking on the green button in the control panel at the left of the window, and experiment by changing parameter modules (see the ICMC 2021 paper). To configure the keyboard or other MIDI or OSC controllers for full performance control, check the corresponding files in the folder L4L_Controllers.
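Independently of the files in L4L_Controllers, a quick general SuperCollider check (not project code) that your MIDI controller is visible looks like this:
(
MIDIClient.init;    // posts the available MIDI sources and destinations
MIDIIn.connectAll;  // connects all MIDI sources to SuperCollider
MIDIdef.noteOn(\midiTest, { |vel, num, chan, src| [num, vel].postln }); // prints incoming note-ons
)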
A wiki (under construction), including personal thoughts, tutorials and code examples about this project (e.g. to automate some utility functions and load presets for an improvisation session), will be completed and detailed over time.
Live 4 Life has been presented several times at conferences (JIM 2017, ICMC 2018, ICMC 2021, ISEA 2023, ICMC 2024), concerts (ICMC 2018, JIM 2019 - video, Gala Hexagram 2023, ICMC 2024) and festivals (Ultrasons from 2016 to 2021, Akousma 2021 - video, Cube Fest 2022), as well as in the journal of music and technology Organised Sound (April 2021).
Most of the papers and articles are referenced on ORCID or Google Scholar, and all are publicly available on ResearchGate, as well as my doctoral thesis (in French, sorry), funded by the FRQSC, about spatial improvisations based on polyrhythms and looped sequences of parameters via this tool.
Several performances are available either on YouTube or Vimeo.
Several training workshops in immersive sound and spatial improvisation with this open source project have been organised since 2023:
- at CIRMMT in Montréal in March, April and November 2023, in quadraphony,
- at Hexagram in Montréal on May 13 and December 3, 2023, and August 3, 2024, on a 32-loudspeaker dome, as well as the first performative participative installations on May 12 and 14, 2023,
- at ISEA (International Symposium on Electronic Arts) in Paris in May 2023, in quadraphony.
There are many features I would like to improve or develop, such as the collection of synthDefs (e.g. with plugins from Mads Kjeldgaard), as well as HOA support (currently only FOA is used).
If you would like to contribute, please get in touch with me in order to organise further development. The code management and installation process could be greatly improved, but for now I prefer to focus on rhythmic music features and performance, particularly on controlling lines of code with Tidal Cycles (code to be released soon, hopefully, both with and separately from this project).
Feel free to post an issue, but you may want to send me an email first, since it might be a functionality that is simply not explained yet (a more complete wiki will come). You can also try to send a pull request, since, depending on my priorities, I may not have the time or the ability to solve it.
I am still looking for audiovisual collaborations for the creation of spatial improvisations in a multi-sensory context, combining and alternating music, video / light and dance in space, mainly with:
- audiovisual developers, to create visuals from the generated sound data. Technically, the audiovisual object mapping could be developed with open source tools, such as Processing (see the code and how to use it in the wiki), openFrameworks or Hydra, or even with commercial tools like TouchDesigner (a basic patch is to be released soon; the timing is flexible!) and Resolume, as long as the creation process and the code are published on GitHub and open to everyone.
- dancers / movement artists (e.g. from circus), to improvise and ultimately blend the impact of the performer's gestures, the dancing bodies and the video / light environment. I am particularly interested in all forms of dance, from traditional dances to contemporary forms.
The themes of these spatial improvisations include, but are not restricted to: loneliness and love, Bernard Parmegiani and Francis Dhomont, free party, (hardcore) techno music and tradition. Feel free to contact me if you are interested.
You can also support this work through donations via Ko-fi, or get specific support and courses via Patreon. 😁
Live 4 Life grew little by little, by gluing together and restructuring a lot of code from others and by integrating several systems and Quarks. I would have been unable to build this tool without the help of the SuperCollider online community, who always answered my questions and even provided me with some code examples and classes.
So, Big Thanks to (including previous and current developers):
James Harkins, Daniel Mayer, Fredrik Olofsson, Julian Rohrhuber, Josh Parmenter, Wouter Snoei, Nick Collins, Jakob Leben, Chris Sattinger, Dan Stowell, Scott Wilson, Joseph Anderson, Miguel Negrão, Scott Carver, Alberto de Campo, Marije Baalman, Brian Heim, Nathan Ho, Patrick Dupuis, Marcin Pączkowski ...
The list could be long, and I cannot quote everyone; sorry to those I forgot to mention.
Giving this tool away is my way of contributing to SuperCollider for a free world. And I encourage anyone (DSP developers or any user) to support and use this beautiful environment.
© 2011-2077 ∞ Christophe Lengelé
Live 4 Life is open source software: you can redistribute it and/or modify it under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license (CC BY-NC-SA 4.0).
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY.
I wish for it to be used in the spirit of Free Party. Unfortunately, Free does not mean free in this commercial world, but it is an invitation to contribute to the costs and labour according to one's ability to give. I do not want this tool to be used, by any means, for personal profit.
See the License for more details.