I wanted to be able to create fast successions of very minimalist but precise shapes with the lasers, and I also wanted to create shapes that make use of random functions, aleatoric elements and complex movements. Precise synchronisation of musical events and visual elements was desired, as well as the option to treat sound and image independently. Since the framework was meant to provide a useful interface for concerts, the ability to interact with the structure, the sounds and the shapes in real time was also an important development goal.
The lasers are controlled with analog signals created in MaxMSP. These signals are technically the same type of signals used for audio, but they serve a different purpose and are usually not intended to be made audible. Instead, they control the movement of extremely fast and precise mirrors and the intensity of the laser beams, which allows me to draw lines, circles, complex morphing shapes, or even text and numbers.
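As a rough illustration of what such control signals look like, here is a minimal Python sketch that computes one revolution of a circle as per-sample X/Y mirror positions plus a beam intensity value. This is only a conceptual model: the actual signals are generated at audio rate inside MaxMSP, and the function name and normalization ranges are assumptions made for this sketch.

```python
import math

def circle_frame(n_samples, radius=0.8, intensity=1.0):
    """Compute one revolution of a circle as (x, y, i) control samples.

    x and y stand in for the analog signals driving the galvo mirrors
    (normalized to -1..1), i for the beam intensity (0..1).
    """
    frame = []
    for k in range(n_samples):
        phase = 2 * math.pi * k / n_samples
        frame.append((radius * math.cos(phase),
                      radius * math.sin(phase),
                      intensity))
    return frame

frame = circle_frame(4)  # four sample points of a medium-sized circle
```

Played back fast enough through the mirrors, such a stream of X/Y samples fuses into a continuous visible circle; modulating the intensity value blanks or fades parts of the path.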
The heart of the project is the "LumiereLaserPatternGenerator" (LPG) software, which consists of a series of basic shape generators. They can be seen as shape synthesizers, with parameters that can be set to define their exact behavior. Additionally, there are shape modifiers: objects that, on a global scale, allow me to apply geometric transformations or manipulate the intensity of the laser beam for additional complexity. A simple example is the 'zoom' feature, which allows me to change the size of a shape. Various aspects of the shape synthesis can be controlled in real time, for instance the speed of movements. This makes it possible to articulate the shapes in a musical fashion.
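Conceptually, a modifier such as 'zoom' can be pictured as a global transform applied to every point of a shape before it reaches the mirrors. The following Python fragment is only an illustrative sketch of that idea, not the LPG's actual implementation:

```python
def zoom(points, factor):
    """Scale every (x, y) point of a shape around the image center (0, 0)."""
    return [(x * factor, y * factor) for (x, y) in points]

# A unit square, shrunk to half its size by the 'zoom' modifier.
square = [(-0.5, -0.5), (0.5, -0.5), (0.5, 0.5), (-0.5, 0.5)]
half_size = zoom(square, 0.5)
```

Because the transform is applied globally, the same modifier works for any shape the generators produce, which is what makes such modifiers cheap sources of additional complexity.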
The LPG also contains all sorts of setup and alignment tools which are essential to combine the three lasers I am using into one coherent image, and adapt to different venues and projection scenarios.
The LPG acts like a synthesizer, where each incoming 'note' contains the complete set of data needed to draw a specific shape. To make the programming of shapes easier and faster, it is only necessary to tell the LPG which parameters are not at their default values. Parameters are sent to the LPG via OSC commands and can be very basic for simple shapes. Sending 'laser 1 circle' is all it needs to draw a medium-sized circle, whilst 'laser 2 spark .3 .0 2000 text count 7 clock 4 scan 1 0 3 flip 0 0 2 ratio 1 1 .4 .1 20000 .9' does something much more dynamic and complex.
Global modifier commands (such as 'zoom 1000') are captured safely as well.
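The defaults-plus-overrides idea behind these 'notes' can be sketched as follows. The parameter names and default values here are hypothetical, and the real OSC commands shown above also use positional arguments, so this is only a simplified model of the principle:

```python
# Hypothetical defaults -- the real LPG parameter set is not documented here.
DEFAULTS = {"zoom": 500, "clock": 0, "count": 1}

def build_note(shape, **overrides):
    """Assemble the full 'note' for one shape: start from the defaults
    and apply only the parameters that deviate from them."""
    params = dict(DEFAULTS, shape=shape)
    params.update(overrides)
    return params

simple_note = build_note("circle")                # medium-sized circle
complex_note = build_note("spark", count=7, clock=4)
```

The benefit of this scheme is brevity: a simple shape needs almost no data on the wire, while complex shapes only spell out what actually differs from the defaults.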
The LPG only acts as the visual synthesis engine; it does not store any 'presets', apart from the adjustment and setup information for each laser.
During the performance, the Live set is controlled mainly via three hardware controllers: a Launchpad with a modified 'Session View' script, which lets me jump in blocks between the different parts and uses different, more dimmed LED colors to work best in the darkness; a combination of Livid Elements modules for control of volume, sends and synthesis parameters; and a Doepfer fader box for EQ and effect control. This setup makes it possible to play a highly improvised show without much need for touching the mouse or touchpad of the computer.

I wrote several special Max4Live devices for the Lumière project that expand the capabilities of Live to match my desires: MIDI routing devices that send MIDI from several tracks to other tracks, devices that allow me to switch between several synthesizers in a Live set, a convolution reverb with switchable impulse responses, another reverb which is a mix of convolution and algorithmic reverb, and the 'laser sound engine', a dedicated synthesizer inspired by the control signals for the lasers.
Sound

The sonic side went through several conceptual iterations. At the beginning I planned to completely decouple the sound and the visual parts and write quite conventional music for the performance. Then I started incorporating the laser control signals as part of the sonic palette, but experienced several conceptual and technical difficulties: the sound aesthetics of the laser control signals did not match the other musical elements, and the timing was unacceptable because the data had to be sent via Ethernet to a remote computer. For Lumière Version 1.5, created in January 2014, I built a dedicated Max4Live synthesizer that acts very similarly to the laser pattern generators but resides inside the audio computer, without the need for an external Ethernet connection and with the freedom to tweak the sound synthesis independently from improvements of the visual engine, whilst still having a strong sense of coherence. As a direct result, I reduced the more common musical elements to a bare minimum, mostly just some sort of musical backbone consisting of a bassdrum and some higher, noisy percussion, and mainly use the laser sound generator. Textural elements are created by granulation and 'freezing' of the laser sound generator output.
The whole setup now works as one consistent audiovisual engine, built for real-time interaction based on audiovisual patterns, with the capability of subtle to strong transformations of those patterns.