why I am jonesing for M4L

I’m excited about the ambient electronic constructions that I’m currently working on, which combine projective geometry with beautiful field recordings that my friend and collaborator Perri Lynch captured in the Amazon rain forest six weeks ago.

[Figure: the finite projective plane PG(2,5), a mandala-like ring of 31 points and 31 lines]

A finite projective plane with 31 points and 31 lines provides the structure for the virtual space that I am creating for the piece. (A mandala-like visualization of this space that I drew using Inkscape should be visible on the left of this post.) Each “point” occupies the tip of one bump on the black ring, while each “line” is a different color. Each line passes through 6 points, and each point has 6 lines passing through it. (For the math geeks, this drawing is an expanded version of the difference set {1, 5, 11, 24, 25, 27}.) The mapping for the piece associates a distinctive sequence of recordings with every point in the space. The spectra and amplitudes of each of these sequences are diffused onto neighboring points using the incidence structure of the underlying geometry.
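
For the curious, here is a minimal Python sketch (illustrative only, not part of my actual toolchain) of how that difference set generates the whole plane: every line is just a cyclic shift of the base set mod 31, and the incidence properties described above fall out automatically.

```python
# Minimal sketch: generate the 31 lines of PG(2,5) as cyclic shifts of the
# difference set mod 31, and check the incidence properties quoted above.
from itertools import combinations

N = 31
BASE = [1, 5, 11, 24, 25, 27]  # the planar difference set

lines = [frozenset((p + shift) % N for p in BASE) for shift in range(N)]

assert len(set(lines)) == N                      # 31 distinct lines
assert all(len(line) == 6 for line in lines)     # 6 points per line
for point in range(N):                           # 6 lines through each point
    assert sum(point in line for line in lines) == 6
for p, q in combinations(range(N), 2):           # any 2 points share 1 line
    assert sum({p, q} <= line for line in lines) == 1

def lines_through(point):
    """The 6 lines incident with a point -- the paths along which that
    point's spectra and amplitudes get diffused to its neighbors."""
    return [line for line in lines if point in line]
```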

Projective spaces are famously dual: any theorem stated in terms of the incidence of lines and points can also be stated with the terms “point” and “line” swapped. To emphasize this dual nature as I map musical events onto the space, I am exploiting the two faces of the Fourier transform, a dual space that is very familiar to electronic musicians. The Fourier transform ties time, amplitude, phase, and frequency together into a single tidy bundle, using wonderful math. I use it to cross-synthesize and deconstruct the panoramic wide-spectrum jungle landscapes recorded by Perri; the amplitude envelopes from the same sounds are simultaneously used as control signals.
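
To give a flavor of the kind of transformation involved, here is a hedged NumPy sketch of one frame of magnitude/phase cross-synthesis, plus the crude block-RMS envelope follower that supplies control signals. (The real processing lives in Max/MSP patches and plugins, not this code.)

```python
# Hedged sketch of FFT cross-synthesis and envelope following (illustrative;
# the actual processing happens in Max/MSP patches and plugins).
import numpy as np

def cross_synthesize(frame_a, frame_b):
    """Impose the magnitude spectrum of frame_a onto the phases of frame_b."""
    window = np.hanning(len(frame_a))
    spec_a = np.fft.rfft(frame_a * window)
    spec_b = np.fft.rfft(frame_b * window)
    hybrid = np.abs(spec_a) * np.exp(1j * np.angle(spec_b))
    return np.fft.irfft(hybrid, n=len(frame_a))

def amplitude_envelope(signal, hop=512):
    """Block RMS of a signal: the crude control signal described above."""
    trimmed = signal[: len(signal) // hop * hop]
    return np.sqrt((trimmed.reshape(-1, hop) ** 2).mean(axis=1))
```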

I am currently rendering the piece using a somewhat laborious workflow that involves Ableton Live, Max/MSP, and Logic. I shuttle semi-processed sounds back and forth between the three programs using whatever method works: the filesystem, canned plugins, or even ReWire when it cooperates. Meanwhile, the matrix math used for spatialization and signal transformation is worked out by hand or with a computer algebra package, and then applied manually to the mix.
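
That matrix step is the most obvious candidate for scripting. As a sketch of what the automation might look like (an assumption about where this could go, not a description of the current workflow), the point-line incidence matrix of PG(2,5) can be built directly from the difference set and used to diffuse per-point amplitudes along the lines of the space:

```python
# Assumption/sketch: scripting the spatialization matrix instead of applying
# it by hand. M[l, p] = 1 when line l of PG(2,5) passes through point p;
# multiplying by M gathers per-point amplitudes onto lines, and by M.T
# spreads them back out across the space.
import numpy as np

N = 31
BASE = [1, 5, 11, 24, 25, 27]

M = np.zeros((N, N))
for shift in range(N):
    for p in BASE:
        M[shift, (p + shift) % N] = 1.0

amps = np.random.rand(N)              # stand-in for per-point amplitude envelopes
line_energy = M @ amps                # total amplitude along each line
diffused = (M.T @ line_energy) / 6.0  # redistribute to points, normalized
```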

OK, so what is this M4L thing, and why does it matter to all of this? A couple of weeks ago, Ableton and Cycling ’74 (makers of Max/MSP) announced the imminent arrival of a new product named “Max for Live”. If it functions as hyped, it will greatly improve quality of life for us often-neglected experimental electronic music folk. For example, I would theoretically be able to automate much of the workflow for the new piece by using Live to trigger and resample sound sequences, while delegating their transformation and spatialization to my own homemade combination of Live racks, Max/MSP patches, and random plugins. I’m sure that sample cutting, the final mix, and mastering would still be done in Logic, but the rest of the process, which is the bulk of the work, would become much easier. Cooler still, the “instruments” that I would build to perform this process could then be used over and over again, most notably in expanded live performances of this piece or others like it.

The march goes on. What took me months in 1982, laboriously creating soundfiles with a C compiler and hearing them rendered hours later in glorious 12-bit stereo by a dedicated PDP-11, has become a near-realtime programming activity using tools such as Max, SuperCollider, and ChucK. To see high-quality mainstream software such as Ableton Live come to this party is just fantastic. I can’t wait!