the RADIUS of fourstones

In RADIUS, Perri Lynch and I massage field recordings into electronic collages, live. To accomplish the live resampling part of this process, I use Ableton Live. It has a wonderfully simple user interface that is geared towards performing musicians who use samples, rather than towards engineers who think that forcing musicians to think like engineers is the way to make music. (Not that engineering isn’t important! It just isn’t the correct way to approach the right-brain activity of live performance and improvisation.) Ableton has been my favorite piece of software for the last 3 years running, despite the appearance of many other interesting pieces of software such as the massively refurbished Max/MSP, Processing, the horribly flawed but nonetheless interesting ChucK, and other worthy digital media thingies that I will someday post about.

Victor Stone sits in with RADIUS

This week, my good friend Victor Stone, aka fourstones, was visiting Seattle. Back in the day, Victor taught me why samples are musically interesting and how to work with them, and he continues to preach this message as the man behind the excellent Creative Commons remix site ccMixter. He is also musically fearless and a big Ableton fan, and so we decided to plug him into the board at RADIUS world headquarters. It was a lot of fun, and Victor contributed a whole different layer – the photodoc is attached, and I grabbed a recording off of the board for later.

The trouble started when we tried to go to a less ambient and more beat-driven jam. I had neglected to bring an extra MIDI interface so that Victor’s laptop could slave to mine, or vice versa. What we were proposing to do was not rocket science: “let’s all play at 82 bpm,” says Victor. A musically simple idea, and one that three acoustic players would do intuitively, without having to mention tempo or speak at all. But we knew right away that we were screwed – the current state of the art in laptop synchronization is to use one of several similar clock protocols, MIDI clock being the usual one, and if you don’t have the connection to carry it, it is impossible to get your downbeats together, much less to change tempo together or to follow an acoustic musician’s lead. Pretty lame, eh?
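
And the protocol itself is dead simple, which is what makes the situation so frustrating. A MIDI clock master just sends 24 “clock” pulses per quarter note down the wire; at 82 bpm that’s one pulse roughly every 30.5 ms. Here’s a minimal sketch of a clock master in Python, assuming the mido library (an assumption on my part, not what either of our rigs actually ran); real setups lean on hardware MIDI interfaces because software sleep timing alone makes the slaves wobble:

	# Minimal MIDI clock master sketch (assumes the mido library is installed).
	# MIDI clock is 24 pulses per quarter note, so at 82 bpm one 'clock'
	# message goes out every 60 / (82 * 24) ~= 30.5 ms.
	import time
	import mido

	BPM = 82
	PULSES_PER_QUARTER = 24            # fixed by the MIDI clock spec
	interval = 60.0 / (BPM * PULSES_PER_QUARTER)

	port = mido.open_output()          # default MIDI output port

	port.send(mido.Message('start'))   # slaves start on the next clock pulse
	next_tick = time.monotonic()
	try:
	    while True:
	        port.send(mido.Message('clock'))
	        next_tick += interval
	        # schedule against an absolute deadline to limit drift;
	        # time.sleep() jitter is still why hardware clocks win
	        time.sleep(max(0.0, next_tick - time.monotonic()))
	except KeyboardInterrupt:
	    port.send(mido.Message('stop'))

Note that nothing in there knows anything about music – it’s just a metronome on a wire, which is exactly why two laptops without that wire are helpless.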

I’ve seen a number of experimental DIY ways to synchronize laptop music with the real world, and I’ve used both transient detection and video tracking for this purpose myself; each sort of worked. The current state of the art in the commercial packages such as Ableton is “tap time, nudge the tempo, and pray,” which doesn’t work at all. But given that Ableton already has fantastic support for beat morphing, resampling, and recording external input, why not sniff that external input for a beat? Ableton designers, take note: it would be a very cool feature indeed to allow synchronization with human musicians!
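
To give a flavor of what “sniffing the input for a beat” could mean: compute an onset-strength envelope from the incoming audio (that’s the transient detection part), then autocorrelate it and read the dominant tempo off the strongest lag. A rough sketch in plain numpy – all the names and parameters here are mine for illustration, and a real implementation would also need smoothing, octave-error handling, and downbeat phase tracking:

	# Rough tempo-sniffing sketch: transient detection + autocorrelation.
	# Illustrative only; real beat tracking needs smoothing, octave-error
	# correction, and phase tracking to actually place the downbeat.
	import numpy as np

	def estimate_bpm(audio, sr, hop=512, bpm_lo=60.0, bpm_hi=180.0):
	    # frame-wise energy of the signal
	    n = (len(audio) // hop) * hop
	    frames = audio[:n].reshape(-1, hop)
	    energy = (frames ** 2).sum(axis=1)
	    # onset envelope: keep only positive energy flux (transients)
	    flux = np.maximum(np.diff(energy), 0.0)
	    flux -= flux.mean()
	    # autocorrelation of the onset envelope
	    ac = np.correlate(flux, flux, mode='full')[len(flux) - 1:]
	    # only search lags that correspond to plausible tempi
	    frame_rate = sr / hop
	    lag_lo = int(frame_rate * 60.0 / bpm_hi)
	    lag_hi = int(frame_rate * 60.0 / bpm_lo)
	    lag = lag_lo + int(np.argmax(ac[lag_lo:lag_hi]))
	    return 60.0 * frame_rate / lag

Feed that a few seconds of a steady drum loop and it should land on the right tempo, or an octave off at half or double time – the classic beat-tracker failure mode, and still a far better starting point for a “follow the drummer” feature than tap-and-pray.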