I produced in real time with Ableton Live, and most of the noises came from my Wavetable Glitch Machine Max patch, which I routed into Live using Soundflower.
This five-year-old set is one of the very first things I ever posted on SoundCloud: 86 minutes from a live solo performance with Minneapolis Art on Wheels. Check out the original posts here:
As I’ve mentioned in previous articles, I have been working since last December on a multimedia dance collaboration, Words to Dead Lips, with Annichia Arts, and it has finally come to a close. We staged three performances at Intermedia Arts in Minneapolis this weekend, each to a mostly full house. My part in the collaborative effort was to produce the music, and I was given an open mandate to do so. As is my preference, I opted to perform the work live to the dance and projected imagery rather than submit pre-recorded material. Although I adhered to an agreed framework for the soundscape, the improvisational nature of this approach made every performance unique.
Another component of the sonic environment was the noise shield. This device, which I built into saucer sleds, was used by the dancers to synthesize sounds via body contacts and a light-dependent resistor. Here’s a five-minute excerpt of audio from the closing night’s performance.
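The post doesn’t go into the noise shield’s circuitry, but the general idea of turning a light-dependent resistor into a playable sound control can be sketched in a few lines. This is only an illustration under assumed conventions (a 10-bit sensor reading mapped to pitch on an exponential scale); the function name and ranges are hypothetical, not the actual device’s logic.

```python
def ldr_to_freq(reading, lo_hz=80.0, hi_hz=2000.0, max_reading=1023):
    """Map a raw light-dependent-resistor reading (assumed 10-bit ADC,
    0..1023) to an oscillator frequency.

    The mapping is exponential rather than linear so that equal changes
    in light produce roughly equal perceived changes in pitch.
    """
    # Normalize and clamp the sensor reading to 0..1.
    norm = max(0.0, min(1.0, reading / max_reading))
    # Interpolate exponentially between the low and high frequencies.
    return lo_hz * (hi_hz / lo_hz) ** norm
```

In a real patch this value would drive an oscillator’s frequency inlet; darkness (low readings) gives low drones, and shining stage light on the sensor sweeps the pitch upward.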
Here’s a new look at the Grain Machine M4L device. Since last time I have updated the device to allow drag-and-drop samples that are stored with the Live set, and added a visual for the filter that’s controlled by the accelerometer on the controller (iPad, iPhone, or iPod Touch).
The best thing about using this in Live is being able to live-loop and layer the output from Grain Machine into clips on different tracks, not to mention processing it further. Another advantage is that the device’s state is saved with the Live set, so one document can hold sample set X while the next holds sample set Y. Here’s a piece I created with the Grain Machine in Ableton Live using some samples I randomly selected from my sound library.
Something I had been meaning to do for a while was to convert the MaxMSP instrument that I titled the Wavetable Glitch Machine (WTGM) into a Max for Live patch. The WTGM uses a TouchOSC interface running on an iPhone, iPod Touch, or iPad to explore samples with granular techniques, as well as a virtual scrub dial with friction modeling. Visit the WTGM tag to read more and view a video of it in operation. I have renamed the instrument Grain Machine for the M4L version.
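The friction modeling behind a scrub dial like this is a standard technique: while the finger drags, the playhead velocity follows the drag speed; on release, the velocity decays toward zero each tick, like a turntable platter spun by hand. The sketch below is a minimal illustration of that idea, not the actual patch logic; the class name and friction constant are hypothetical.

```python
class ScrubDial:
    """Virtual scrub dial with simple friction modeling: the playhead
    keeps coasting after release, its velocity decaying each tick."""

    def __init__(self, friction=0.95):
        self.friction = friction  # fraction of velocity kept per tick (0..1)
        self.velocity = 0.0       # playhead speed, samples per tick
        self.position = 0.0       # playhead position, in samples

    def drag(self, delta):
        # While the finger is down, velocity tracks the drag speed directly.
        self.velocity = delta

    def tick(self):
        # Each control-rate tick: advance the playhead, then apply friction
        # so a released dial coasts smoothly to a stop.
        self.position += self.velocity
        self.velocity *= self.friction
        return self.position
```

With a heavy friction setting the dial stops almost immediately after release; with a light one it spins on, which is what gives the scrub gesture its physical, vinyl-like feel.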
First, I prepared the patch for transfer to M4L. This involved making sure that all of the interface objects were in the main patching window, reorganizing the sub-patchers, and cleaning up a variety of other things that I imagined might interfere with the process. After that, all that remained was copying and pasting the patch into a Max Instrument, replacing some of the standard Max objects with M4L objects, and building a tidy little presentation mode.
Although I had to rework some of the logic and patch cords, the conversion went surprisingly quickly. I expected to be working on this for weeks, but it took only a matter of hours to get it into working order. There is still some fine-tuning to be done, but all the necessary functionality is in place. Here’s an audio example I made with a simple breakbeat loaded into the Grain Machine.
This Friday, April 23, 2010, at around midnight, I am very excited to be performing a rare solo set at McNally Smith under my Ostraka moniker.
I’ll be using a number of custom developed tools, including the GMS and my tentatively titled WTGM (Wave-Table Glitch Machine).
The event is called Sound Crawl and is being billed as “the official sound track for Art Crawl”.
Other artists include James Patrick and Timefog, Oliver Grudem, and Minneapolis Art on Wheels. More information and a complete schedule are available at: