Sound Sample from Voice Lessons

Voice Lessons is an interactive touch-screen installation that I recently presented during a graduate critique seminar at MCAD. The piece, developed in Max/MSP, granulates both sound and video in response to the viewer's touch while keeping the two in sync. I will be installing the piece again next month for an open studio night on December 9, 2011 at the MCAD Whittier Studios, 2835 Harriet Avenue South, Minneapolis, and I will be sharing more complete documentation about the piece soon. For now, please enjoy this short segment of audio recorded while the piece was in use.

Voice Lessons Excerpt
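
For readers curious how the touch-to-grain mapping described above might work in principle, here is a rough Python sketch of the idea: one normalized touch position drives both the audio grain start and the video frame index, which is what keeps the two streams locked together. The function name, sample rate, and grain length are my own placeholders, not the actual Max/MSP implementation.

    # Rough sketch: one touch position drives both audio and video granulation,
    # so the two stay synchronized. Placeholder values, not the Max/MSP patch.
    SAMPLE_RATE = 44100          # audio samples per second
    FRAME_RATE = 30              # video frames per second
    AUDIO_LEN_SEC = 120.0        # length of the source recording
    GRAIN_LEN_SEC = 0.08         # length of one audio grain

    def touch_to_grain(touch_x):
        """Map a normalized touch position (0.0-1.0) to synchronized
        audio and video read positions."""
        position_sec = touch_x * AUDIO_LEN_SEC
        audio_start = int(position_sec * SAMPLE_RATE)       # grain start sample
        audio_end = audio_start + int(GRAIN_LEN_SEC * SAMPLE_RATE)
        video_frame = int(position_sec * FRAME_RATE)        # matching video frame
        return (audio_start, audio_end), video_frame

    # Example: a touch one third of the way across the screen
    (a0, a1), frame = touch_to_grain(1.0 / 3.0)
    print(a0, a1, frame)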

Grain Machine Update and Layered Experiment

Here’s a new look at the Grain Machine M4L device. Since last time I have updated the device to accept drag-and-drop samples that are stored with the Live set, and I've added a visual display for the filter, which is controlled by the accelerometer on the controller (iPad, iPhone, or iPod Touch).
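
As a rough illustration of that accelerometer-to-filter mapping (outside of Max), here is a small Python sketch using the python-osc library. It assumes TouchOSC's accelerometer messages arrive on the /accxyz address with three floats; the port and cutoff range are placeholders, not the device's actual scaling.

    # Sketch: listen for TouchOSC accelerometer data and map one axis to a
    # filter cutoff value. Assumes the /accxyz address; ranges are placeholders.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    CUTOFF_MIN = 200.0    # Hz, placeholder
    CUTOFF_MAX = 8000.0   # Hz, placeholder

    def on_accel(address, x, y, z):
        # Tilt on the x axis (roughly -1.0 to 1.0) scaled to a cutoff frequency.
        tilt = max(-1.0, min(1.0, x))
        cutoff = CUTOFF_MIN + (tilt + 1.0) / 2.0 * (CUTOFF_MAX - CUTOFF_MIN)
        print(f"cutoff: {cutoff:.0f} Hz")   # in the device this would drive the filter

    dispatcher = Dispatcher()
    dispatcher.map("/accxyz", on_accel)
    server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
    server.serve_forever()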

The best thing about using this in Live is being able to live-loop and layer the output from Grain Machine into clips on different tracks, not to mention process it further. Another advantage is that the state of the device is saved with the Live set, so one set can hold sample set X while the next holds sample set Y. Here’s a piece I created with the Grain Machine in Ableton Live using some samples I selected at random from my sound library.

Grain Machine Layers

Monophonic Step Sequencer Max for Live Device Download

I have had a few requests to share this device, so here’s a download link. Please enjoy it, and if you make some interesting music with it, please post a link in the comments. Also, if you’re an M4L developer and make some improvements or enhancements to the device, we’d love to see what you do. Thanks!

http://code.google.com/p/m4l-monophonic-step-sequencer/downloads/list
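
For anyone who just wants the general idea before downloading, here is a minimal Python sketch of what a monophonic step sequencer does: advance one step per clock tick and output a single note (or a rest) at a time. The pattern, tempo, and note values below are placeholders and are not taken from the actual device.

    # Minimal monophonic step sequencer sketch: one note at a time,
    # advancing on a steady clock. Placeholder pattern and tempo.
    import time

    steps = [60, None, 63, 60, 67, None, 65, 63]   # MIDI notes, None = rest
    bpm = 120
    step_dur = 60.0 / bpm / 4                      # sixteenth notes

    def play_pattern(repeats=2):
        for _ in range(repeats):
            for note in steps:
                if note is None:
                    print("rest")
                else:
                    print(f"note on {note}")       # a real device would send MIDI here
                time.sleep(step_dur)

    play_pattern()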

Grain Machine Max for Live Instrument

The Grain Machine v0.1

Something I had been meaning to do for a while was to convert the Max/MSP instrument I titled the Wavetable Glitch Machine (WTGM) into a Max for Live patch. The WTGM uses a TouchOSC interface running on an iPhone, iPod Touch, or iPad to explore samples with granular techniques, along with a virtual scrub dial with friction modeling. Visit the WTGM tag to read more and view a video of it in operation. I have renamed the instrument Grain Machine for the M4L version.
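
For anyone wondering what a "virtual scrub dial with friction modeling" means in practice, here is a small Python sketch of the general idea: a flick of the dial sets a velocity, friction bleeds that velocity off a little each tick, and the playback position integrates the velocity, so the sample keeps scrubbing for a moment after you let go. The constants are placeholders, not the values used in the WTGM patch.

    # Sketch of a friction-modeled scrub dial: a flick sets the velocity,
    # friction slowly bleeds it off, and position integrates velocity.
    # Constants are placeholders, not the WTGM's actual settings.
    FRICTION = 0.97          # per-tick velocity retention (closer to 1 = slipperier)
    SAMPLE_LEN = 44100 * 4   # length of the loaded sample in samples

    class ScrubDial:
        def __init__(self):
            self.position = 0.0    # playback position in samples
            self.velocity = 0.0    # samples per tick

        def flick(self, velocity):
            """Called when the performer spins the dial."""
            self.velocity = velocity

        def tick(self):
            """Advance one control-rate tick."""
            self.velocity *= FRICTION                 # friction bleeds off energy
            self.position = (self.position + self.velocity) % SAMPLE_LEN
            return self.position

    dial = ScrubDial()
    dial.flick(800.0)                  # a quick spin of the dial
    for _ in range(5):
        print(int(dial.tick()))        # position keeps coasting, gradually slowing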

First I prepared the patch for the move to M4L. This involved making sure that all of the interface objects were in the main patching window, reorganizing the sub-patchers, and cleaning up a variety of other things that I imagined might interfere with the process. After that, all that remained was to copy and paste the patch into a Max Instrument, replace some of the standard Max objects with their M4L equivalents, and build a tidy little presentation mode.

Although I had to rework some of the logic and patch cords, the conversion went surprisingly quickly. I expected to be working on this for weeks, but it only took a matter of hours to get it into working order. There is still some fine-tuning to be done, but all the necessary functionality is in place. Here’s an audio example I made with a simple breakbeat loaded into the Grain Machine.

Grain Machine Experiment
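
To give a sense of the granular technique behind that excerpt, here is a rough numpy sketch of the basic idea: short windowed grains are read from positions scattered around a scrub point and overlap-added into the output. The grain length, density, and scatter amount are placeholders; the Grain Machine's actual parameters differ.

    # Rough overlap-add granulation sketch (numpy). Grains are windowed slices
    # taken near a scrub position and summed into the output buffer.
    # Parameter values are placeholders, not the Grain Machine's.
    import numpy as np

    def granulate(sample, scrub_pos, out_len, grain_len=2048, n_grains=200, scatter=4096):
        out = np.zeros(out_len)
        window = np.hanning(grain_len)                     # smooth grain envelope
        for _ in range(n_grains):
            src = int(scrub_pos + np.random.randint(-scatter, scatter))
            src = np.clip(src, 0, len(sample) - grain_len)
            dst = np.random.randint(0, out_len - grain_len)
            out[dst:dst + grain_len] += sample[src:src + grain_len] * window
        peak = np.max(np.abs(out))
        return out / peak if peak > 0 else out             # normalize to avoid clipping

    # Example with a stand-in "breakbeat": one second of noise at 44.1 kHz.
    sr = 44100
    fake_break = np.random.uniform(-1, 1, sr)
    cloud = granulate(fake_break, scrub_pos=sr // 2, out_len=sr * 2)
    print(cloud.shape)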

Ever Wonder What it Would be Like to Draw Sound?

I’m working on a Max/MSP performance patch that uses a Wacom tablet to draw light onto dancers holding light-sensitive instruments. Last night we decided to add sound to the strokes to give the illustrator another way to interact with the piece. Currently, the pen's pressure is translated into volume and its velocity into pitch. It will need some fine-tuning, but I think you can get the idea from the video.
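
As a rough illustration of that mapping (outside of Max/MSP), here is a short Python sketch: pen pressure scales an amplitude and pen velocity scales a pitch within a fixed range. The ranges and the normalization constant are placeholders; the actual patch reads the Wacom data directly in Max.

    # Sketch of the stroke-to-sound mapping: pressure -> volume, velocity -> pitch.
    # Input ranges and pitch bounds are placeholders, not the patch's actual values.
    PITCH_LOW = 110.0     # Hz
    PITCH_HIGH = 880.0    # Hz

    def stroke_to_sound(pressure, velocity, max_velocity=2000.0):
        """pressure: 0.0-1.0 from the pen tip; velocity: pen speed in tablet units/sec."""
        volume = max(0.0, min(1.0, pressure))                  # louder as you press harder
        v = max(0.0, min(1.0, velocity / max_velocity))        # normalize speed
        pitch = PITCH_LOW + v * (PITCH_HIGH - PITCH_LOW)       # faster strokes = higher pitch
        return volume, pitch

    # Example: a firm, fast stroke
    vol, hz = stroke_to_sound(pressure=0.8, velocity=1500.0)
    print(f"volume {vol:.2f}, pitch {hz:.0f} Hz")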