WNYC's Radiolab, with Jad Abumrad and Robert Krulwich, is one of my favorite radio programs and podcasts of all time. Their latest program explores the world of numbers. According to interviewees on the program, all babies think logarithmically until the abstract concept of integers has been systematically drilled into their heads.
It's a fascinating story and, as usual, it's permeated with beautifully done sound design. For more on Radiolab, check out "Sound is Kind of Touch at a Distance", or visit their site for more episodes of the program.
Recently I read an article in Future Music on the Snyderphonics Manta OSC controller. I'm getting more and more into OSC (Open Sound Control), so this is a really fascinating device that I can see replacing and expanding upon what I'm building for the iPod Touch. The Manta has forty-eight touch sensors on a six-by-eight pad. Each sensor can handle note on/off and velocity information, which you can't do on the iPod Touch. It also has two touch sliders and four touch buttons, all assignable via OSC, or via MIDI with a free application available at snyderphonics.com. The device also accepts input so it can provide feedback via the LEDs that backlight the controls. I have been researching and experimenting with multi-touch devices for music and sound design for a while now, and the Manta seems to solve a lot of the shortcomings of other devices. Congratulations to Jeff Snyder for designing a unique and intriguing instrument.
With help from Josh Clos, I have shot a short video documenting what my latest MaxMSP project does.
It's a sort of Swiss Army knife of wavetable glitch and sample-scrubbing tools. Hopefully the video will shed some light on what this project is about. I've been trying to describe it in a few other posts without much success, but seeing it in action seems to make a bit more sense.
Here's a screen grab of a patch I'm working on to loop five phrases of sound in succession. Once all five slots are full, looping another phrase replaces the first, and so on. The goal of this patch is to let me feed audio signals from my multi-touch glitch machine into the looper so I can build compositions for a five-speaker sound art installation I'm doing at the end of this semester at the University of Minnesota.
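Under the hood the slot logic is nothing fancy. Here's a rough JavaScript sketch of the idea, not the actual patch, and the names here are made up purely for illustration:

// Rough sketch of the five-slot looping logic (illustration only, not the Max patch).
var NUM_SLOTS = 5;                // one loop per speaker in the installation
var slots = new Array(NUM_SLOTS); // each slot holds one recorded phrase
var nextSlot = 0;                 // where the next phrase will be recorded

// Store a newly captured phrase. Once all five slots are full, the sixth
// phrase overwrites the first, the seventh overwrites the second, and so on.
function addPhrase(phrase) {
    slots[nextSlot] = phrase;
    nextSlot = (nextSlot + 1) % NUM_SLOTS;   // wrap around after the fifth slot
}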
For the example I routed outputs 1, 3 and 5 to the left channel and outputs 2 and 4 to the right channel. I also temporarily generated a randomly pitched sinusoid to run into the looper for testing. The large toggle in the upper left starts the looping; pressing it again stops it. Currently there's no mechanism to find zero crossings, so the output has lots of clicking in it. To make good use of the clicks (I'll be fixing this later), I routed the output into Ableton Live and loaded on heaping portions of distortion and delay. If life gives you clicks, make click-on-aid.
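The eventual click fix is conceptually simple: only start and stop each loop where the waveform crosses zero. Here's a rough JavaScript sketch of the idea, for illustration only; the real fix will be done in the Max patch itself:

// Find the nearest sample index at or after startIndex where the signal
// crosses zero, so a loop point can land there without producing a click.
function findZeroCrossing(buffer, startIndex) {
    for (var i = startIndex; i < buffer.length - 1; i++) {
        // a crossing happens where consecutive samples change sign (or hit zero)
        if (buffer[i] === 0 || (buffer[i] > 0) !== (buffer[i + 1] > 0)) {
            return i;
        }
    }
    return startIndex;   // no crossing found; fall back to the requested index
}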
For the last two weeks I have been working on a performance application in MaxMSP, controlled with TouchOSC on the iPhone or iPod Touch. The application is coming along quite well. I have the granular traversal piece working the way I want, as I described in Traversing Samples with Granular Synthesis.
Now I'm working on another feature of the application that lets the user play samples with a rotary dial, not unlike manually spinning a record on a turntable. The basics of getting this going were pretty simple, but I also wanted to be able to spin the dial and have it continue to rotate based on the acceleration applied. Second, I wanted a slider that adjusts the amount of friction, from frictionless to instant braking.
This essentially involved physically modeling the control to behave like a turntable or other spinning device. After trying four or five techniques using standard Max objects I managed to get it working, but it wasn't pretty. Instead I decided to try using a few lines of JavaScript to do the calculations and adjust the position of the dial. This worked much better and only required about 35 lines of code. The best way to illustrate this application will be with video; I'll shoot a few minutes to get the point across and share it here soon. For now, here's a recording made with the modeled controller I described, using just a small amount of friction.
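For the curious, the gist of the model is just velocity plus damping: each clock tick advances the dial by its current velocity, then friction bleeds a little of that velocity away. Something along these lines, though the names and numbers here are illustrative rather than the actual script:

// Illustrative sketch of the turntable-style dial model, meant to run
// inside a [js] object driven by a metro. Not the actual 35 lines.
var angle = 0;        // current dial position in degrees
var velocity = 0;     // degrees per tick, set by how hard the dial was flicked
var friction = 0.02;  // 0 = frictionless, 1 = instant braking (set by the slider)

function spin(initialVelocity) {      // called when the user releases the dial
    velocity = initialVelocity;
}

function setFriction(f) {             // called by the friction slider (0..1)
    friction = f;
}

function bang() {                     // called on every metro tick
    angle = ((angle + velocity) % 360 + 360) % 360;  // wrap around like a record
    velocity *= (1 - friction);                      // bleed off speed each tick
    outlet(0, angle);                                // send the new position to the dial
}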