Interactivity Sonified Workshop at INST-INT

The INST-INT 2015 conference, exploring the “…art of Interactivity for objects, environment, and experiences,” just happened, and I had the honor and privilege of giving a workshop at the event titled Interactivity Sonified. The intent of the workshop was to teach attendees to sonify their work by triggering, generating, and processing sonic textures or musical forms through interactivity. I covered several basic programming techniques for adding sound to projects using input devices and numerical values. Touchscreens, microphones, cameras, gyros, MIDI controllers, or any other stream or set of incoming data can be used to drive sound. Sonifying this information adds a whole new sensory dimension to interactive installations and performances.
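To give a flavor of that idea in text form (the workshop itself used Max, which is visual), here is a minimal Python sketch of mapping a numerical input to sound. The function name and frequency range are my own illustrative choices, not material from the workshop:

```python
def sensor_to_freq(value, low_hz=110.0, high_hz=880.0):
    """Map a normalized sensor reading (0.0-1.0) to a frequency in Hz.

    Uses an exponential mapping so equal sensor steps sound like
    equal musical intervals, since pitch perception is logarithmic.
    """
    value = max(0.0, min(1.0, value))  # clamp out-of-range input
    return low_hz * (high_hz / low_hz) ** value

# A sensor at rest maps to the low bound, fully engaged to the high bound:
print(sensor_to_freq(0.0))  # 110.0
print(sensor_to_freq(1.0))  # 880.0
```

Any incoming stream — a gyro axis, a MIDI fader, brightness from a camera — can feed a mapping like this to control an oscillator or effect parameter.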

During the workshop I covered sonification examples in Max while looking at input signals from the Leap Motion, MIDI controllers, a video camera, a microphone, the keyboard, and the trackpad. We experimented with recording, looping, reversing, pitch shifting, and granulating sampled audio. We also looked at modeling waveforms and processing them through lowpass, highpass, and bandpass filters, delays, and reverbs. Finally, we explored the convolution reverb in Max for Live, trying out several of its impulse responses (IRs) and discussing the technique of sampling your own impulse responses.
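As a rough text sketch of the waveform-plus-filter idea (again in Python rather than Max; the naive square wave and one-pole filter here are my simplified stand-ins, not the patches from the workshop):

```python
import math

def square_wave(freq, sr, n):
    """Generate n samples of a naive square wave at freq Hz."""
    return [1.0 if math.sin(2 * math.pi * freq * i / sr) >= 0 else -1.0
            for i in range(n)]

def one_pole_lowpass(samples, cutoff, sr):
    """Smooth a signal with a one-pole lowpass filter.

    y[n] = y[n-1] + a * (x[n] - y[n-1]), where the coefficient a is
    derived from the cutoff; higher cutoffs let more signal through.
    """
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff / sr)
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

sr = 44100
raw = square_wave(220.0, sr, sr // 10)        # 100 ms of square wave
smoothed = one_pole_lowpass(raw, 500.0, sr)   # dull the harsh edges
```

The lowpass pass rounds off the square wave's sharp transitions, which is the audible effect of closing down a filter cutoff; the bandpass and highpass variants we tried in Max follow the same principle with different frequency responses.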

In this video I asked the attendees to pull out their headphone cords after completing the task of triggering sounds with a moving object. The resulting cacophony in the room was quite beautiful! I thoroughly enjoyed giving this workshop and would love to do it again. Please be in touch if you’re part of an organization interested in a workshop like this. For more information you can view the slideshow for the workshop. Keep in mind that the slideshow is just a fraction of the activities; most of the time was spent applying code and examples to trigger, generate, or process sound.