GMS Video Experiment with Reason

GMS Video Experiment w/ Reason from Dane Messall on Vimeo.

My student, Dane Messall, has been experimenting with the GMS over the break and just posted this video experiment. He imported the video into the GMS and then interfaced it with Reason’s Thor synthesizer to generate the sound. Nice one, Dane!

GMS Beta Release, September 28, 2009

I have decided on September 28, 2009 as the release date of the GMS, coinciding with my performance with Graham O’Brien, and my birthday. We can call it a combination Ostraka with Dial System and DJ Zenrock show, GMS beta software release, and birthday party. Currently I am releasing a binary version of the software for Mac OS X only. A Windows version is in the works, with no estimate of when it will arrive.

The GMS is a Gestural Music Sequencer that I developed in Processing (processing.org). The application samples video and displays it either normally or mirrored, so it looks as though you’re looking into a mirror. Each frame is analyzed for brightness, then the X and Y coordinates of the brightest pixel are converted into a MIDI note. These notes produce a non-repeating sequence based on movement within the range of the capture device.
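The GMS source isn’t posted here, but the core idea described above can be sketched in a few lines of plain Java. This is my own minimal illustration, not the actual GMS code: it finds the brightest pixel in a grayscale frame, then maps its X position to a pitch and its Y position to a velocity. The specific ranges (two octaves from middle C, louder toward the top of the frame) are assumptions for the example.

```java
// Minimal sketch (not the actual GMS source) of the brightest-pixel
// to MIDI mapping: scan a frame for the brightest pixel, then convert
// its X coordinate to a pitch and its Y coordinate to a velocity.
public class BrightestPixelToMidi {

    // Frame is a row-major grayscale brightness array, values 0-255.
    // Returns {x, y} of the brightest pixel.
    static int[] brightestPixel(int[] frame, int width) {
        int best = 0;
        for (int i = 1; i < frame.length; i++) {
            if (frame[i] > frame[best]) best = i;
        }
        return new int[] { best % width, best / width };
    }

    // Spread X (0..width-1) across two octaves starting at middle C (60).
    static int xToPitch(int x, int width) {
        return 60 + (x * 24) / width;
    }

    // Map Y (0..height-1) to velocity; top of the frame is loudest.
    static int yToVelocity(int y, int height) {
        return 127 - (y * 127) / height;
    }

    public static void main(String[] args) {
        int width = 4, height = 3;
        int[] frame = {
            10, 20, 30,  40,
            50, 255, 60, 70,
            80, 90, 100, 110
        };
        int[] xy = brightestPixel(frame, width);
        System.out.println(xy[0] + "," + xy[1]);        // 1,1
        System.out.println(xToPitch(xy[0], width));     // 66
        System.out.println(yToVelocity(xy[1], height)); // 85
    }
}
```

In the real application this analysis runs once per captured video frame, so moving a bright object through the camera’s view traces out a melody.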

For more details, including audio examples and video produced using the GMS, check out the GMS category. There’s also a Gestural Music Sequencer Documentary Short produced by Josh Clos that does a good job of illustrating what the software does.

Getting Reacquainted with the GMS

I have had a number of problems migrating my Processing.org application known as the GMS from my old MacBook Pro running OS X 10.4.11 to my new one running 10.5.7. The first issue was a lack of support for the IAC drivers in Java; next came some funky conflicts with the Java class I was using to load and save presets. With the help of Grant Muller I have solved these problems.

While I was at it I took on a couple of other issues as well in preparation for my performance on Sunday. All that’s left is for me to become reacquainted with the application in tandem with Ableton Suite 8.

This is actually the hard part because I have to use the application effectively as both an instrument and a real-time visualizer. One problem is running out of hands and fingers while producing the visuals and capturing the MIDI clips. Another danger is the music becoming repetitive while I make adjustments. Here’s a quick recording I made while practicing.

Reacquainting Myself with the GMS

External MIDI Control via XML

After a few performances live looping with Ableton and the GMS, I have found it cumbersome and frustrating to have to repeatedly swap between the two applications. To solve this, I have added the ability to control the GMS with an external MIDI device. I achieved this by creating an XML document with the parameters included as tags, each with a CC attribute to designate which control change number to use for that setting. Here are a few lines from the XML document.

<GMSMidiController>
    <MIDIControllerName>Korg MS2000</MIDIControllerName>
    <MidiChannel CC="31" type="knob" />
    <TopOctaveIncrement CC="82" type="button" />
    <BottomOctaveIncrement CC="78" type="button" />
    <Preset useProgramChange="true" />
    <ToggleFreeMode CC="77" type="button" />
    <StartStop CC="89" type="button" />
    <Sustain CC="64" invert="false" type="pedal" />
    <SetDuration CC="30" type="knob" />
    <ToggleDotted CC="92" type="button" />
    <SetScale CC="29" type="knob" />
    <ToggleMirroring CC="86" type="button" />
</GMSMidiController>

As you can see I’m using knobs to adjust some settings and buttons to adjust others. It’s really fun to turn a knob on my Korg MS2000 and see the sliders in my software start to move in response. Program change for presets and note on for transposition will work from any old controller, but the rest of the parameters need to be mapped to knobs, sliders or buttons. In total I have around thirty-six specific parameters that are now adjustable with a controller.
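Loading a mapping like the one above is straightforward with Java’s built-in DOM parser. The sketch below is hypothetical (the class name and the CC-to-parameter lookup are my own, not the GMS internals), but it uses the tag and attribute names from the XML shown above: each element with a CC attribute becomes an entry mapping an incoming control change number back to the parameter it adjusts.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of loading the CC mapping: each child element's
// CC attribute becomes a key so an incoming control change number can
// be dispatched to the right parameter by tag name.
public class GmsMidiMap {

    static Map<Integer, String> load(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
            Map<Integer, String> ccToParam = new HashMap<>();
            NodeList nodes = doc.getDocumentElement().getChildNodes();
            for (int i = 0; i < nodes.getLength(); i++) {
                Node n = nodes.item(i);
                if (n.getNodeType() != Node.ELEMENT_NODE) continue;
                Element e = (Element) n;
                // Skip entries with no CC, e.g. <Preset> or the name tag.
                if (!e.hasAttribute("CC")) continue;
                ccToParam.put(Integer.parseInt(e.getAttribute("CC")),
                              e.getTagName());
            }
            return ccToParam;
        } catch (Exception ex) {
            throw new RuntimeException(ex);
        }
    }

    public static void main(String[] args) {
        String xml = "<GMSMidiController>"
                + "<MidiChannel CC=\"31\" type=\"knob\" />"
                + "<Sustain CC=\"64\" invert=\"false\" type=\"pedal\" />"
                + "</GMSMidiController>";
        Map<Integer, String> map = load(xml);
        System.out.println(map.get(64)); // Sustain
    }
}
```

Keeping the mapping in XML rather than hard-coding it means the same build works with any controller: remapping the Korg MS2000 to different hardware is just an edit to the document.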

GMS Performance Excerpt #4

GMS Performance in Downtown Minneapolis

I’ve been scheduled to perform live using my GMS this Wednesday night, May 13, 2009. I’ll be projecting against the western wall of Art Institutes International. The reactive music will be amplified along with the projection as it is produced in real time. Here’s the publicity statement that went out about the event.

John Keston will be performing using his gestural music sequencer or GMS on Wednesday, May 13, 2009 in the parking lot next to Art Institutes International Minnesota, 15 South 9th Street, Minneapolis, Minnesota. The GMS was written in Processing (processing.org) by Ai instructor, Unearthed Music recording artist, and AudioCookbook.org founder John Keston. His tool analyzes video input and converts it into a sequence of musical information in real time. The live video image will be projected on the building while the musical response to the images is amplified through a sound system. For more information about the GMS visit audiocookbook.org/tag/gms/. All Ai students, staff, alumni, and the public are welcome to attend this free performance. A drawing will be held (for WDIM students only) giving away two passes to this year’s Flashbelt conference.

Here’s a segment from a practice session today to give you an idea about what sort of output the GMS can produce. All of the percussion, melodic lines, and bass were generated by the sequencer, then live looped to produce the results.

GMS Practice Piece in C Sharp