GMS Interface Design Using controlP5

I spent most of my week-long break from teaching continuing development of my Gestural Music Sequencer. I’m not sure if I should call it a sequencer or an arpeggiator. It’s really more like an instrument than either of those. The Gestural Musical Instrument, perhaps?

Anyway, it’s far from complete, but I added the ability to toggle sustain on the notes as well as a menu to choose from the available MIDI device drivers. I decided to use a Processing library called controlP5 to build the UI controls shown in the screen grab to the right. All of the controls also respond to keyboard input, so the application can keep functioning while the interface is hidden and only the video is displayed.
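For anyone who hasn’t tried controlP5, a stripped-down sketch along these lines shows the general idea: a toggle for sustain, a dropdown for MIDI devices, and a key that hides or shows the whole interface. This is not the GMS source; the controller names, positions, menu items, and the ‘h’ shortcut are just placeholders.

    import controlP5.*;

    ControlP5 cp5;
    boolean sustain = false;   // controlP5 plugs the "sustain" toggle to this field
    boolean guiVisible = true;

    void setup() {
      size(640, 480);
      cp5 = new ControlP5(this);

      // toggle for note sustain
      cp5.addToggle("sustain")
         .setPosition(20, 20)
         .setSize(40, 20);

      // menu of MIDI device drivers (items here are placeholders)
      DropdownList dl = cp5.addDropdownList("midiDevice");
      dl.setPosition(20, 60);
      dl.addItem("Java Sound Synthesizer", 0);
      dl.addItem("USB MIDI Interface", 1);
    }

    void draw() {
      background(0);
      // the video frame would be drawn here
    }

    // hide or show the whole interface so the sketch can run
    // with only the video displayed
    void keyPressed() {
      if (key == 'h') {
        guiVisible = !guiVisible;
        if (guiVisible) {
          cp5.show();
        } else {
          cp5.hide();
        }
      }
    }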

I’m also planning to add a way to drop video files into the application so that musical phrases can be created from pre-recorded video. Here’s a section of audio captured from the GMS while it was attached to Sun Microsystems’ Java Sound Synthesizer driver. The default sound for this device is an acoustic piano. You can hear the sustain stop around fifteen seconds in, then come back on at the end.

GMS Piano Arpeggio
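In case it’s useful, here’s a rough illustration of how a device menu like the one above can be populated and how a note reaches the Java Sound Synthesizer using the standard javax.sound.midi classes. Again, this is not the GMS code; the device name string and note values are only examples, and the name can vary by platform and JVM.

    import javax.sound.midi.*;

    void setup() {
      try {
        // enumerate the installed MIDI devices; these names could
        // populate a dropdown like the one in the GMS interface
        MidiDevice.Info[] infos = MidiSystem.getMidiDeviceInfo();
        MidiDevice device = null;
        for (MidiDevice.Info info : infos) {
          println(info.getName());
          if (info.getName().equals("Java Sound Synthesizer")) {
            device = MidiSystem.getMidiDevice(info);
          }
        }
        if (device != null) {
          device.open();
          Receiver out = device.getReceiver();
          // program 0 on the default soundbank is an acoustic grand piano,
          // which is the sound heard in the clip above
          ShortMessage noteOn = new ShortMessage();
          noteOn.setMessage(ShortMessage.NOTE_ON, 0, 60, 100); // middle C
          out.send(noteOn, -1);
        }
      } catch (MidiUnavailableException e) {
        println("MIDI device unavailable: " + e.getMessage());
      } catch (InvalidMidiDataException e) {
        println("Bad MIDI message: " + e.getMessage());
      }
    }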

 


