Physically Modeling Multitouch Controls

For the last two weeks I have been working on a performance application that I’m developing in MaxMSP, controlled with TouchOSC on the iPhone or iPod Touch. The application is coming along quite well. I have the granular traversal piece working the way I want, as I described in Traversing Samples with Granular Synthesis.

Now I’m working on another feature of the application designed to let the user play samples with a rotary dial, not unlike manually spinning a record on a turntable. Getting the basics going was pretty simple, but I also wanted to be able to spin the dial and have it continue rotating based on the acceleration applied. Second, I wanted a slider that adjusts the amount of friction, from frictionless to instant braking.

This essentially involved physically modeling the control to behave like a turntable or other spinning device. After trying four or five techniques using standard Max objects I managed to get it working, but it wasn’t pretty. Instead I decided to try using a few lines of JavaScript to do the calculations and adjust the position of the dial. This worked much better and only required about 35 lines of code. The best way to illustrate this application will be with video. I’ll shoot a few minutes to get the point across and share it here soon. For now here’s a recording made with the modeled controller I described and just a small amount of friction.
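The patch’s actual code isn’t shown here, but the idea can be sketched in plain JavaScript (this is a standalone illustration, not the code from the patch, and not written for Max’s js object; the names `Dial`, `flick`, and `tick`, and the 0–1 friction scale are my own assumptions): a flick sets an angular velocity, and each scheduler tick decays that velocity by a friction coefficient before advancing the dial angle.

```javascript
// Hypothetical momentum-and-friction model for a rotary dial.
// friction = 0 -> frictionless (spins forever)
// friction = 1 -> instant braking

class Dial {
  constructor(friction = 0.02) {
    this.position = 0;        // dial angle in degrees
    this.velocity = 0;        // degrees per tick
    this.friction = friction; // 0..1, set from the friction slider
  }

  // Apply an impulse when the user flicks the dial.
  flick(degreesPerTick) {
    this.velocity = degreesPerTick;
  }

  // Call once per scheduler tick (e.g. driven by a metro):
  // decay the velocity, settle near zero, advance the angle.
  tick() {
    this.velocity *= 1 - this.friction;
    if (Math.abs(this.velocity) < 0.001) this.velocity = 0;
    this.position = (this.position + this.velocity) % 360;
    return this.position;
  }
}

// Example: flick the dial, then let friction slow it down.
const dial = new Dial(0.1);
dial.flick(30);
dial.tick(); // velocity decays toward 27
dial.tick(); // and then toward 24.3
```

In a patch, the returned angle on each tick would drive both the on-screen dial position and the sample playback rate, so the audio slows as the dial does.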

Percussion Loop Spinning

This entry was posted in One Max Patch Per Week, Performance, and Sound Design by John CS Keston. Bookmark the permalink.

About John CS Keston

John CS Keston is an award-winning transdisciplinary artist reimagining how music, video art, and computer science intersect. His work both questions and embraces his backgrounds in music technology, software development, and improvisation, leading him toward unconventional compositions that convey a spirit of discovery and exploration through the use of graphic scores, chance and generative techniques, analog and digital synthesis, experimental sound design, signal processing, and acoustic piano. Performers are empowered to use their phonomnesis, or sonic imaginations, while contributing to his collaborative work. Originally from the United Kingdom, John currently resides in Minneapolis, Minnesota, where he is a professor of Digital Media Arts at the University of St Thomas. He founded the sound design resource AudioCookbook.org, where you will find articles and documentation about his projects and research. John has spoken, performed, or exhibited original work at New Interfaces for Musical Expression (NIME 2022), the International Computer Music Conference (ICMC 2022), the International Digital Media Arts Conference (iDMAa 2022), International Sound in Science Technology and the Arts (ISSTA 2017-2019), Northern Spark (2011-2017), the Weisman Art Museum, the Montreal Jazz Festival, the Walker Art Center, the Minnesota Institute of Art, the Eyeo Festival, INST-INT, Echofluxx (Prague), and Moogfest. He produced and performed in the piece Instant Cinema: Teleportation Platform X, a featured project at Northern Spark 2013. He composed and performed the music for In Habit: Life in Patterns (2012) and Words to Dead Lips (2011) in collaboration with the dance company Aniccha Arts. In 2017 he was commissioned by the Walker Art Center to compose music for former Merce Cunningham dancers during the Common Time performance series. His music appears in The Jeffrey Dahmer Files (2012) and he composed the music for the short Familiar Pavement (2015). He has appeared on more than a dozen albums, including two solo albums on UnearthedMusic.com.

2 thoughts on “Physically Modeling Multitouch Controls”

  1. Pingback: Audio Cookbook » Blog Archive » Five Output Atemporal Looper

  2. Pingback: Audio Cookbook » Blog Archive » Multitouch Rotary Dial and X-Y Granular Exploration
