Wall of Synth with Juno-106, Volca Keys, Tempest, and Bass Station II

This is one of the first recordings I made after putting the final touches on my recent home studio remodel. I went from dark wood panelling and old carpeting to cork flooring and drywall with fresh white paint and acoustic panels. The room sounds better, looks brighter, and feels like a proper studio. To finish things off I installed a five-tier shelving system to house a wall of my favorite synthesizers.


For this track I wanted to try using the Korg Volca Keys as a sequencer for my Juno-106. “How is that possible?!?” you might ask. Well, I’ll tell you how. I have modded the Volca Keys to include a MIDI out port. Using this mod I sent the MIDI out from the Volca Keys to the MIDI in on the Juno-106. Both synths played the same sequence, but because the Volca Keys is polyphonic the Juno-106 consistently played chords, while I switched voice modes on the Volca Keys between poly, unison, fifths, and so on. I fleshed out the piece with an arp from the Bass Station II and a bass line from the Tempest.
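The trick above works because MIDI is a very simple serial protocol. As a hedged illustration (this sketch is mine, not part of the hardware setup described in the post, and the notes and velocity are arbitrary assumptions), here is what the three-byte note-on messages travelling over that modded MIDI out look like when a polyphonic sequencer plays a chord:

```python
# Illustration only: the actual routing in the post is a hardware MIDI
# cable from the modded Volca Keys into the Juno-106. This just shows
# the raw bytes of a MIDI note-on message.

NOTE_ON = 0x90  # status nibble for note-on; low nibble is the channel

def note_on_bytes(note, velocity=100, channel=0):
    """Build a raw 3-byte MIDI note-on message."""
    return bytes([NOTE_ON | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# A chord (here a C minor triad) is just a burst of note-on messages,
# one per voice -- which is why the Juno-106 hears chords.
chord = [60, 63, 67]  # MIDI note numbers: C4, Eb4, G4
for n in chord:
    print(note_on_bytes(n).hex(" "))
```

Each message is three bytes: status (note-on plus channel), note number, and velocity, all with the top bit clear except the status byte.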

Multitouch on a Mac without a Touchscreen


As you may have noticed, it’s been a little quiet around here lately. The reason is that since the end of 2013 I have been keeping myself busy with a studio remodel (more on that later), followed by preparations for a performance that I’m pleased to announce at Echofluxx 14 in Prague, May 2014. Here’s a quick note about Echofluxx 13 from their site:

The Echofluxx 13 is a festival of new media, visual art, and experimental music produced by Efemera of Prague with Anja Kaufmann and Dan Senn, co-directors. In cooperation with Sylva Smejkalová and early reflections, the Prague music school (HAMU), and Academy of Fine Arts (AVU), Echofluxx 13 will present international and Czech presenters in a five-day festival at the Trafačka Arena in Prague, Kurta Konráda 1, Prague 9, May 7-11, 2013. For more information contact: info@echofluxx.org

I’ll discuss more details about this upcoming performance in another post. For now I would like to bring attention to the possibility of using the Mac trackpad and/or Apple’s Magic Trackpad for multitouch. My performance at Echofluxx involves using custom-built software to loop granular audiovisual media. The idea evolved from past projects that used a 32″ touchscreen. This time the media will be projected, so I naturally decided to use the iPad as the controller. I built the project using Cycling ’74 Max and MIRA, which was very convenient, but I couldn’t get over the latency of using the iPad over WiFi for multitouch control.

I decided that the most convenient alternative would be to use the trackpad on my Mac laptop. Max has an object called “mousestate” that polls button-status and cursor-position information from the default pointer device. However, it is not designed to take advantage of multitouch data. This is where Fingerpinger comes in. Fingerpinger was able to detect ten independent touch points (perhaps more, but I was out of fingers) on the built-in trackpad of my MacBook Pro. Which raises the question: how did I take that screenshot?

Ten touch points on such a small surface is probably impractical, but I only need two: one for X-Y data and a second for volume. Most importantly, I wanted the audiovisual content to be activated simply by touching the trackpad rather than having to click or hold down a key. Fortunately, Fingerpinger reports a state value for each touch point, which I was able to use to trigger events on touch and release. The latency is hardly noticeable compared to an iPad over WiFi, and I have also simplified my setup, meaning I can travel with less equipment and rely on fewer technologies. I still like the idea of using an iPad for multitouch control, mostly because of the opportunities for visual feedback, but for this particular application Fingerpinger is a great solution.
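The touch/release triggering described above boils down to edge-detecting each touch point's state value from frame to frame. Here is a minimal sketch of that logic in Python (the actual patch is in Max, and the state encoding here, zero meaning "not touching" and nonzero meaning "touching", is a simplifying assumption, not Fingerpinger's exact protocol):

```python
# Minimal sketch: fire a callback when a touch point goes down or lifts,
# by comparing each frame's state value against the previous one.
# Assumption (not Fingerpinger's real encoding): state 0 = not touching,
# any nonzero state = touching.

def make_touch_tracker(on_touch, on_release):
    """Return an update(touch_id, state) callback that edge-detects touches."""
    active = {}  # touch id -> was it touching on the previous frame?

    def update(touch_id, state):
        was_down = active.get(touch_id, False)
        is_down = state != 0
        if is_down and not was_down:
            on_touch(touch_id)      # rising edge: finger just landed
        elif was_down and not is_down:
            on_release(touch_id)    # falling edge: finger just lifted
        active[touch_id] = is_down

    return update

events = []
update = make_touch_tracker(
    on_touch=lambda i: events.append(("touch", i)),
    on_release=lambda i: events.append(("release", i)),
)

# Simulated frames: finger 1 touches, holds for a frame, then lifts.
for touch_id, state in [(1, 1), (1, 1), (1, 0)]:
    update(touch_id, state)

print(events)  # [('touch', 1), ('release', 1)]
```

The repeated "touching" frame in the middle produces no event, which is exactly the behavior needed: the sound starts once on contact and stops once on release, no clicking or key-holding required.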

Music with Context: Audiovisual Scores for Improvising Musicians


Last May I completed my MFA in New Media at the Minneapolis College of Art and Design. A good stretch of my time at the college was spent working on my master’s thesis. Here’s the abstract:

This paper explores the idea of mutable, audiovisual scores for improvised musical performances through the description of personal perspectives, practical examples, proposed projects, and research. The author postulates that an audiovisual score can be a useful tool to connect improvising musicians to each other and their audience through the insertion of a mediating audiovisual layer within the work. These systems are used as a primary influential agent for an ensemble of improvisers, providing them with a context for a musical conversation. In contrast to traditional notation and graphic scores, audiovisual scores embrace the chaotic ambiguities of environmental influences, giving the music the context of unpredictable everyday events. Presenting an unpredictable audiovisual score parallels the indeterminate improvisation of the ensemble. It activates the last vestige of what remains immutable within traditional forms of notation-driven performance, inserting it into a mutable layer within the work.

Recently it occurred to me that many AudioCookbook readers will find the subject matter of my thesis interesting. It includes detailed conceptual explanations of many of the projects that I have shared here over the last few years, as well as references to work by many other artists who have inspired me. If you’re interested, please click the link below to view or download the document.

Music with Context: Audiovisual Scores for Improvising Musicians by John Keston

iPad-only Track with Samplr, iSEM, Figure, QuNexus


I started this track on the plane during a recent trip to Seattle, Washington. Since I would be away from my studio for a few days, I brought along my iPad and QuNexus to produce a few tracks while traveling. I decided to use Samplr as the foundation for this piece because I enjoy its simplicity and intuitive interface. It also offers a unique way of working, in contrast to mobile DAWs that are usually analogues of their PC-based counterparts.

I started out by experimenting with the kora samples included with one of the Samplr demos, arpeggiating and processing them in different ways. The rest of the sounds were produced with Propellerhead Figure and Arturia’s iSEM, which I played with the QuNexus. Once I had the tracks organized I performed the mix in Samplr while capturing the results in AudioShare.

Rule Based Electronic Music: Good Morning Mr. Paik


For the sixth piece in the series I left out the percussive layers. This was also a test of the MIDI output mod that I recently applied to the Volca Keys. The setup had the modded MIDI output of the Volca Keys going to the MIDI input of the Novation Bass Station II. To sync the Monotribe I ran the sync out from the Volca Keys to the Monotribe. This can be problematic because there is some crosstalk between the audio and sync out ports on the Volcas. Keeping the level of the Volca Keys just under full prevents stuttering on the Monotribe, but I’d like to find a better solution.

I also followed these rules:

1. No overdubbing; all tracks were recorded at the same time.
2. No computer sequencing; all sequencing was done on the instruments used.
3. No looping or shuffling parts in post, although editing for length and content was allowed.
4. One reverb send, one delay send, and fades were allowed; no other processing.
5. No mix tricks in post; reverses and rolls were performed live.