Radical Futures Performance Piece: Rhodonea

On Wednesday, May 8, I debuted a performance piece titled Rhodonea at the Radical Futures conference at University College Cork, Ireland. At the concert I had the privilege of sharing the bill with Brian Bridges, Cárthach Ó Nuanáin, Robin Parmar, and Liz Quirke.

Rhodonea is a series of audiovisual etudes performed as a model of how we might collaborate with near-future synthetic entities. Software feeds automated, algorithmic, projected visual cues, tempi, and low-frequency oscillations to improvising electronic musicians. The compelling visuals, based on Maurer roses, suggest melodic, harmonic, and percussive gestures that are modulated by data streaming from the generative animations. Throughout the piece the artists adapt to the familiar yet unpredictable graphic scores and corresponding signals. The result is an impression of how humans might interact with AI in a collaborative and experimental way.

I chose to perform Rhodonea as a soloist, although it can be performed by an ensemble of up to four musicians. The generative and improvisational aspects mean that every performance is different from the next, but the piece has a consistent signature that leads the music. This includes modulation corresponding to each rhodonea that is translated into MIDI data and fed to parameters that affect the timbre and other aspects of the instruments. I captured the video above the day after the performance; it illustrates the characteristics of the piece, which I developed in Processing.org.
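For readers curious about the underlying geometry, a rhodonea (rose) curve is defined by r = sin(nθ), and a Maurer rose connects points on that curve sampled at a fixed angular step of d degrees. The sketch below is a minimal Python illustration of that idea only; it is not the Processing code used in the piece, and the values of n and d are arbitrary examples.

```python
import math

def rhodonea(theta, n):
    """Radius of the rose curve r = sin(n * theta)."""
    return math.sin(n * theta)

def maurer_rose_points(n, d, steps=361):
    """Vertices of a Maurer rose: walk the rose curve in steps of d degrees.

    Connecting consecutive points with straight lines produces the web of
    chords that gives Maurer roses their characteristic look.
    """
    points = []
    for k in range(steps):
        theta = math.radians(k * d)          # angle for this vertex
        r = rhodonea(theta, n)               # radius from the rose curve
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Illustrative values only; not the parameters used in the performance.
vertices = maurer_rose_points(n=6, d=71)
print(vertices[:3])
```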

For this international performance I used four instruments inside Ableton Live 12, controlled by an Arturia KeyStep, to minimize the gear I needed to travel with. The Ableton instruments I used were Drift, two instances of Meld (a macrosynth new in Live 12), and Collision. In the video below you can see how the generative graphics are manipulating the filter in Drift.
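The exact routing between the animation data and Live's parameters isn't spelled out above, but the general idea of scaling a curve value into a MIDI control change that a filter cutoff can follow might look roughly like the Python sketch below. The mido library, CC number 74, and the default output port are my assumptions for illustration, not part of the actual setup.

```python
import math
import time
import mido  # assumes the mido library and a MIDI backend (e.g. python-rtmidi)

CUTOFF_CC = 74  # hypothetical CC number mapped to a filter cutoff in the DAW

def curve_to_cc(value):
    """Scale a rose-curve value in [-1, 1] to a MIDI CC value in [0, 127]."""
    return max(0, min(127, int(round((value + 1.0) * 63.5))))

with mido.open_output() as port:                  # default MIDI output port
    for k in range(361):
        r = math.sin(6 * math.radians(k * 71))    # same curve as the sketch above
        port.send(mido.Message('control_change',
                               control=CUTOFF_CC,
                               value=curve_to_cc(r)))
        time.sleep(0.02)                          # pace the messages rather than flooding the port
```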

Solo Electroacoustic Piano Teaser


This teaser for my show tonight at Jazz Central Studios includes a handful of snippets from my rehearsals of the electroacoustic piano pieces I’ll be performing. It’s all acoustic piano with processing that includes delays, live looping, freezing reverb decay, and reverse. The last snippet also includes the use of a felt mute between the hammers and the strings, which makes the tone much darker, softer, and more subdued.

Solo Electroacoustic Piano at Jazz Central Studios


I feel excited and privileged to be playing a concert of original piano compositions on November 23, 2016 at Jazz Central Studios in Minneapolis. The compositions include acoustic piano pieces (unaltered by processing or electronics) and a number of electroacoustic piano pieces that will involve manipulating the signal from the piano in real time. For example, I’ll be using an analog delay to create pulsing washes of sound from the piano. There will also be examples of sampling and looping the piano and then manipulating the loops through more processing.

Although ACB readers will be more familiar with my electronic work, acoustic piano is the instrument that has stayed with me since childhood. I even had a weekly jazz piano gig that lasted eleven years! This upcoming solo piano performance will be my first in over a decade, and it will be very different for me because I’ll be playing my own compositions instead of the jazz standards I used to play.

Jazz Central Studios has a great-sounding grand piano in house that I’ll be playing during the show. I have performed at the venue a number of times recently thanks to friend, bassist, and collaborator Casey O’Brien, who is one of the current artistic directors for the non-profit organization. Here’s some more information about JCS:

Jazz Central Studios (JCS) is a tax-exempt nonprofit organization committed to strengthening the Twin Cities jazz community by offering a live performance/educational environment that nurtures artistic growth. Our space consists of 1800 square feet and seats up to 50 people. It is complete with a grand piano, house drum set, PA, and lights.

In 2010, local jazz musicians Mac Santiago and Tanner Taylor established Jazz Central Studios as a rehearsal and recording space for Twin Cities jazz musicians. We encourage jazz patrons and musicians of all levels to become a part of Jazz Central Studios. Whether you want to develop your skills and career as a performing musician or you want to meet other jazz enthusiasts and support the local scene, there is a place for you here.

The music on November 23, 2016 will start at 8:30pm. The suggested donation is $10 for general admission and $5 for students. I hope you’ll join me! Here’s an excerpt from a recording of me (piano, electronics), Cody McKinney (bass, voice, electronics), and Graham O’Brien (drums, electronics) made at JCS by Diego Ramallo. In the recording I’m using a sampler to live-loop piano layers and then run things through delays and other processing.

Interview: Graham O’Brien’s Drum Controller Video Series

Drum Controller - Graham O'Brien

Graham O’Brien is an exceptional and inventive drummer, composer, and producer. It has been my privilege to play with him at dozens of shows and on at least five separate projects over the last eight years. His latest solo endeavor is a series of five videos titled Drum Controller. Graham had discussed his goals for the project with me, but when I saw and heard the videos I was immediately impressed. I wanted to know more about how he was able to trigger these beautiful and complex electro-acoustic arrangements without touching anything other than his minimal kit of kick, two snares, hi-hats, and a ride.

Note: Graham will be performing music with his Drum Controller setup and Thomas Nordlund on guitar at Honey in Minneapolis this Sunday, June 5, 2016. Read on for the interview and a look at his video series.

TX81Z Patch Degrader with Interpolation

This quick demo illustrates how TX81Z Patch Degrader interpolates between previous and newly generated parameter values. TX81Z Patch Degrader is a Max for Live MIDI effect that chips away at patches on the TX81Z by randomly changing (or degrading) parameters at a specified rate. What makes the process interesting is that it is possible to ramp up or down (interpolate) to each new value rather than changing it instantaneously.

To create the Max for Live MIDI effect I started with TX81Z Editor 1.0 by Jeroen Liebregts, who was kind enough to share his work on maxforlive.com. I added the degradation features and made some adjustments to the interface to make room for the new controls. Once I get things shaped up I’ll be happy to share the patch if anyone is interested.

Screenshot: the TX81Z Patch Degrader Max for Live MIDI effect

The features I added are visible in the second panel of the TX81Z Patch Degrader Max MIDI effect. I’ll describe them from the top down:

  1. Level bypass prevents the operator levels from being included in the degradation process so that the sound doesn’t completely die out.
  2. When the interpolate switch is on, parameters (as long as they have an adequate range) ramp up or down to their new values based on the rate (a rough sketch of this behavior follows the list).
  3. Loop causes the degradation to continue indefinitely by reshuffling after all 73 included parameters have been degraded.
  4. Free/sync toggles between changing the parameters at an arbitrary pace set by rate and changing them on note divisions based on the project’s tempo (so sync only degrades while the transport is playing).
  5. Rate adjusts the rate of degradation when in free mode, and the time it takes to ramp up or down to new values when interpolate is on. Rate is in milliseconds and ranges from 15ms to 2000ms.
  6. Below rate are the note durations for sync mode ranging from a 1/128th note up to a dotted whole note.
  7. Finally, the degrade button starts the process, while interrupt stops everything so that when you hear something you like you can save the patch on the TX81Z.
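The actual device is a Max for Live patch, but the degrade-and-interpolate behavior described above can be approximated in a short Python sketch. The parameter names, ranges, and the send_parameter stand-in below are hypothetical; a real implementation would transmit TX81Z parameter changes as system exclusive messages, which are omitted here.

```python
import random
import time

# Hypothetical subset of degradable parameters: name -> (min, max, current value)
params = {
    'op1_level': (0, 99, 90),
    'op2_frequency_ratio': (0, 63, 12),
    'feedback': (0, 7, 3),
    'lfo_speed': (0, 99, 35),
}

RATE_MS = 250        # free-mode rate: one degradation roughly every 250 ms
INTERPOLATE = True   # ramp toward new values instead of jumping
LOOP = True          # reshuffle and keep degrading indefinitely

def send_parameter(name, value):
    """Stand-in for transmitting the value to the synth (sysex omitted here)."""
    print(f'{name} -> {value}')

def degrade_once(name):
    lo, hi, current = params[name]
    target = random.randint(lo, hi)                  # new random value in range
    if INTERPOLATE and abs(target - current) > 1:
        step = 1 if target > current else -1
        for v in range(current + step, target + step, step):
            send_parameter(name, v)                  # ramp toward the target
            time.sleep(RATE_MS / 1000 / abs(target - current))
    else:
        send_parameter(name, target)                 # change instantaneously
        time.sleep(RATE_MS / 1000)
    params[name] = (lo, hi, target)

while True:  # runs until interrupted, like leaving the process going with loop on
    for name in random.sample(list(params), len(params)):  # shuffled order
        degrade_once(name)
    if not LOOP:
        break
```

Setting INTERPOLATE to False reproduces instantaneous jumps, and turning LOOP off lets the process stop after a single shuffled pass through the parameters.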

The TX81Z has a fairly small buffer for MIDI values, so spraying values at it too quickly will generate the “MIDI Buffer Error”. However, even after getting the error it will continue listening to the incoming data, so even though it might be skipping a parameter here and there it lets me keep throwing things at it. The video below shows how the LCD display responds to the stream of values coming at the machine.

TX81Z Patch Degradation with Interpolation! #glitch #fmsynthesis (video posted by John Keston, @jkeston)

I’ve saved quite a few very interesting effects so far and have nearly run out of the 32 patch positions available on the unit. Perhaps the next step is to add a library feature especially since I’m not thrilled about the idea of saving patch banks to cassette!
