Low Latency Wired MIDI with iPad, Bitstream 3X, and QuNexus

Like many of you, I have experimented with MIDI over WiFi on the iPad, mainly so that I can use a proper keyboard to play some of the splendid virtual instruments available on iOS. There are two problems with this approach. First, connecting to the iPad this way requires a computer on the same network, or an ad hoc network created just for the MIDI I/O. Second, although it can be stable, it is usually slower and suffers from wireless network congestion, hence more latency than a wired connection.

So what are the options? Over at CDM there’s a really great article from Nicolas Bougaïeff, the creative director at Liine (makers of the Lemur app and LiveControl), that explores a wide range of possibilities. My intent here is to share what I have found works for me with a minimal investment in iOS-specific hardware: the Camera Connection Kit (CCK). The CCK essentially provides USB I/O for the iPad, allowing class compliant MIDI devices to be connected and used. This works great if you have such a device and all you want to do is use a hardware controller with the iPad.

The problem with the CCK is threefold – integration into more complex MIDI setups, class compliance, and power restrictions. First, most USB MIDI devices have only a single USB port, so you can’t have a computer connected to the same controller to record the MIDI or otherwise interface with iPad apps. As a result, two MIDI interfaces are required – one for the iPad and another for the computer – for wired communication between them to be possible.

Second, many iOS hardware solutions are costly and impractical, especially if you’d rather make use of equipment that most experienced producers have already invested in – hardware that could work perfectly if only there were drivers available for iOS (I can dream, right?). One of my MIDI keyboard controllers, for example, is not class compliant over USB (an admittedly crappy CME UF7).

Third, beyond Apple’s sandboxing of iOS, even if your device is class compliant, Apple has set a very low limit on the power draw (10–100 mA), which often triggers an annoying error stating “The connected device requires too much power.” Fortunately there’s a hack/workaround for this problem: simply attaching an unpowered USB hub between the CCK and your MIDI controller prevents the error from popping up and allows you to use your class compliant hardware. As an added benefit, the USB hub lets you use multiple devices with your iOS device, as long as they are either powered on their own or can draw enough power from the iPad/iPod Touch/iPhone. Side note: this works on iOS 6, but I do not know if it works on iOS 7 since I’m still waiting to do the update, so I’d love to hear from someone who has tried it.

Lately I have been researching potential solutions, and digging through my old MIDI hardware to see what kind of setup will work best for me to take advantage of my favorite iOS music apps. Recently the thought occurred to me that perhaps I could use the Bitstream 3X MIDI controller (BS3X) with the iPad. I bought the BS3X back in the spring of 2011 specifically to use as a controller for my Roland Super Jupiter MKS-80 analog synth, so it is consistently available in my studio setup.

The BS3X is one of the most flexible and programmable MIDI controllers ever designed. It has traditional MIDI I/O, including an in, two outs, and a thru jack. It’s even got a pre-MIDI Sync-24 jack. More important to this conversation, it has class compliant USB MIDI I/O. The BS3X is designed to be an interface and controller in one, and it works perfectly to bridge my old hardware, iPad, and MacBook Pro together. Since everything is wired, there’s little to no noticeable latency when playing a synth app with any MIDI keyboard plugged into the BS3X. Also, thanks to the plethora of I/O options on the BS3X, I can use the iPad and a collection of hardware controllers (including my non-class compliant devices) with or without a computer in the chain. Furthermore, integrating the iPad into this setup does not interfere with the BS3X controlling the Roland MKS-80, because those controls send system exclusive (sysex) messages on the channel I have dedicated to the MKS-80.

In the video I have focused on illustrating how one might use two iPad synthesizer apps and a hardware synthesizer together: Cassini, Sunrizer, and the MKS-80. The BS3X is used as both the iPad interface and the MKS-80 controller. No computer is required, but a simple change of cable allows a computer to be integrated into this setup, because my MOTU UltraLite interface and standalone mixer has MIDI I/O. In other words, two MIDI interfaces are still necessary with a computer, but prior to this experiment I was only using the BS3X as a controller for the MKS-80 and bypassing its class compliant USB MIDI interface functionality. Since the USB hub was required anyway, I also added the QuNexus to the setup, dedicated to feeding notes into the arpeggiator in Cassini. The keyboard controller was split so that in the low end I could play the MKS-80 effect, then tweak it with the BS3X knobs and sliders as it decayed. In the upper end of the same keyboard I played a lead sound programmed in Sunrizer.
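The keyboard split described above boils down to routing each incoming note to a different MIDI channel based on its note number. Here is a minimal Python sketch of that idea; the split point and channel numbers are my own assumptions for illustration, not the actual BS3X configuration:

```python
# Keyboard split: notes below the split point are forwarded to the channel
# driving the MKS-80 patch; notes at or above it go to the channel the
# Sunrizer lead listens on. All three constants are hypothetical.

SPLIT_POINT = 60      # middle C; assumed split point
MKS80_CHANNEL = 1     # hypothetical channel for the MKS-80
SUNRIZER_CHANNEL = 2  # hypothetical channel for Sunrizer

def route_note(note):
    """Return the MIDI channel a note-on message should be forwarded to."""
    if note < SPLIT_POINT:
        return MKS80_CHANNEL
    return SUNRIZER_CHANNEL
```

So a low C (note 48) would be routed to the MKS-80 channel, while a high C (note 72) would drive the Sunrizer lead.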

iPad with QuNexus and AudioBus Demo

For the last few weeks I have been in the midst of remodeling my mixing studio, leaving me with no place to work on electronic music. Soon I’ll have my studio back, but whilst I’m waiting for the flooring to arrive I decided to take the opportunity to explore using the iPad exclusively (with various apps and the QuNexus) to produce and perform tracks.

I’ve been looking at a long list of apps, but for this video I chose to narrow things down to AudioBus, Samplr, Sunrizer, and Propellerheads Figure (not shown in the video). The process started in Figure, where I created a drum pattern, pizzicato bass line, and synth melody. I then used Sonoma AudioCopy/Paste to get the parts into Samplr on separate tracks. Next I recorded an arpeggio from Sunrizer into Samplr that I used in two of the slots, but set up quite differently from each other. Finally I recorded a bit of vocal straight in via the internal mic to use as an effect. During the performance I used the touchscreen to manipulate the layers and adjust the processing in Samplr, then I used the QuNexus connected via the iPad Camera Kit to play a mono synth patch in Sunrizer.

For me this experiment serves as a proof of concept. The audio quality coming out of the iPad is not great, but it’s not bad either, and it can be improved by adding an audio interface. Latency was hardly noticeable with the 4th generation iPad that I’m using. In another experiment I even had two controllers connected to the Camera Kit with a USB hub, both running on bus power. Unfortunately this did cause the “connected USB device uses too much power” error to come up occasionally, but a powered USB hub will solve that.

So what’s the prognosis? In my view the iPad is approaching the professionalism of some laptop setups. If you need an ultraportable system, there’s going to be some subset of apps and hardware out there that will serve most purposes. More interesting than that is the range of applications and the innovation that comes along with many of them as a result of the multitouch interface. Samplr, for example, is really a pleasure to use and features seven different play modes that allow you to interact with the audio in unique and addictive ways. Beyond being just a scratch pad, setups like these are a way to change one’s perspective and try a new approach to music making.

Is anyone else discovering intriguing new ways to produce using mobile devices? Whether it’s the iPad, OP-1, Android tablet, or harmonica, I would love to hear what you’re learning and what you’re producing with these sorts of techniques.

Generative Sequence Driving MDA JX10 Emulator

I created the following generative sequence using GMS (click for details) during a solo performance at the Spark Festival of Electronic Music and Art, October 2010. One of the virtual instruments I used in the set is an open source Roland Super JX10 emulator made by MDA. The Roland Super JX10 was one of the last great analog poly-synths produced by Roland, and the first Roland synth to receive velocity and aftertouch treatment on its 76-key keyboard. Although I never owned one of these, I have played one before, and I imagine that programming them was brain surgery without the optional PG-800 programmer. In the documentation for the MDA JX10 they state, “[this] plug-in is designed for high quality (lower aliasing than most soft synths) and low processor usage – this means that some features that would increase CPU load have been left out”. To me this plugin sounds very good. I’d like to hear from anyone who owns or has played a Roland Super JX10 for their perspective on this instrument.

MDA JX10 Emulator

Robot Conspiracy

I can’t seem to get enough robot action these days. Robots have lots of personality. Much more than politicians who convene in St. Paul. I used a similar technique to get this sound as I did for Robot Music. This time, however, I did a bit of processing after the fact, including pitching the recording down thirteen steps. Why thirteen? Because thirteen is a cool number. It’s subversive and pagan and not a floor in lots of buildings. I also added some standard reverberation and automated some delay at the end to please my sense of aural space.
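For the curious, pitching a recording down thirteen semitones corresponds to a fixed playback-rate ratio in equal temperament – a quick Python check of the arithmetic:

```python
# Playback-rate ratio for a pitch shift of n semitones (equal temperament):
# ratio = 2 ** (n / 12). Thirteen semitones down lands just past an octave,
# so the ratio dips slightly below 0.5 (a full octave down would be exactly 0.5).

def pitch_ratio(semitones):
    """Playback-rate ratio for a given semitone shift (negative = down)."""
    return 2 ** (semitones / 12)

print(round(pitch_ratio(-13), 4))  # ≈ 0.4719
```

In other words, the recording plays back at a little under half speed, which is what gives the robots that extra-subversive growl.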

Robot Conspiracy

Robot Music

I produced this sound by playing one note in a virtual instrument called “Harmonic Dreamz”, which is part of Pluggo by Cycling ’74. After that I automated random patch changes so that all twenty-eight parameters included in the Harmonic Dreamz instrument were flying all over the place, creating a frenetic passage of electronic mayhem. Then I arpeggiated the note with some slight randomness in the pattern and ended up with this.
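The process above – randomizing every parameter while arpeggiating a single held note – can be sketched roughly in Python. The parameter count comes from the post; the function names, value ranges, and interval set are my own inventions for illustration, not how Pluggo actually works:

```python
import random

NUM_PARAMS = 28  # Harmonic Dreamz exposes twenty-eight parameters

def random_parameter_frame(rng):
    """One 'frame' of automation: a random value (0-127) for every parameter."""
    return [rng.randint(0, 127) for _ in range(NUM_PARAMS)]

def arpeggiate(base_note, intervals, steps, rng, jitter=0.2):
    """Cycle through intervals above base_note, occasionally skipping ahead
    to give the pattern the 'slight randomness' described in the post."""
    notes = []
    i = 0
    for _ in range(steps):
        if rng.random() < jitter:
            i += 1  # skip a step in the cycle
        notes.append(base_note + intervals[i % len(intervals)])
        i += 1
    return notes

rng = random.Random(13)
frames = [random_parameter_frame(rng) for _ in range(4)]   # parameter mayhem
pattern = arpeggiate(60, [0, 4, 7, 12], 16, rng)           # jittered arpeggio
```

Feeding each frame of parameter values to the instrument while the arpeggio plays is, in spirit, what the automation was doing.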

To me it sounds as if it could be speech or perhaps singing in a robot language. I recorded several examples of it. Some of the other examples have slight variations and others have significant variations, so I may post some other versions at some point. This recording is in mono with no processing. The output is exactly what the virtual instrument produced given the parameters sent to the device.

Robot Instigator