Here’s a video of me playing with my Gestural Music Sequencer. I’ll upload a better version at some point, but I think you can at least get a sense of how you might use this kind of tool (I’m talking about the sequencer, not the performer). As you can see, the video has been mirrored so it’s easier to follow your own movements.
To reiterate how the sequencer works: the X coordinate of the brightest pixel determines the pitch, while the Y coordinate determines the dynamics. The application outputs MIDI data that I’m routing to Reason. I’ve programmed the up and down arrows on the keyboard to increase or decrease a multiplier that, along with the 15 fps frame rate, determines the time between each note-on event. I’ve also enabled a group of keys to adjust the transposition.
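The mapping above can be sketched roughly like this. This is just an illustration of the idea, not the sequencer’s actual code: the frame size, the scale, and the two-octave pitch range are all my own assumptions.

```python
# Illustrative sketch of the brightest-pixel-to-MIDI mapping described above.
# FRAME_W/FRAME_H, SCALE, and the pitch range are assumed values.

FRAME_W, FRAME_H = 640, 480      # assumed camera frame size
FPS = 15                         # frame rate mentioned in the post
SCALE = [0, 2, 4, 5, 7, 9, 11]   # assumed C-major scale (semitone offsets)

def pixel_to_midi(x, y, transpose=0):
    """Map the brightest pixel's position to a MIDI pitch and velocity."""
    # X axis -> pitch: spread two octaves of the scale across the frame width
    degree = int(x / FRAME_W * len(SCALE) * 2)
    octave, step = divmod(degree, len(SCALE))
    pitch = 60 + transpose + 12 * octave + SCALE[step]
    # Y axis -> dynamics: top of the frame is loudest
    velocity = int((1 - y / FRAME_H) * 127)
    return pitch, velocity

def note_interval(multiplier):
    """Seconds between note-on events: the arrow keys adjust the
    multiplier, which scales the 15 fps frame period."""
    return multiplier / FPS
```

So a bright spot at the top-left corner would produce middle C at full velocity, and raising the multiplier stretches the gap between notes.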