Developer
Intelligent Accompanist
Introduction
Intelligent Accompanist is a performance system for generative accompaniment in real time.
Description
This system is created in Unity 3D and utilizes a PureData patch for real-time pitch detection. By analyzing the user's notes, the system derives the negative harmony and note lengths used to play back an accompanying melody. Visually, the system consists of a sphere that acts as a first-person character and a cluster of cubes that represent the notes played by the AI instruments; when a note is played, the sphere moves toward the corresponding cube. The system also includes a granular synthesizer that can record and granularize samples during a performance. This synthesizer is controlled by MiMu Gloves through OSC messages received by Unity, which allow the performer to adjust its effect parameters.
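As a rough sketch of the harmonic mapping, each pitch class can be reflected around the negative-harmony axis of the current key (the point midway between the tonic's minor and major third). The class and method names below, and the assumption that the tonic is tracked elsewhere, are illustrative rather than details of the actual implementation.

    // Minimal sketch: reflect a MIDI note around the negative-harmony axis of the current key.
    public static class NegativeHarmony
    {
        // tonicPitchClass: 0 = C, 1 = C#, ... 11 = B
        public static int Reflect(int midiNote, int tonicPitchClass)
        {
            int octave = midiNote / 12;
            int pitchClass = midiNote % 12;
            // Mirror each pitch class around the axis between the minor and major
            // third above the tonic: p -> (2 * tonic + 7 - p) mod 12.
            int mirrored = ((2 * tonicPitchClass + 7 - pitchClass) % 12 + 12) % 12;
            // Keeping the original octave is one possible choice for octave handling.
            return octave * 12 + mirrored;
        }
    }

In C, for example, this mapping sends E to E-flat, A to A-flat, and the dominant G back to the tonic C.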
Bela Glove
Introduction
Bela Glove is a MIDI glove controller using the Bela platform.
Sensor Design
The glove has four flex sensors that act as variable resistors, wired into analog input pins 0 through 3. It also includes an accelerometer that measures the orientation of the glove; its X, Y, and Z outputs are connected to analog inputs 4 through 7.
Performance Implementation
The data gathered from the analog inputs is mapped to the MIDI range 0-127 and sent as continuous controllers into Max and TouchDesigner. The Max patch uses drunken randomness and a quantizer to play notes, with the first two flex sensors controlling the rate. The third and fourth flex sensors control amplitude and filter cutoff, and the accelerometer adjusts the filter resonance. The same data is also received by TouchDesigner, which adds a rigged 3D model of a hand, an underwater environment, and particle effects to make the performance more visually interesting.
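At its core this mapping is just scaling and clamping each normalized sensor reading to the 0-127 controller range. The sketch below shows that step in C#, to stay consistent with the Unity examples elsewhere in this portfolio; the real project performs it on the Bela, and the names here are illustrative.

    using System;

    // Illustrative helper (not from the project): scale a normalized sensor
    // reading to a 7-bit MIDI continuous-controller value.
    public static class SensorToMidi
    {
        public static int ToControllerValue(float normalized)
        {
            float clamped = Math.Clamp(normalized, 0f, 1f); // guard against out-of-range readings
            return (int)MathF.Round(clamped * 127f);
        }
    }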
MIDI Painter
Introduction
The idea behind this project largely came from Kentaro Suzuki's M4L device "LFO Sketch", in which drawn lines become an LFO of sorts. I wanted to expand on this idea of drawing for sonic control via touch devices like the iPhone and iPad.
Design Process
To create the app, I started by looking into line-drawing apps and games that use user-drawn lines to designate a path for game objects to follow. SpriteKit turned out to be the most sensible framework for the app, so I built it around a set of functions for creating lines, circles, and actions. Lines are created from pathArray, an array of CGPoints that is appended to every time the user moves a finger on the screen. Once the user lifts their finger, the line is created along with a corresponding circle that follows the line's path via the SKAction "follow". On the user's next touch down, pathArray is cleared, and MIDI messages that track the circle's position are sent. The touchUp function also contains the logic for removing lines: to avoid confusion between creating and removing a line, the program checks whether the user is touching an existing line before creating a new one. If the user is touching an existing line, the remove function is called instead; it removes the line and circle from their parent node and removes the circle from circleArray so that the CC numbers stay up to date.
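The app itself is written in Swift against SpriteKit, but the position-to-controller conversion it performs can be summarized in a few lines. The sketch below is in C# for consistency with the other examples here; the base CC number and the one-controller-per-circle scheme are assumptions, not details taken from the app.

    using System;

    // Illustrative only: turn a follower circle's horizontal position within
    // the scene into a MIDI controller number and value.
    public static class CircleToCC
    {
        public static (int cc, int value) Map(float x, float sceneWidth, int circleIndex)
        {
            int cc = 20 + circleIndex;  // hypothetical base CC number, one per circle
            int value = (int)MathF.Round(Math.Clamp(x / sceneWidth, 0f, 1f) * 127f);
            return (cc, value);
        }
    }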
Augmented Viola
Description
A MIDI controller embedded into a viola, using Arduino on the Teensy 3.2 microcontroller and Trill Craft touch sensors.
How It Works
This controller sends continuous MIDI note values using an ultrasonic distance sensor and Bela's Trill Craft sensor. The Trill Craft is attached to copper tape on the fingerboard of the viola to detect MIDI note values.
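The firmware runs as an Arduino sketch on the Teensy, but the idea of turning a continuous distance reading into a note number can be outlined briefly. The sketch below is in C# for consistency with the other examples; the sensing range and note span are assumed values, not measurements from the instrument.

    using System;

    // Illustrative mapping (not from the project): convert an ultrasonic
    // distance reading in centimeters into a MIDI note number.
    public static class DistanceToNote
    {
        const float MinCm = 5f, MaxCm = 60f;   // assumed usable sensing range
        const int LowNote = 48, HighNote = 72; // assumed two-octave note span

        public static int Map(float distanceCm)
        {
            float t = Math.Clamp((distanceCm - MinCm) / (MaxCm - MinCm), 0f, 1f);
            return LowNote + (int)MathF.Round(t * (HighNote - LowNote));
        }
    }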
Procedural Pinball
Description
A pinball game made in Unity3D, using procedurally generated sounds.
How It Works
A simple 3D pinball game, this project experiments with procedurally generated sound in Unity. Using the OnAudioFilterRead callback, I was able to create an oscillator in C# code. When the ball interacts with colliders, it triggers randomized note values that are then quantized to a scale and sent to the oscillator script.
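A minimal sketch of this approach is shown below. OnAudioFilterRead is Unity's real audio callback; the field names, scale choice, and gain are illustrative rather than values taken from the project.

    using UnityEngine;

    // Sketch of a procedural sine oscillator driven by Unity's OnAudioFilterRead.
    // Attach to a GameObject with an AudioSource; collision code elsewhere would
    // call TriggerRandomNote() when the ball hits a collider.
    [RequireComponent(typeof(AudioSource))]
    public class ProceduralOscillator : MonoBehaviour
    {
        static readonly int[] MajorPentatonic = { 0, 2, 4, 7, 9 };  // assumed scale

        public double frequency = 440.0;
        public float gain = 0.1f;

        double phase;
        double sampleRate;

        void Awake()
        {
            sampleRate = AudioSettings.outputSampleRate;
        }

        // Pick a random MIDI note, snap it to the scale, and convert it to Hz.
        public void TriggerRandomNote()
        {
            int octave = Random.Range(4, 6);
            int degree = MajorPentatonic[Random.Range(0, MajorPentatonic.Length)];
            int midiNote = 12 * octave + degree;
            frequency = 440.0 * System.Math.Pow(2.0, (midiNote - 69) / 12.0);
        }

        // Runs on the audio thread; fills the output buffer with a sine wave.
        void OnAudioFilterRead(float[] data, int channels)
        {
            double increment = 2.0 * System.Math.PI * frequency / sampleRate;
            for (int i = 0; i < data.Length; i += channels)
            {
                phase += increment;
                if (phase > 2.0 * System.Math.PI) phase -= 2.0 * System.Math.PI;
                float sample = gain * (float)System.Math.Sin(phase);
                for (int c = 0; c < channels; c++)
                    data[i + c] = sample;
            }
        }
    }

Because OnAudioFilterRead runs on the audio thread, the collision code only needs to update the frequency field and the next audio buffer picks it up.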
Musical Playground
Description
An interactive, ambisonic composition created in Unity3D.
How It Works
This piece utilizes Unity's 3D audio features, including ambisonic audio, to create an immersive musical experience. The environment includes a ball that plays different ambient backgrounds based on the color of the floor it's touching. In addition, three instruments (a marimba, chimes, and drums) are placed in the environment, and all of them respond to the user's mouse cursor.
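A minimal sketch of the floor-driven ambience switching might look like the following; the tag names and the one-looping-AudioSource-per-color setup are assumptions about how the scene could be wired, not details from the project.

    using UnityEngine;

    // Sketch of floor-driven ambience switching. Attach to the ball; each floor
    // color is assumed to have its own tag and looping ambient AudioSource.
    public class FloorAmbience : MonoBehaviour
    {
        public AudioSource redAmbience;   // looping background for the red floor
        public AudioSource blueAmbience;  // looping background for the blue floor

        void OnCollisionEnter(Collision collision)
        {
            // Choose which ambient background to play based on the floor's tag.
            if (collision.gameObject.CompareTag("RedFloor"))
                Play(redAmbience);
            else if (collision.gameObject.CompareTag("BlueFloor"))
                Play(blueAmbience);
        }

        void Play(AudioSource target)
        {
            if (target.isPlaying) return;
            redAmbience.Stop();
            blueAmbience.Stop();
            target.Play();
        }
    }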