Peter Coombs
Audiovisual Interactive Instrument
This audiovisual interactive instrument (A.I.I) uses a Microsoft Kinect, taking its tracking data and translating it into control values. These values allow the user to interact with the visuals the instrument creates, along with the audio. For best results the A.I.I should be used in conjunction with an external webcam and two monitors. Used together, Synapse and the patch Kinect-Via-Synapse allow the A.I.I to gather the relevant data from the Kinect and translate it into X and Y coordinates, which are then applied to the visuals.
HOW TO USE
Before opening the required patches, ensure that your Microsoft Kinect is plugged in and turned on, then run the program Synapse. Next, for best results, place the external webcam directly above the Kinect. If you are unable to do this, angle your built-in webcam towards the area in which you will be standing.
Then run the patch Kinect-Via-Synapse and enable ‘Track all Joints’; then run the A.I.I patch. Move the OpenGL pop-up window over to the second monitor and press ‘esc’ to enter fullscreen. Next, turn on DSP and the patch; strike the psi pose (standing with both arms raised above the head) and the patch will start tracking and the visuals will initiate.
Conception
The idea behind this project was to experiment with maximising the Kinect’s potential in more creative ways and to open the door to its use as a live performance instrument. Initially I looked into ways to access data from the Kinect so that it would work with Max/MSP. This is when I found Synapse, an application for Mac and Windows which allows easy use of the Kinect with Max/MSP and other software. After downloading Synapse I became aware of Kinect-Via-Synapse, a patch which does all the routing required for motion tracking; by using ‘send’ and ‘receive’ objects in Max I was able to control the variables within my patch. Next I wanted to implement visuals to work alongside the audio. Searching through Cycling ’74’s forums and tutorials, I found a patch which used Jitter physics to create a multi-particle display with stunning, colourful visuals whose aesthetic fits the audio. However, at this point I felt the patch needed something more to stand out and become specific to each individual user. So I thought about using an external webcam placed above the Kinect to capture red, green and blue (RGB) values, since Synapse and Kinect-Via-Synapse do not access the Kinect's RGB camera. The RGB data gathered could then be used to change the timbre of the audio by manipulating the presence of certain wave types, meaning each individual player would get a different experience depending on the clothes they were wearing.
How it Works
The A.I.I works by using three patches made in Max 7. The first two were found online: one handles the routing of the Kinect data, since the Kinect doesn’t output values that Max 7 or the Mac can read natively, and the other provides the basis for the visuals.
The first patch is Kinect-Via-Synapse, made in 2011. This patch routes the tracking data from the programme Synapse and displays it for the user. Synapse itself uses OpenNI to translate the data taken from the Kinect into Open Sound Control (OSC) messages, and this is how it communicates with the Max/MSP interface.
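For readers who want to inspect this OSC stream outside of Max, the sketch below shows a minimal listener in Python using the python-osc library. The address names and port are assumptions based on Synapse’s documented defaults (joint positions such as /righthand_pos_screen sent to localhost on port 12345), so check your own Synapse setup before relying on them.

    # Minimal sketch of a Synapse OSC listener (Python, python-osc library).
    # Assumed defaults: Synapse sends joint positions to localhost:12345.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_joint(address, x, y, z):
        # Each joint message carries three floats: X, Y and depth (Z).
        print(f"{address}: x={x:.1f} y={y:.1f} z={z:.1f}")

    dispatcher = Dispatcher()
    dispatcher.map("/righthand_pos_screen", on_joint)
    dispatcher.map("/lefthand_pos_screen", on_joint)

    server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
    server.serve_forever()  # stop with Ctrl+C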
The second patch is Force-System, published by Ben B in 2012 for Cycling ’74’s Physics Patch-a-Day tutorial page. This patch uses four jit.phys.ghost objects which act as force points, allowing the generated particles to gravitate towards and around each one. The particles are generated by jit.phys.multiple and jit.gl.multiple objects, with the number of particles and their torque, colour, scale and mass determined by a jit.noise object. This patch also contains a moveable force point which can interact with the particles.
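As a rough illustration of what those force points do (not the actual Jitter implementation), the following Python sketch pulls a cloud of particles towards fixed attractors on every frame; all of the constants are invented for the example.

    import numpy as np

    rng = np.random.default_rng(0)
    pos = rng.uniform(-1.0, 1.0, (500, 3))       # 500 particles in 3-D space
    vel = np.zeros_like(pos)
    force_points = np.array([[0.5, 0.5, 0.0],
                             [-0.5, -0.5, 0.0]]) # two fixed attractors

    dt, strength, damping = 0.016, 0.8, 0.98
    for frame in range(300):                     # roughly 5 seconds at 60 fps
        for fp in force_points:
            delta = fp - pos                     # vector towards the point
            dist = np.linalg.norm(delta, axis=1, keepdims=True) + 1e-6
            vel += strength * delta / dist**2 * dt   # inverse-square pull
        vel *= damping                           # damping keeps orbits stable
        pos += vel * dt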
In the A.I.I, this patch was modified by duplicating the moveable force point and using the scaled values from Kinect-Via-Synapse to move the points. Also included is a colour-changing parameter for the moveable force points, which uses the RGB data from an external webcam placed above the Kinect.
The third patch ties the first two together, using the data from Kinect-Via-Synapse to drive the Force-System patch in a number of ways. It takes the tracking data from Kinect-Via-Synapse and scales it to the resolution of the screen being used, giving the moveable force points a suitable range to be rendered across the display. The scaled X and Y values are substituted into the message box that describes each moveable force point via $ arguments, so the coordinates update continuously as the performer moves. The same technique is used to change the colour of the moveable force points: a suckah object placed over a jit.pwindow reads the colour data of a specific area of the webcam image, and that value is passed in via another message box.
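In textual terms, that scaling step is a simple linear map, equivalent to Max’s [scale] object. The sketch below shows the idea in Python; the input range is hypothetical, since it depends on which Synapse coordinate mode (screen, body or world) is in use.

    def scale(value, in_min, in_max, out_min, out_max):
        """Linearly map a value from one range to another (like Max's [scale])."""
        return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

    # Hypothetical ranges: Synapse screen coordinates (640x480) mapped
    # onto a 1920x1080 display for the moveable force points.
    x_pixels = scale(320.0, 0, 640, 0, 1920)   # -> 960.0
    y_pixels = scale(240.0, 0, 480, 0, 1080)   # -> 540.0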
The audio is generated by three channels, each containing two audio generators. One half is a white-noise generator processed with a band-pass filter with a fixed Q of 23, whose centre frequency is controlled by the Kinect’s output. The other half is a bank of four sine, saw or triangle wave generators whose frequencies are set by a ‘random’ object; the range of the randomised values is also controlled by the Kinect. This gives an overall impression of frequency change, but with a desired ambience. To add to that ambience, each of the three channels is fed through its own reverb and a delay before the signals are summed and sent to a digital-to-analogue converter. Each channel also has an overall volume controlled by the RGB data fed to it, so a colour that is pure red produces a signal only from the ‘red’ audio generator, giving a timbre that contains only a sine wave and white noise.
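The sketch below reproduces that signal chain in miniature in Python with NumPy and SciPy: band-passed noise plus one oscillator per channel, with the RGB reading setting each channel’s gain. The centre frequencies and the single sine partial are placeholders for the patch’s Kinect-driven values and four-oscillator banks, and the reverb and delay stages are omitted.

    import numpy as np
    from scipy.signal import iirpeak, lfilter

    SR = 44100

    def channel(centre_hz, seconds=1.0):
        """One simplified channel: band-passed white noise plus a sine."""
        t = np.arange(int(SR * seconds)) / SR
        noise = np.random.uniform(-1.0, 1.0, t.size)
        b, a = iirpeak(centre_hz, Q=23, fs=SR)   # narrow band-pass, Q = 23
        return 0.5 * lfilter(b, a, noise) + 0.5 * np.sin(2 * np.pi * centre_hz * t)

    # The webcam's RGB reading scales each channel's gain (0-1), so a
    # pure-red frame leaves only the "red" channel audible.
    red, green, blue = 255, 0, 0
    gains = [red / 255.0, green / 255.0, blue / 255.0]
    freqs = [220.0, 330.0, 440.0]                # placeholder centre frequencies
    mix = sum(g * channel(f) for g, f in zip(gains, freqs))
    mix /= max(1.0, float(np.abs(mix).max()))    # normalise before the DAC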
There are many potential applications of the A.I.I, such as live performance, multimedia or dance, or even as part of an installation. It is my intention that people could take this concept and develop it into their own works.
Links:
Synapse - http://synapsekinect.tumblr.com/
Kinect-Via-Synapse - https://cycling74.com/toolbox/kinect-via-synapse-max-interface/
Download Max Patch .zip by clicking here --->