May 10th 2005 - The White Gallery
May 17th 2005 - The Black Gallery
Synaesthesia is the crossing of the senses; seeing what you hear, hearing what you smell, and tasting what you touch. It was with this in mind that we created a system of sensory exchange and translation for this project.
The visual output is driven by a small video camera placed within the gallery space. We started with the idea of using infrared LEDs, but as the project progressed it became clear they were unnecessary. Because the video is limited to a two-color, black-and-white image for bandwidth reasons, there is plenty of contrast, and figures and other motion read clearly on the display.
The output is generated by iterating through each video frame as it arrives from the camera and assigning each pixel a "1" or a "0" based on a threshold setting, which allows the display to be tuned to each space it is installed in. Ultimately the output is a very long, linebreak-delimited string of ones and zeros, with each digit representing a pixel. This output format was chosen because it lets each group perform its own manipulations on the dataset without interference.
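The thresholding step described above can be sketched as follows. This is a minimal illustration, not the installation's actual code; the function name, the 0-255 grayscale range, and the default threshold of 128 are all assumptions.

```python
# Hypothetical sketch of the per-pixel thresholding: each grayscale
# pixel becomes "1" or "0" against a tunable threshold, and rows are
# joined with linebreaks to form the delimited output string.

def frame_to_bits(frame, threshold=128):
    """Convert a 2-D list of grayscale values (0-255) into a
    linebreak-delimited string of ones and zeros."""
    rows = []
    for row in frame:
        rows.append("".join("1" if px >= threshold else "0" for px in row))
    return "\n".join(rows)

# A tiny 2x3 "frame": bright pixels map to 1, dark pixels to 0.
frame = [[200, 10, 130],
         [40, 255, 90]]
print(frame_to_bits(frame))  # → 101
                             #   010
```

Raising or lowering the threshold is the "tweak" mentioned above: a darker gallery would call for a lower value so that figures still register as ones.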
In delivering the synaesthetic experience, the visual data (0s and 1s) is used for its boolean nature, switching the interactivity on and off, while the tactile data directly affects the dynamic levels of the beeps. The series of beeps is determined strictly by the interactions in the input data. The audio output translates the tactile and visual information into a series of beeps, letting the audience hear both the hugs given to the plush neuron and the activity in the surrounding space.
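A rough sketch of that mapping, under assumed names and scales: the visual bits act as a boolean gate (beep on or off), and the tactile reading (here a 0.0-1.0 level) sets each beep's amplitude.

```python
# Illustrative sketch of the audio mapping: visual bits gate the beeps,
# the tactile level scales their dynamics. Names are assumptions, not
# the project's actual code.

def beep_sequence(visual_bits, tactile_level):
    """Return a list of beep amplitudes: 0.0 where the visual bit is
    off, tactile_level (0.0-1.0) where it is on."""
    return [tactile_level if b == "1" else 0.0 for b in visual_bits]

print(beep_sequence("10110", 0.8))  # → [0.8, 0.0, 0.8, 0.8, 0.0]
```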
In pursuit of an unconventional, contextually acute interface, the tactile component consists of a nine-foot huggable plush neuron. Sheathed below the soft, lustrous surface of the 'axon' are five pressure sensors that detect user contact. The sensors are distributed so that only a full-body embrace activates all five simultaneously. The degree of user contact is detected and transmitted to the audio group.
Input data received from the audio group is expressed through two modified DC motors. High amplitudes trigger the vibrating motor in the neuron's 'nucleus', while high frequencies drive the rolling motor at the base of the 'axon'. The visual data received is parsed into three vertical divisions of the visual field, each informing the lighting of a corresponding LED in the 'dendrites'.
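The actuation logic above can be sketched as a single mapping function. The threshold values, the 0.0-1.0 amplitude scale, and the rule that a dendrite LED lights when its third of the frame is at least half active are all assumptions for illustration.

```python
# Sketch of the actuation mapping: loud audio vibrates the 'nucleus'
# motor, high frequencies drive the rolling 'axon' motor, and each
# vertical third of a row of visual bits lights one dendrite LED.
# All thresholds are illustrative assumptions.

def actuate(amplitude, frequency, bit_row,
            amp_threshold=0.7, freq_threshold=1000, pixel_threshold=0.5):
    vibrate = amplitude >= amp_threshold   # nucleus vibration motor
    roll = frequency >= freq_threshold     # rolling motor at axon base
    third = len(bit_row) // 3
    leds = []
    for i in range(3):
        segment = bit_row[i * third:(i + 1) * third]
        leds.append(segment.count("1") / len(segment) >= pixel_threshold)
    return vibrate, roll, leds

# Loud but low-pitched input, with activity in the left and right
# thirds of the visual field:
print(actuate(0.9, 440, "111000110"))  # → (True, False, [True, False, True])
```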