[source]

  + visual_main.py
  + visual_udp.py

[requirements]

  + Python 2.3
  + PIL
  + VideoCapture
  + twisted
  + PyOpenGL

Visual, Explained

    As we started to test the program, we used Processing to capture and parse video data. This became an issue, since we were using Python to drive the visualization. Eventually we settled on the VideoCapture library for Python, which ended up working very well and allowed for much more flexibility, speed, and stability.
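
    For reference, grabbing a frame with VideoCapture takes only a few lines, since its Device object hands back a standard PIL image from getImage(). The sketch below is illustrative rather than code from visual_main.py; the capture size and grayscale conversion are our own assumptions.

        from VideoCapture import Device

        cam = Device(devnum=0)          # open the default capture device

        def grab_frame(size=(160, 120)):
            """Grab one frame, shrink it, and return it as 8-bit grayscale."""
            img = cam.getImage()        # VideoCapture returns a PIL Image
            img = img.resize(size)      # downsample to keep the dot count low
            return img.convert('L')     # grayscale makes thresholding simple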

[QuickTime MOV: 10.02 MB]

    The final version of the visualization consists of two distinct portions, each interacting with the other objects in the space in its own way. Switching between the two visualizations is accomplished by pressing any three buttons on the Tactile Group's Neuron at any time. The first visualization is a "time-lapse" view: each non-dark pixel is drawn as a white dot at its location in the frame, and successive frames move toward the viewer, with about ten in view at any one time. This visualization is colorized by sound in the space, reacting to pitch, and triggering individual buttons on the Neuron makes individual frames grow temporarily.
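
    A rough sketch of how the time-lapse frame buffer might work, assuming PyOpenGL with a projection already set up. The threshold, buffer structure, and names here are our own, and the pitch-driven coloring and button-driven frame growth (button events presumably arrive over the network via visual_udp.py and twisted) are left out.

        from OpenGL.GL import *

        DARK_THRESHOLD = 40   # brightness below this counts as "dark" (assumed)
        MAX_FRAMES = 10       # roughly ten frames in view at once

        frames = []           # oldest frame first; each entry is a list of (x, y)

        def push_frame(gray):
            """Reduce a grayscale PIL frame to dot coordinates and enqueue it."""
            w, h = gray.size
            dots = []
            for i, v in enumerate(gray.getdata()):
                if v > DARK_THRESHOLD:
                    dots.append((i % w, i // w))    # pixel index -> (x, y)
            frames.append(dots)
            if len(frames) > MAX_FRAMES:
                del frames[0]   # dropping the oldest shifts the rest forward

        def draw_frames():
            """Draw each buffered frame as white dots, newest frame deepest."""
            glColor3f(1.0, 1.0, 1.0)                # pitch-based color omitted
            for i in range(len(frames)):
                glBegin(GL_POINTS)
                for x, y in frames[i]:
                    glVertex3f(x, y, float(-i))     # z rises as a frame ages
                glEnd()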

    The second visualization is a literal representation of the space, with dots mapped to video pixels; when there is sound in the space, however, each dot grows into a line whose depth is controlled by the amplitude of the audio. Triggering buttons on the Neuron causes the display to zoom quickly into the center, and releasing them allows it to gradually return to normal.
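
    A similarly hedged sketch of the literal view. Here amplitude stands in for whatever normalized loudness value the audio analysis produces, and the sampling step, depth scale, and brightness-based shading are invented for illustration; the button-triggered zoom would amount to scaling the modelview matrix before drawing and is omitted.

        from OpenGL.GL import *

        STEP = 4    # sample every 4th pixel to keep the vertex count manageable

        def draw_literal(gray, amplitude):
            """Dots mapped to video pixels; sound stretches dots into lines."""
            w, h = gray.size
            pixels = gray.load()
            depth = amplitude * 20.0            # line depth follows loudness
            audible = depth > 0.5
            glBegin(audible and GL_LINES or GL_POINTS)
            for y in range(0, h, STEP):
                for x in range(0, w, STEP):
                    v = pixels[x, y] / 255.0    # pixel brightness as gray level
                    glColor3f(v, v, v)
                    glVertex3f(x, y, 0.0)       # the dot itself
                    if audible:
                        glVertex3f(x, y, -depth)    # stretch into a line
            glEnd()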

05 . 25 . 2005 [visual_group] Art103 CADRE