The code required for (see.saw) runs something like this:
visual part
> track finger positions as CCV (Community Core Vision, formerly known as tbeta) events
> output TUIO data
> input to Processing
> use the cursor position to spawn collections of particles and ellipses
> event duration determines color phase
> output to screen (this has to happen in ‘real time’; a rough Processing sketch of this loop follows below)
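Here is a minimal sketch of that visual loop, assuming the standard TUIO client library for Processing (the TUIO.* package, listening on CCV's default port 3333). The particle counts, sizes, and the hue-per-millisecond rate are illustrative guesses, not the actual (see.saw) code:

import TUIO.*;

TuioProcessing tuioClient;
HashMap<Long, Integer> startTimes = new HashMap<Long, Integer>();

void setup() {
  size(1024, 768);
  colorMode(HSB, 360, 100, 100);
  noStroke();
  // connect to CCV's TUIO stream on the default UDP port 3333
  tuioClient = new TuioProcessing(this);
}

void draw() {
  background(0);
  // draw a small cluster of ellipses for every active cursor
  for (TuioCursor tcur : tuioClient.getTuioCursorList()) {
    float x = tcur.getScreenX(width);
    float y = tcur.getScreenY(height);
    Integer t0 = startTimes.get(tcur.getSessionID());
    int age = (t0 == null) ? 0 : millis() - t0;
    // event duration drives the color phase: hue cycles as the touch persists
    float hue = (age * 0.05f) % 360;
    fill(hue, 80, 100);
    for (int i = 0; i < 12; i++) {
      ellipse(x + random(-20, 20), y + random(-20, 20), 8, 8);
    }
  }
}

// callbacks invoked by the TUIO library as cursors come and go
void addTuioCursor(TuioCursor tcur) {
  startTimes.put(tcur.getSessionID(), millis());
}
void removeTuioCursor(TuioCursor tcur) {
  startTimes.remove(tcur.getSessionID());
}
void updateTuioCursor(TuioCursor tcur) { }
void refresh(TuioTime frameTime) { }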
audio part
> record a sample of the live sound in real time
> hold the sample in a 15-second buffer in Max/MSP
> play the sample back through a speaker at the opposite entrance (a sketch of this delay line follows below)
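The piece does this in Max/MSP (a 15-second delay line, e.g. via tapin~/tapout~ or a buffer~), but the signal flow is simple enough to sketch in plain Java with javax.sound.sampled. Everything here, from the class name down to the sample rate and the use of the system's default input and output devices, is an assumption for illustration:

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.TargetDataLine;

public class FifteenSecondDelay {
    public static void main(String[] args) throws Exception {
        // 44.1 kHz, 16-bit, mono, signed, little-endian
        AudioFormat fmt = new AudioFormat(44100f, 16, 1, true, false);

        TargetDataLine mic = AudioSystem.getTargetDataLine(fmt);  // capture at one entrance
        SourceDataLine spk = AudioSystem.getSourceDataLine(fmt);  // playback at the other
        mic.open(fmt);
        spk.open(fmt);
        mic.start();
        spk.start();

        // ring buffer holding exactly 15 seconds of audio; it starts out
        // silent, so the first 15 seconds of output are silence
        byte[] ring = new byte[15 * 44100 * fmt.getFrameSize()];
        byte[] chunk = new byte[4096];
        int pos = 0;

        while (true) {
            int n = mic.read(chunk, 0, chunk.length);
            for (int i = 0; i < n; i++) {
                byte delayed = ring[pos]; // the byte captured 15 seconds ago
                ring[pos] = chunk[i];     // overwrite it with the fresh byte
                chunk[i] = delayed;       // reuse chunk as the output buffer
                pos = (pos + 1) % ring.length;
            }
            spk.write(chunk, 0, n);       // out through the opposite speaker
        }
    }
}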
——————————————————–