HyperSense Complex was an ensemble of performers (Dr Alistair Riddell, Somaya Langley and Simon Burton) using custom-built sensor and micro-controller technology in a networked performance environment. The technology was developed to be wearable, providing a more intuitive human-computer interface. This in turn offered an alternative approach to the experience of live electronic music performance, for both performers and audience. In a performance context, the audience experienced the movement of the performers' hands and bodies as controlling the sonic environment.
Each performer wore eight flex sensors strapped to the fingers. Movement of the fingers flexed the sensors, generating signals that were sent via cabling running along the arms to the micro-controllers. The micro-controllers converted the signals to data using analogue-to-digital conversion (ADC). This data was passed over extended USB connections to a Macintosh G4 PowerBook running Python scripts. The Python code interpreted each performer's data stream and interpolated the data into a composition framework, and the output was sent via OSC (Open Sound Control) to a second laptop running SuperCollider3. The second laptop handled control of audio samples, live audio signal input and effects.
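The Python-to-OSC stage described above can be sketched roughly as follows. This is a minimal illustration, not the ensemble's actual code: the 10-bit sensor range, the OSC address scheme (`/performer1/finger/N`) and the linear scaling are assumptions, and the OSC packet is encoded by hand with the standard library rather than with an OSC library.

```python
import struct

def scale_flex(raw, lo=0, hi=1023):
    """Map a raw ADC reading (assumed 10-bit, 0-1023) to a 0.0-1.0 control value."""
    return max(0.0, min(1.0, (raw - lo) / (hi - lo)))

def osc_message(address, value):
    """Encode a single-float OSC message: padded address, ",f" type tag, big-endian float."""
    def pad(b):
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# Eight flex-sensor readings for one performer -> one OSC packet per finger,
# ready to be sent over UDP to the SuperCollider laptop.
readings = [512, 1023, 0, 256, 768, 100, 900, 400]
packets = [osc_message(f"/performer1/finger/{i}", scale_flex(r))
           for i, r in enumerate(readings)]
```

In a live setting each packet would be sent with a UDP socket (`socket.sendto`) to the port on which SuperCollider listens; SuperCollider's language-side `OSCdef` (or the server's own OSC interface) would then map each address to sample playback or effect parameters.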
- Writers:
- Date written: 2015
- Last updated: 2015