Background
As part of my engineering master's thesis, I created a handbalancing performance apparatus prototype called commensalisTECH symBIOsis (HAND*CS), which leverages a handbalancer's weight distribution data and muscle activation data to modulate sound and visual output. As a preliminary prototype, HAND*CS leaves much room for improvement.
For BLUE, I propose an expansion and augmentation of HAND*CS. Part of the inspiration for HAND*CS comes from the collaboration between Laetitia Sonami and Rebecca Fiebrink, creators of Spring Spyre, a machine-learning-augmented musical instrument that collaborates with its performer in real time. I would like to begin incorporating machine learning into HAND*CS to augment its expressive potential and deepen its symbiotic relationship with its performer. HAND*CS also currently produces its visual and sonic output independently, and there remains significant potential for co-modulation between the two. With increased robustness and expressive potential, HAND*CS presents an exciting new avenue for handstand performance, particularly for improvisation within a cohesive performance space.