Developing methods and tools to sense musical and gaming gestures
We have seen that there is a massive corpus of research on gestures within the field of human-computer interaction (HCI), but this body of work needs to be carefully examined and adapted before it can be applied to the musical and gaming domains, given the idiosyncrasies of each field. So I'd like to organise a workshop on how advanced sensing, instrumentation and signal processing techniques could inform research on musical and gaming gestures. Furthermore, some of the considerations here could be relevant to designing any context-aware system that provides real-time user feedback and relies on continuous sensing and classification.
A draft plan for the workshop day:
1. Theory
1.1 Music-related body movement (definition of musical gesture)
1.2 Body interaction with games
2. Exploration
2.1 Data capture: sensors and instrumentation methods in devices for:
- Music expression
- Gaming
2.2 Data storage
2.3 Data representation
3. Development
Hands-on introduction to Bela, an open-source embedded platform for ultra-low-latency audio and sensor processing, based on the BeagleBone Black.
For part 1, we could invite speakers to give talks. Each section in part 2 could begin with a short presentation followed by a group discussion. As for part 3, we will present the hardware and software features of Bela through a tutorial that gets participants started developing interactive music projects; a sketch of the kind of code involved follows below.
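To give a flavour of the part 3 tutorial, here is a minimal sketch of a Bela program that maps an analog sensor onto the pitch of a sine tone, using the standard Bela C++ callbacks (setup/render/cleanup). The choice of analog input 0, the 220-880 Hz range and the sensor itself (e.g. a potentiometer or FSR) are illustrative assumptions, not part of the actual tutorial content.

#include <Bela.h>
#include <cmath>

float gPhase = 0.0f; // current phase of the sine oscillator

bool setup(BelaContext *context, void *userData)
{
	// The analog inputs must be enabled in the project settings
	if(context->analogFrames == 0) {
		rt_printf("This sketch needs the analog inputs enabled.\n");
		return false;
	}
	return true;
}

void render(BelaContext *context, void *userData)
{
	for(unsigned int n = 0; n < context->audioFrames; n++) {
		// The analog inputs typically run at half the audio rate, so
		// map the audio frame index onto the matching analog frame.
		unsigned int m = n * context->analogFrames / context->audioFrames;
		float sensor = analogRead(context, m, 0); // 0.0-1.0, channel 0

		// Map the sensor reading to a pitch range (arbitrary choice)
		float frequency = 220.0f + sensor * 660.0f; // 220-880 Hz

		gPhase += 2.0f * (float)M_PI * frequency / context->audioSampleRate;
		if(gPhase > 2.0f * (float)M_PI)
			gPhase -= 2.0f * (float)M_PI;

		float out = 0.2f * sinf(gPhase);
		for(unsigned int ch = 0; ch < context->audioOutChannels; ch++)
			audioWrite(context, n, ch, out);
	}
}

void cleanup(BelaContext *context, void *userData)
{
}

Because render() runs in a hard real-time audio thread, everything here is allocation-free and bounded per sample, which is what gives Bela its ultra-low sensor-to-sound latency.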
Since my own research is about how to quantify the interaction between performer and acoustic instrument (see the toy sketch below), I'd like to hear more ideas about game intelligence. Any thoughts are welcome!
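As a postscript, to make "quantifying the interaction" slightly more concrete, here is a deliberately simple sketch of one possible measure: a sliding-window RMS of an accelerometer signal read through Bela's analog inputs, as a crude proxy for movement intensity. The wiring (analog input 1), the window size and the choice of RMS as the feature are all assumptions for illustration; a real study would use richer features and more than one sensor.

#include <Bela.h>
#include <cmath>

// Sliding window of recent accelerometer samples (roughly 50 ms at
// Bela's default analog rate of 22.05 kHz; the size is a guess)
const unsigned int kWindowSize = 1024;
float gWindow[kWindowSize] = {0};
unsigned int gWriteIndex = 0;
float gSumOfSquares = 0.0f;
unsigned int gPrintCounter = 0;

bool setup(BelaContext *context, void *userData)
{
	return context->analogFrames > 0;
}

void render(BelaContext *context, void *userData)
{
	for(unsigned int m = 0; m < context->analogFrames; m++) {
		// One accelerometer axis on analog input 1 (arbitrary wiring),
		// re-centred from Bela's 0..1 range to roughly -1..1
		float accel = 2.0f * (analogRead(context, m, 1) - 0.5f);

		// Update the sliding sum of squares in O(1) per sample
		gSumOfSquares -= gWindow[gWriteIndex] * gWindow[gWriteIndex];
		gWindow[gWriteIndex] = accel;
		gSumOfSquares += accel * accel;
		gWriteIndex = (gWriteIndex + 1) % kWindowSize;
	}
	if(gSumOfSquares < 0.0f) // guard against float round-off drift
		gSumOfSquares = 0.0f;

	// Print the RMS occasionally as a crude "movement intensity" feature
	if(++gPrintCounter >= 1000) {
		gPrintCounter = 0;
		rt_printf("movement RMS: %f\n", sqrtf(gSumOfSquares / kWindowSize));
	}
}

void cleanup(BelaContext *context, void *userData)
{
}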