Grab-and-play mapping: Creative machine learning approaches for musical inclusion and exploration

We present the first implementation of a new tool for prototyping digital musical instruments, which allows a user to literally grab a controller and turn it into a new, playable musical instrument almost instantaneously. The tool briefly observes a user interacting with a controller or sensors (without making any sound), and then automatically generates a mapping from this observed input space to the control of an arbitrary sound synthesis program. The sound is then immediately manipulable using the controller, and this newly-created instrument invites the user to begin an embodied exploration of the relationships between human movement and sound. We hypothesize that this approach offers a useful alternative both to the creation of mappings by programming and to existing supervised learning approaches that create mappings from labeled training data. We have explored the potential value and trade-offs of this approach in two preliminary studies. In a workshop with disadvantaged young people who are unlikely to learn instrumental music, we observed advantages to the rapid adaptation afforded by this tool. In three interviews with computer musicians, we learned how this "grab-and-play" interaction paradigm might fit into professional compositional practices.
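The mapping-generation step described above could be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the tool records a short window of controller frames, finds the directions the user actually moved (here via PCA computed with an SVD), and rescales projections onto those directions into synthesizer parameter ranges. All function and variable names are hypothetical.

```python
import numpy as np

def make_mapping(observed, n_params, param_ranges):
    """Turn briefly observed controller data into a controller-to-synth mapping.

    observed:     (n_frames, n_sensors) array recorded while the user explored
                  the controller silently.
    n_params:     number of synthesis parameters to drive.
    param_ranges: (mins, maxs) arrays of length n_params.
    """
    center = observed.mean(axis=0)
    X = observed - center
    # PCA via SVD: the top right-singular vectors are the directions along
    # which the user's observed movement varied the most.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    axes = Vt[:n_params]                 # one principal axis per synth parameter
    proj = X @ axes.T                    # observations projected onto those axes
    lo, hi = proj.min(axis=0), proj.max(axis=0)
    spans = np.where(hi > lo, hi - lo, 1.0)
    mins, maxs = param_ranges

    def mapping(frame):
        # Project a live controller frame, normalise into [0, 1] using the
        # observed range, then rescale to each parameter's range.
        p = (np.asarray(frame, dtype=float) - center) @ axes.T
        unit = np.clip((p - lo) / spans, 0.0, 1.0)
        return mins + unit * (maxs - mins)

    return mapping

# Example: 200 frames from a hypothetical 6-axis controller driving two
# synth parameters (frequency in Hz and amplitude in [0, 1]).
rng = np.random.default_rng(0)
obs = rng.normal(size=(200, 6)) * np.array([5, 5, 1, 1, 0.1, 0.1])
to_synth = make_mapping(obs, n_params=2,
                        param_ranges=(np.array([100.0, 0.0]),
                                      np.array([1000.0, 1.0])))
print(to_synth(obs[0]))  # two values, each clipped into its parameter range
```

Because the observation phase is silent and unlabeled, this kind of unsupervised rescaling is one plausible reading of how a mapping could be produced "almost instantaneously" without the labeled training examples that supervised approaches require.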
Department of Computing
Goldsmiths, University of London