Demonstrating Interactive Machine Learning Tools for Rapid Prototyping of Gestural Instruments in the Browser


Adam Parkinson, Goldsmiths, University of London, a.d.parkinson@gmail.com
Michael Zbyszynski, Goldsmiths, University of London, m.zbyszynski@gold.ac.uk
Francisco Bernardo, Goldsmiths, University of London, f.bernardo@gold.ac.uk

ABSTRACT

These demonstrations will allow visitors to prototype gestural, interactive musical instruments in the browser. Different browser-based synthesisers can be controlled by either a Leap Motion sensor or a Myo armband. The visitor will be able to use an interactive machine learning toolkit to quickly and iteratively explore different interaction possibilities.

The demonstrations show how interactive, browser-based machine learning tools can be used to rapidly prototype gestural controllers for audio. These demonstrations showcase RapidLib, a browser-based machine learning library developed through the RAPID-MIX project.

1.   INTRODUCTION
Music is an ideal use case for machine learning, for two primary
reasons.
1.   Music software often has a complex interface and
multiple parameters that a performer might want to
control in order to expressively modulate a sound.
2.   Musicians often have precise, embodied knowledge
about gestural interaction and control of sound,
developed through years of instrumental
practice.
Many of the ways in which we interface with audio software on computers fail to facilitate the effective control of multiple parameters, and do not exploit the rich gestural language of musicians. For instance, keyboard and mouse offer limited interaction possibilities. Trying to adjust arrays of knobs and sliders on a software synthesiser GUI using just mouse clicks can be awkward and cumbersome.

Musicians therefore often use specialist interfaces for musical control of computers. Commercially available MIDI keyboards and control surfaces allow for a wider range of interaction possibilities than keyboard and mouse. More adventurous modes of interaction are explored through conferences such as NIME (New Interfaces for Musical Expression) [3]. From Max Mathews’s Radio-Baton to Michel Waisvisz’s Hands [4], computer musicians have explored novel ways of interacting with computers.

Machine learning could provide a solution for connecting the embodied knowledge of musicians to the multiple parameters of a software instrument. Interactive Machine Learning (IML), in particular, allows for intuitive control of complex systems, and gives end users the ability to refine those systems themselves [2]. Musicians can therefore bring their embodied knowledge to gestural controllers, refining their interactions with a software instrument through multiple iterations. Without typing a line of code, complex and expressive musical interactions can be created.

Figure 1. Leap Motion running in CodeCircle, tracking one hand.

2.   SOFTWARE
The demonstrations run in CodeCircle using the MaxiLib and RapidLib libraries.

CodeCircle is an online editor that enables real-time collaborative coding (see Figure 1). It is geared towards creative contexts and works with HTML, CSS, JavaScript and several third-party media libraries. CodeCircle was developed by Fiala, Yee-King and Grierson [1].
Synthesis is handled by Mick Grierson’s Maximilian, which runs in CodeCircle as a JavaScript library, MaxiLib (http://maximilian.strangeloop.co.uk/). MaxiLib and Maximilian are open source libraries for audio synthesis and signal processing. They contain standard waveforms, sample playback, filters with resonance, delay lines, FFTs, granular synthesis and low-level feature extraction.

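To give a flavour of what MaxiLib code looks like, the following is a minimal sketch in the style of the library’s published examples. The maximJs namespace and the maxiAudio/maxiOsc names are taken from those examples and should be treated as assumptions; exact spellings may differ between builds.

// Minimal browser synthesis sketch in the maximilian.js style.
// maximJs, maxiAudio and maxiOsc are assumed names from the
// library's examples, not verified against a specific build.
var audio = new maximJs.maxiAudio();  // audio output wrapper
var osc = new maximJs.maxiOsc();      // audio-rate sine oscillator
var lfo = new maximJs.maxiOsc();      // low-frequency oscillator for vibrato

audio.init();

// The play callback runs once per sample; assigning this.output
// sends the current sample to the audio hardware.
audio.play = function () {
  var vibrato = lfo.sinewave(5) * 10;              // 5 Hz vibrato, +/- 10 Hz
  this.output = osc.sinewave(440 + vibrato) * 0.5; // 440 Hz tone at half gain
};
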
Interactive machine learning is handled by RapidLib, a machine learning library in CodeCircle. RapidLib was developed through the Real-time Adaptive Prototyping for Industrial Design of Multimodal Interactive and eXpressive technologies (RAPID-MIX) project (http://rapidmixapi.com/).

CodeCircle running with RapidLib and MaxiLib provides an ideal environment for collaboratively developing instruments in the browser. Little installed software is required, and the environment will run on any computer with internet access and a modern browser. The interface remains the same in Linux, OS X and Windows, and completed projects can be exported and run offline. A project can be accessed and forked by multiple users, and documents are reactively updated.
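
As a sketch of how RapidLib is used, the following follows the Wekinator-style pattern of the library’s examples: build a set of input/output example pairs, train a model, then run it on live input. The constructor form (RapidLib.Regression) and the train/run method names are assumptions to check against the RapidLib build loaded in CodeCircle.

// Wekinator-style interactive machine learning sketch with RapidLib.
// The constructor and method names below are assumed from the
// library's examples.
var myRegression = new RapidLib.Regression();

// Each training example pairs sensor input with desired synth output.
var trainingSet = [
  { input: [0.1, 0.2], output: [440, 0.5] },  // gesture A -> 440 Hz, half gain
  { input: [0.8, 0.9], output: [880, 0.1] }   // gesture B -> 880 Hz, quiet
];

myRegression.train(trainingSet);

// run() maps a new input to an interpolated output.
var params = myRegression.run([0.4, 0.5]);    // e.g. [frequency, gain]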

3.   HARDWARE
There are two pieces of hardware used in the demonstration: a Myo armband (https://www.myo.com/) and a Leap Motion controller (https://www.leapmotion.com/). These allow for a range of gestural interactions with a computer.

The Myo armband is developed by Thalmic Labs. It is worn on the forearm and communicates with the computer via Bluetooth. It provides 8 channels of EMG (electromyographic) data, plus an accelerometer, a gyroscope and a magnetometer.
The Leap Motion is a USB device for gesture tracking, which uses a 3D array of cameras. The device can give tracking information for two hands and ten fingers, giving real-time updates at a rate of around 200 frames per second.
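
For reference, reading hand data in the browser with leap.js (the Leap Motion JavaScript SDK) looks roughly like the sketch below. Leap.loop and palmPosition are part of the public SDK; the normalisation ranges are illustrative assumptions.

// Read palm position from the Leap Motion in the browser.
Leap.loop(function (frame) {
  if (frame.hands.length > 0) {
    var palm = frame.hands[0].palmPosition;  // [x, y, z] in millimetres
    // Normalise to roughly 0..1 so the values suit a machine learning input.
    var x = (palm[0] + 200) / 400;  // left-right, assuming approx. -200..200 mm
    var y = palm[1] / 400;          // height above device, assuming approx. 0..400 mm
    console.log([x, y]);            // candidate input vector for RapidLib
  }
});
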
These hardware devices represent the state of the art in affordable consumer products that offer real-time, gestural interaction with computers, and can be used as musical interfaces (Figure 2).

Figure 2. An interactive music performance by Atau Tanaka using two Myo armbands.
4.   INTERACTION
Visitors will be able to choose from several different browser-based synthesisers and samplers running in CodeCircle.

The visitor will be free to explore their chosen synthesiser, finding different sounds according to their taste. Using either the Myo armband or the Leap Motion, the visitor will create a set of gestures associated with desired or interesting sounds. The system will then be trained using these sets of sounds and gestures, and within a few seconds the visitor will be able to experiment with an interactive space. The system can then be refined as it is trained with further combinations of sound and gesture.

A selection of demos is accessible through http://doc.gold.ac.uk/eavi/rapidmixapi.com/index.php/examples/javascript/
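
Putting the pieces together, the record-train-run loop described above might look like the sketch below, reusing the assumed RapidLib and leap.js calls from the earlier sketches. recordExample, trainNow and setSynthParams are hypothetical names for illustration, not the demo’s actual code.

// Hypothetical demo workflow: record gesture/sound pairs, train,
// then map live gestures continuously to synth parameters.
var examples = [];
var regression = new RapidLib.Regression();  // assumed constructor, as above
var trained = false;

// Called while the visitor holds a gesture over a sound they like.
function recordExample(gestureInput, synthParams) {
  examples.push({ input: gestureInput, output: synthParams });
}

// Training typically completes within a few seconds for small example sets.
function trainNow() {
  regression.train(examples);
  trained = true;
}

// Hypothetical binding into the synth from the earlier MaxiLib sketch.
function setSynthParams(freq, gain) {
  console.log('freq:', freq, 'gain:', gain);
}

// Once trained, live gestures drive the synth; retraining with more
// examples refines the interactive space.
Leap.loop(function (frame) {
  if (trained && frame.hands.length > 0) {
    var palm = frame.hands[0].palmPosition;
    var params = regression.run([palm[0], palm[1]]);
    setSynthParams(params[0], params[1]);
  }
});
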
5.   TECHNICAL REQUIREMENTS
The demo can be scaled down or up to save space or to be accessible to more visitors, as required. The following allows for two simultaneous demonstrations:

1× mains power socket
1× table (approx. 1.2m × 0.6m)

We provide, as a minimum:

2× laptops
1× Myo armband
1× Leap Motion
Headphones or speakers

WiFi is desirable, but the demos can run offline.

6.   ACKNOWLEDGMENTS
The work reported here was supported in part by RAPID-MIX, an EU Horizon 2020 Innovation Action, H2020-ICT-2014-1, Project ID 644862.

7.   REFERENCES
[1] Fiala, J., Yee-King, M., and Grierson, M. 2016. Collaborative coding interfaces on the web. In Proceedings of the International Conference on Live Coding.

[2] Fiebrink, R., and Cook, P. R. 2010. The Wekinator: a system for real-time, interactive machine learning in music. In Proceedings of the Eleventh International Society for Music Information Retrieval Conference (ISMIR 2010).

[3] Jensenius, A. R., and Lyons, M. J. (Eds.). 2017. A NIME Reader: Fifteen Years of New Interfaces for Musical Expression. Springer, Berlin.

[4] Tanaka, A. 2011. Sensor-based musical instruments and interactive music. In Dean, R. (Ed.), The Oxford Handbook of Computer Music. Oxford University Press, New York.

