When Yamaha demonstrated an AI that let a dancer play the piano through movement in a Tokyo concert hall in November 2017, it was the latest example of computers becoming ever more involved in music-making. We’re not talking about the synthesizers and other CPU-based instruments of contemporary music. We’re talking about computers’ potential as composers’ tools, or as composers themselves.
Yamaha’s use of AI is quite different. In the recent performance, world-renowned dancer Kaiji Moriyama was outfitted with electrodes on his back, wrists, and ankles, and set free to express himself as AI algorithms converted his movements into musical phrases for transmission to Yamaha’s Disklavier piano via MIDI messages. (MIDI is a standard protocol through which electronic musical instruments can be controlled.)
Yamaha’s AI, which is still in development, drew on a database of linked musical phrases, selecting melodies to send to the instrument based on Moriyama’s motions.
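Yamaha has not published the details of its algorithm, but the basic mapping it describes can be sketched in a few lines. In the hypothetical sketch below, the phrase database, intensity thresholds, and note choices are all invented for illustration; only the MIDI byte layout (status byte 0x90 for note-on, then note number and velocity) follows the actual MIDI standard.

```python
# Hypothetical sketch of a motion-to-MIDI mapping. The phrase database
# and thresholds are invented; Yamaha's actual algorithm is not public.

# A toy "database" of linked musical phrases (MIDI note numbers).
PHRASES = {
    "calm":    [60, 62, 64],        # C4, D4, E4
    "flowing": [64, 67, 69, 72],    # E4, G4, A4, C5
    "intense": [72, 76, 79, 84],    # C5, E5, G5, C6
}

def select_phrase(motion_intensity):
    """Pick a phrase from the database for a normalized intensity in [0, 1]."""
    if motion_intensity < 0.3:
        return PHRASES["calm"]
    if motion_intensity < 0.7:
        return PHRASES["flowing"]
    return PHRASES["intense"]

def note_on(note, velocity=80, channel=0):
    """Raw MIDI note-on message: status byte 0x90 | channel, note, velocity."""
    return bytes([0x90 | channel, note, velocity])

def phrase_to_midi(motion_intensity):
    """Convert one motion reading into a list of MIDI note-on messages."""
    return [note_on(n) for n in select_phrase(motion_intensity)]
```

In a real system these bytes would stream to the Disklavier over a MIDI connection, with matching note-off messages and timing; the sketch only shows the selection-and-encoding step.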
Moriyama was accompanied onstage by the Berlin Philharmonic Orchestra Scharoun Ensemble.
We’ve written previously about other musical AI, such as Amper, a web-based platform that composes passages from descriptors of style, instrumentation, and mood. Singer/songwriter Taryn Southern used Amper as her primary collaborator in writing an album.
Another avenue being explored is the use of brain-computer interfaces (BCIs) that allow wearers to think music into existence. It’s a fascinating way for anyone to play music, but it’s especially promising for people whose physical limitations make the creation of music difficult or even impossible otherwise.
Certain electroencephalogram signals correspond to known brain activities, such as the P300 ERP (for “event-related potential”), which signifies a person’s reaction to a stimulus. It has previously been used by BCI applications for spelling, operating local environmental controls, operating web browsers, and painting. In September 2017, researchers led by BCI expert Gernot Müller-Putz from TU Graz’s Institute of Neural Engineering published research in PLOS ONE describing their “Brain Composer” project, which leveraged P300 to bring musical ideas directly from composers’ minds to notated sheets of music. The team works in collaboration with the MoreGrasp and “Feel Your Reach” projects.
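A P300 system of this kind typically works like a speller: candidate options (letters, or here, notes) flash in sequence, and the option whose flashes evoke the strongest averaged P300 response is the one the user intended. Below is a minimal sketch of that selection step on invented synthetic data; the real Brain Composer pipeline involves far more preprocessing and classification than this.

```python
# Toy sketch of P300-based selection in a speller-style paradigm.
# All epochs and amplitudes below are synthetic, for illustration only.

def average_epochs(epochs):
    """Average EEG epochs (lists of samples) time-locked to each flash."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def p300_score(avg_epoch, window=(6, 10)):
    """Mean amplitude in the post-stimulus window where P300 peaks
    (around 300 ms; sample indices here stand in for real times)."""
    lo, hi = window
    return sum(avg_epoch[lo:hi]) / (hi - lo)

def select_option(epochs_by_option):
    """Choose the option (e.g. a note) with the strongest averaged P300."""
    return max(epochs_by_option,
               key=lambda opt: p300_score(average_epochs(epochs_by_option[opt])))

# Synthetic data: only the attended note "C" carries a P300-like bump.
flat = [0.0] * 12
bump = [0.0] * 6 + [1.0, 2.0, 2.0, 1.0] + [0.0] * 2
epochs = {
    "C": [bump, bump, flat],   # attended note: bump present in most epochs
    "D": [flat, flat, flat],
    "E": [flat, flat, flat],
}
choice = select_option(epochs)   # picks "C"
```

Averaging across repeated flashes is what makes this workable in practice: the P300 is small relative to background EEG, so each option is flashed many times and the noise averages out while the event-locked response remains.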