The article is about AI research in the context of a complex artistic behavior: expressive music performance. A computer program is presented that learns to play piano with 'expression' and that even won an international computer piano performance contest. A superficial analysis of an expressive performance generated by the system seems to suggest creative musical abilities. After a critical discussion of the processes underlying this behavior, we abandon the question of whether the system is really creative, and turn to the true motivation that drives this research: to use AI methods to investigate and better understand music performance as a human creative behavior. A number of recent and current results from our research are briefly presented that indicate that machines can give us interesting insights into such a complex creative behavior, even if they may not be creative themselves.
One of the main difficulties in studying expression in musical performance is the acquisition of data. While audio recordings abound, automatically extracting precise information related to timing, dynamics, and articulation is still not possible at the level of precision required for large-scale music performance studies. In 1989, the Russian pianist Nikita Magaloff performed essentially the entire works for solo piano by Frédéric Chopin on a Bösendorfer SE, a computer-controlled grand piano that precisely measures every key and pedal action by the performer. In this paper, we describe the process and the tools for the preparation of this collection, which comprises hundreds of thousands of notes. We then move on to presenting the results of initial exploratory studies of the expressive content of the data, specifically effects of performer age, performance errors, between-hand asynchronies, and tempo rubato. We also report preliminary results of a systematic study of the shaping of particular rhythmic passages, using the notion of phase-plane trajectories. Finally, we briefly describe how the Magaloff data were used to train a performance rendering system that won the 2008 Rencon International Performance Rendering Contest.
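The phase-plane idea mentioned above can be illustrated with a short sketch. This is not the authors' implementation; it is a minimal, hedged example assuming that local tempo is estimated from consecutive inter-onset intervals and that a phase-plane trajectory pairs tempo with its time derivative. The onset data below are invented for illustration.

```python
import numpy as np

def phase_plane(score_beats, onset_times):
    """Return (tempo, tempo-derivative) pairs for a performed passage.

    score_beats : nominal score positions in beats
    onset_times : measured performance onset times in seconds
    """
    score_beats = np.asarray(score_beats, dtype=float)
    onset_times = np.asarray(onset_times, dtype=float)
    # Local tempo (beats per minute) from consecutive inter-onset intervals.
    tempo = 60.0 * np.diff(score_beats) / np.diff(onset_times)
    # Place each tempo estimate at the interval midpoint, then take the
    # numerical derivative of tempo with respect to performance time.
    mid_times = 0.5 * (onset_times[:-1] + onset_times[1:])
    dtempo = np.gradient(tempo, mid_times)
    return tempo, dtempo

# Hypothetical example: a gradual ritardando over four beats.
tempo, dtempo = phase_plane([0, 1, 2, 3, 4],
                            [0.0, 0.50, 1.02, 1.58, 2.18])
```

Plotting `tempo` against `dtempo` gives the phase-plane trajectory; a ritardando, as in the example, traces a path through the region where the derivative is negative.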
This article presents research towards automated computational analysis of large corpora of music performance data. In particular, we focus on between-hand asynchronies in piano performances, an expressive device in which the performer's timing deviates from the nominally synchronized timing of the score. Between-hand asynchronies play an important role, particularly in Romantic music, but they have not been assessed quantitatively in any substantial way. We give a first report on a computational approach to analyzing a unique corpus of historic performance data: essentially the complete works of Chopin, performed by the Russian-Georgian pianist Nikita Magaloff. Corpora of that size (hundreds of thousands of played notes with substantial expressive and other deviations from the written score) require a level of automation of analysis that has not been attained so far. We describe the required processing steps, from converting scanned scores into symbolic notation, to score-performance matching, to the definition and automatic measurement of between-hand asynchronies, and a computational visualization tool for exploring and understanding the extracted information.

Temporal asynchronies between the members of musical ensembles have been found to exhibit specific regularities: the principal instruments in classical wind and string trios tend to be 30-50 msec ahead of the others (Rasch 1979); soloists in jazz
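The core measurement described above can be sketched in a few lines. This is a hedged illustration, not the corpus tooling itself: it assumes that score-performance matching has already paired right-hand (melody) and left-hand note onsets that are nominally simultaneous in the score, and the onset values are invented example data in seconds.

```python
def hand_asynchronies(right_onsets, left_onsets):
    """Between-hand asynchrony in milliseconds per matched score event.

    Positive values mean the right (melody) hand sounds before the left.
    """
    return [1000.0 * (left - right)
            for right, left in zip(right_onsets, left_onsets)]

# Three matched score events; the melody leads in the first two.
asyncs = hand_asynchronies([1.000, 2.000, 3.000],
                           [1.020, 2.040, 2.995])
mean_async = sum(asyncs) / len(asyncs)
```

Aggregating such per-event values over a whole corpus (e.g., per piece, per tempo range, or per metrical position) is what makes the large-scale, automated analysis described in the abstract feasible.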