2020
DOI: 10.1371/journal.pone.0237826
Gaussian process inference modelling of dynamic robot control for expressive piano playing

Abstract: The piano is a complex instrument, which humans learn to play only after many years of practice. This paper investigates the complex dynamics of the embodied interactions between a human and a piano, in order to gain insights into the nature of humans' physical dexterity and adaptability. In this context, the dynamic interactions become particularly crucial for the delicate expressions often present in advanced music pieces, which are the main focus of this paper. This paper hypothesises that the relationship between motor c…
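The abstract describes a Gaussian process (GP) model of the mapping between robot motor commands and the resulting piano sound. As a rough, self-contained illustration of that kind of forward model (not the authors' actual pipeline; the input and output features and the data below are invented for the sketch), the following fits a GP mapping a hypothetical key-press velocity to note loudness, then naively inverts it to pick an actuation for a desired loudness:

```python
# Minimal sketch of GP inference for an actuation -> sound mapping.
# Assumptions (not from the paper): a 1-D input "key-press velocity"
# and a scalar output "note loudness"; the training data is synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic training data: key-press velocities and measured loudness.
velocities = rng.uniform(0.05, 0.5, size=(30, 1))
loudness = 40 + 60 * np.tanh(4 * velocities[:, 0]) + rng.normal(0, 1.5, 30)

# RBF kernel for a smooth velocity->loudness map, plus a white-noise
# term to account for measurement noise.
kernel = 1.0 * RBF(length_scale=0.1) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(velocities, loudness)

# Predict mean and uncertainty over a grid of candidate velocities.
grid = np.linspace(0.05, 0.5, 100).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)

# Naive inversion: choose the velocity whose predicted loudness is
# closest to a desired expressive target.
target = 75.0
best = grid[np.argmin(np.abs(mean - target))]
print(f"velocity for ~{target} dB target: {best[0]:.3f} (posterior std {std.min():.2f})")
```

The posterior standard deviation returned alongside the mean is what distinguishes GP inference from a plain regression fit: it quantifies how confident the model is at each candidate actuation, which matters when selecting commands for delicate expressive targets.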

Cited by 8 publications (4 citation statements) · References 18 publications
“…As an interdisciplinary field, robot music accompaniment has attracted the attention of computer scientists, musicians, and artists, while opening new possibilities for robot applications and music education. In this field, many researchers have achieved significant results, including the development of algorithms that can automatically compose music [28], the combination of robots and musical instruments to achieve human-machine collaboration [29], and the design of intelligent systems that can understand music and dance [30]. PepperOSC [31] connects the Pepper and NAO robots to sound production tools, which improves the effectiveness and appeal of human-robot interaction.…”
Section: Related Work
confidence: 99%
“…For instance, Fei et al (2019) showed how a robot played a dulcimer using a self-learning method trained on three types of information: the tones of adjacent notes, the time intervals in a musical piece, and the decisions made by human performers during real performances. Scimeca et al (2020), in turn, used a minimalist experimental platform, a robotic arm equipped with a single elastic finger, to systematically examine both the motor control and the resulting piano sounds. Miller (2020) experimented with robots that played live neoclassical jazz combined with free improvisation.…”
Section: Robotics and Music
confidence: 99%
“…Piano playing is a particularly interesting challenge because it requires extreme dexterity, adaptability, and behavioral richness to achieve a range of expressive playing styles [49]. A number of researchers have aimed to build physical prototypes [50][51][52][53][54][55][56][57], data-driven virtual prototypes [49,58], and non-data-driven simulation approaches [59] to manipulate this complex musical instrument. However, these have failed to mimic the biological neural and muscular activities, as well as the complex passive dynamics of the interaction between a physiologically accurate body and a piano, resulting in a significant reality gap between the models and physical reality.…”
Section: Introduction
confidence: 99%