We investigated influences of auditory feedback, musical role, and note ratio on synchronization in ensemble performance. Pianists performed duets on a piano keyboard; the pianist playing the upper part was designated the leader and the other pianist was the follower. They received full auditory feedback, one-way feedback (leaders heard themselves while followers heard both parts), or self-feedback only. The upper part contained more, fewer, or equal numbers of notes relative to the lower part. Temporal asynchronies increased as auditory feedback decreased: The pianist playing more notes preceded the other pianist, and this tendency increased with reduced feedback. Interonset timing suggested bidirectional adjustments during full feedback despite the leader/follower instruction, and unidirectional adjustment only during reduced feedback. Motion analyses indicated that leaders raised fingers higher and pianists' head movements became more synchronized as auditory feedback was reduced. These findings suggest that visual cues became more important when auditory information was absent.
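The asynchrony measure underlying findings like these can be sketched as the signed difference between the two pianists' onset times for nominally simultaneous notes. This is an illustrative reconstruction, not the study's actual analysis code; the function name and sign convention are assumptions.

```python
# Illustrative sketch (not the study's code): asynchrony between two
# pianists is the signed difference in onset time for note pairs that
# the score treats as simultaneous.
def asynchronies(upper_onsets, lower_onsets):
    """Signed asynchronies in ms; negative values mean the upper-part
    pianist preceded the lower-part pianist on that note pair."""
    return [u - l for u, l in zip(upper_onsets, lower_onsets)]
```

Averaging these signed values over a performance would then indicate which performer tends to lead, and by how much.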
Sequential actions such as playing a piano or tapping in synchrony with an external signal put high cognitive and motor demands on producers, including the generation of precise timing at a wide variety of rates. Tactile information from the fingertips has been shown to contribute to the control of timing in finger tapping tasks. We addressed the hypothesis that reduction of timing errors is related to tactile afferent information in pianists' finger movements during performance. Twelve pianists performed melodies at four rates in a synchronization-continuation paradigm. The pianists' finger motion trajectories toward the piano keys, recorded with a motion capture system, contained different types and numbers of kinematic landmarks at different performance rates. One landmark, a finger-key (FK) landmark, can occur when the finger makes initial contact with the key surface and changes its acceleration abruptly. Overall, there were more FK landmarks in the pianists' keystrokes as the performance rate increased. The pianists were divided into two groups: those with low percentages of FK in the medium rates that increased with increasing performance rate and those with persistently high FK percentages. Low-FK pianists showed a positive relationship between increased tactile feedback from the current keystroke and increased temporal accuracy in the upcoming keystroke. These findings suggest that sensory information available at finger-key contact enhances the timing accuracy of finger movements in piano performance.
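One plausible way to detect an FK landmark of the kind described, an abrupt acceleration change at initial key contact, is to look for a jerk spike in the vertical fingertip trajectory. This is a hedged sketch under assumed parameters; the sampling rate, threshold, and detection rule are illustrative, not the study's method.

```python
import numpy as np

def fk_landmark(z, fs, jerk_thresh):
    """Return the index of the first abrupt acceleration change in the
    vertical fingertip position z (sampled at fs Hz), or None.
    jerk_thresh is an illustrative detection threshold."""
    dt = 1.0 / fs
    acc = np.gradient(np.gradient(z, dt), dt)   # numerical acceleration
    jerk = np.abs(np.gradient(acc, dt))         # rate of change of acceleration
    hits = np.flatnonzero(jerk > jerk_thresh)
    return int(hits[0]) if hits.size else None

# Synthetic trajectory: constant descent that stops abruptly at sample
# 50, mimicking the finger meeting the key surface.
z = np.concatenate([-0.001 * np.arange(50), np.full(50, -0.049)])
contact = fk_landmark(z, fs=1000.0, jerk_thresh=1e4)
```

On this synthetic trajectory the detector flags the samples immediately around the simulated contact, where acceleration changes discontinuously.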
This contribution gives an overview of the state of the art in the field of computational modeling of expressive music performance. The notion of a predictive computational model is briefly discussed, and a number of quantitative models of various aspects of expressive performance are briefly reviewed. Four selected computational models are reviewed in some detail. Their basic principles and assumptions are explained and, wherever possible, empirical evaluations of the models on real performance data are reported. In addition to these models, which focus on general, common principles of performance, currently ongoing research on the formal characterisation of differences in individual performance style is briefly presented.
Nonverbal auditory and visual communication helps ensemble musicians predict each other’s intentions and coordinate their actions. When structural characteristics of the music make predicting co-performers’ intentions difficult (e.g., following long pauses or during ritardandi), reliance on incoming auditory and visual signals may change. This study tested whether attention to visual cues during piano–piano and piano–violin duet performance increases in such situations. Pianists performed the secondo part to three duets, synchronizing with recordings of violinists or pianists playing the primo parts. Secondos’ access to incoming audio and visual signals and to their own auditory feedback was manipulated. Synchronization was most successful when primo audio was available, deteriorating when primo audio was removed and only cues from primo visual signals were available. Visual cues were used effectively following long pauses in the music, however, even in the absence of primo audio. Synchronization was unaffected by the removal of secondos’ own auditory feedback. Differences were observed in how successfully piano–piano and piano–violin duos synchronized, but these effects of instrument pairing were not consistent across pieces. Pianists’ success at synchronizing with violinists and other pianists is likely moderated by piece characteristics and individual differences in the clarity of cueing gestures used.
Skilled piano performance requires considerable movement control to accomplish the high levels of timing and force precision common among professional musicians, who acquire piano technique over decades of practice. Finger movement efficiency in particular is an important factor when pianists perform at very fast tempi. We document the finger movement kinematics of highly skilled pianists as they performed a five-finger melody at very fast tempi. A three-dimensional motion-capture system tracked the movements of finger joints, the hand, and the forearm of twelve pianists who performed on a digital piano at successively faster tempi (7–16 tones/s) until they decided to stop. Joint angle trajectories computed for all adjacent finger phalanges, the hand, and the forearm (wrist angle) indicated that the metacarpophalangeal joint contributed most to the vertical fingertip motion while the proximal and distal interphalangeal joints moved slightly opposite to the movement goal (finger extension). An efficiency measure of the combined finger joint angles corresponded to the temporal accuracy and precision of the pianists’ performances: Pianists with more efficient keystroke movements showed higher precision in timing and force measures. Keystroke efficiency and individual joint contributions remained stable across tempo conditions. Individual differences among pianists supported the view that keystroke efficiency is required for successful fast performance.
As reported in the recent literature on piano performance, an emphasized voice (the melody) tends to be played not only louder than the other voices, but also about 30 ms earlier (melody lead). It remains unclear whether pianists deliberately apply melody lead to separate different voices, or whether it occurs because the melody is played louder (velocity artifact). The velocity artifact explanation implies that pianists initially strike the keys simultaneously; it is only different velocities that make the hammers arrive at different points in time. The measured note onsets in these studies, mostly derived from computer-monitored pianos, represent the hammer-string impact times. In the present study, the finger-key contact times are calculated and analyzed as well. If the velocity artifact hypothesis is correct, the melody lead phenomenon should disappear at the finger-key level. Chopin's Ballade op. 38 (45 measures) and Etude op. 10/3 (21 measures) were performed on a Bösendorfer computer-monitored grand piano by 22 skilled pianists. The hammer-string asynchronies among voices closely resemble the results reported in the literature. However, the melody lead decreases almost to zero at the finger-key level, which supports the velocity artifact hypothesis. In addition to this, expected onset asynchronies are predicted from differences in hammer velocity, if finger-key asynchronies are assumed to be zero. They correlate highly with the observed melody lead.
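The velocity-artifact logic above can be made concrete with a minimal constant-velocity travel model: if finger-key contacts are simultaneous, the faster (louder) melody hammer covers its remaining travel distance sooner and so strikes the string earlier. This is a simplifying assumption for illustration only; the function, the travel distance, and the velocities below are hypothetical, not the study's computation.

```python
# Minimal sketch under an assumed constant-velocity travel model:
# simultaneous finger-key contacts plus unequal hammer velocities
# yield unequal hammer-string arrival times (the melody lead).
def predicted_melody_lead(v_melody, v_accomp, travel_m=0.005):
    """Predicted hammer-string asynchrony in ms (positive = melody
    arrives early) for final hammer velocities in m/s; travel_m is an
    illustrative hammer travel distance."""
    travel_ms = lambda v: travel_m / v * 1000.0
    return travel_ms(v_accomp) - travel_ms(v_melody)
```

For example, a melody hammer at 5 m/s against an accompaniment hammer at 2.5 m/s over 5 mm of travel yields a predicted lead of 1 ms in this toy model; real hammer motion is not constant-velocity, which is why the study predicts asynchronies from measured velocity differences instead.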
Skilled ensemble musicians coordinate with high precision, even when improvising or interpreting loosely defined notation. Successful coordination is supported primarily through shared attention to the musical output; however, musicians also interact visually, particularly when the musical timing is irregular. This study investigated the performance conditions that encourage visual signaling and interaction between ensemble members. Piano and clarinet duos rehearsed a new piece as their body motion was recorded. Analyses of head movement showed that performers communicated gesturally following held notes. Gesture patterns became more consistent as duos rehearsed, though consistency dropped again during a final performance given under no-visual-contact conditions. Movements were smoother and interperformer coordination was stronger during irregularly timed passages than elsewhere in the piece, suggesting heightened visual interaction. Performers moved more after rehearsing than before, and more when they could see each other than when visual contact was occluded. Periods of temporal instability and increased familiarity with the music and co-performer seem to encourage visual interaction, while specific communicative gestures are integrated into performance routines through rehearsal. We propose that visual interaction may support successful ensemble performance by affirming coordination throughout periods of temporal instability and serving as a social motivator to promote creative risk-taking.
Ensemble musicians often exchange visual cues in the form of body gestures (e.g., rhythmic head nods) to help coordinate piece entrances. These cues must communicate beats clearly, especially if the piece requires interperformer synchronization of the first chord. This study aimed to (1) replicate prior findings suggesting that points of peak acceleration in head gestures communicate beat position and (2) identify the kinematic features of head gestures that encourage successful synchronization. It was expected that increased precision of the alignment between leaders’ head gestures and first note onsets, increased gesture smoothness, magnitude, and prototypicality, and increased leader ensemble/conducting experience would improve gesture synchronizability. Audio/MIDI and motion capture recordings were made of piano duos performing short musical passages under assigned leader/follower conditions. The leader of each trial listened to a particular tempo over headphones, then cued their partner in at the given tempo, without speaking. A subset of motion capture recordings were then presented as point-light videos with corresponding audio to a sample of musicians who tapped in synchrony with the beat. Musicians were found to align their first taps with the period of deceleration following acceleration peaks in leaders’ head gestures, suggesting that acceleration patterns communicate beat position. Musicians’ synchronization with leaders’ first onsets improved as cueing gesture smoothness and magnitude increased and prototypicality decreased. Synchronization was also more successful with more experienced leaders’ gestures. These results might be applied to interactive systems using gesture recognition or reproduction for music-making tasks (e.g., intelligent accompaniment systems).