Music makes us move. Several factors can affect the characteristics of such movements, including individual factors and musical features. In this study, we investigated the effect of rhythm- and timbre-related musical features, as well as tempo, on movement characteristics. Sixty participants were presented with 30 musical stimuli representing different styles of popular music and instructed to move along with the music. Optical motion capture was used to record the participants' movements. Subsequently, eight movement features and four rhythm- and timbre-related musical features were computationally extracted from the data, while tempo was assessed in a perceptual experiment. A subsequent correlational analysis revealed, for instance, that clear pulses seemed to be embodied with the whole body, i.e., through various movement types of different body parts, whereas spectral flux and percussiveness were more distinctly related to particular body parts, such as head and hand movement. A series of ANOVAs, with the stimuli divided into three groups of five stimuli each based on tempo, revealed no significant differences between the groups, suggesting that the tempo of our stimulus set had no effect on the movement features. In general, the results can be linked to the framework of embodied music cognition, as they show that body movements are used to reflect, imitate, and predict musical characteristics.
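The correlational analysis described above can be illustrated with a minimal sketch. The feature names and values below are hypothetical placeholders, not the study's data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length feature vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Illustrative only: pulse clarity per stimulus vs. amount of head movement
# per stimulus, each averaged across participants.
pulse_clarity = [0.2, 0.5, 0.7, 0.9, 0.4, 0.8]
head_movement = [1.1, 2.0, 2.6, 3.2, 1.8, 2.9]
r = pearson_r(pulse_clarity, head_movement)
```

In a study of this kind, one such coefficient would be computed for each pairing of a movement feature with a musical feature.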
Music has the capacity to induce movement in humans. Such responses during music listening are usually spontaneous and range from tapping to full-body dancing. However, it is still unclear how humans embody musical structures to facilitate entrainment. This paper describes two experiments: one dealing with period locking to different metrical levels in full-body movement and its relationship to beat- and rhythm-related musical characteristics, the other dealing with phase locking in the more constrained condition of sideways swaying motions. In Experiment 1, it was expected that music with clear and strong beat structures would facilitate more period-locked movement; Experiment 2 was expected to yield a common phase relationship between participants' swaying movements and the musical beat. In both experiments, optical motion capture was used to record participants' movements. In Experiment 1, a window-based period-locking probability index related to four metrical levels was established, based on acceleration data in three dimensions. Subsequent correlations between this index and musical characteristics of the stimuli revealed pulse clarity to be related to periodic movement at the tactus level, and low-frequency flux to mediolateral and anteroposterior movement at both tactus and bar levels. At faster tempi, higher metrical levels became more apparent in participants' movements. Experiment 2 showed that about half of the participants exhibited a stable phase relationship between movement and beat, with superior-inferior movement most often synchronized to the tactus level, whereas mediolateral movement was more often synchronized to the bar level. However, the relationship between movement phase and beat locations was not consistent across participants, as the beats occurred at different phase angles of their movements. The results imply that entrainment to music is a complex phenomenon, involving the whole body and occurring at different metrical levels.
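The paper's exact window-based period-locking index is not given here, but a crude autocorrelation-based stand-in (an assumption, not the published method) can illustrate the idea of testing whether movement within one analysis window is periodic at a candidate metrical period:

```python
import numpy as np

def period_locked(accel, period, fs, tol=0.1):
    """Rough period-locking test for one analysis window: does the
    autocorrelation of a (one-dimensional) acceleration signal peak near
    the candidate metrical period, in seconds, within a relative tolerance?
    `fs` is the sampling rate in Hz."""
    x = np.asarray(accel, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    # search only around the candidate period, avoiding the zero-lag peak
    lo, hi = int(0.5 * period * fs), int(1.5 * period * fs)
    best_lag = lo + int(np.argmax(ac[lo:hi]))
    return abs(best_lag / fs - period) / period <= tol
```

For example, a 2 Hz sinusoidal sway tested against a 0.5 s tactus period passes, while the same sway tested against an unrelated period fails. The published index would additionally aggregate such decisions over windows and metrical levels into a probability.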
Listening to music makes us move in various ways. Several factors can affect the characteristics of these movements, including individual factors and musical features. Additionally, music-induced movement may be shaped by the emotional content of the music, since emotions are an important element of musical expression. This study investigates possible relationships between the emotional characteristics of music and music-induced, quasi-spontaneous movement. We recorded the music-induced movement of 60 individuals and computationally extracted features from the movement data. Additionally, the emotional content of the stimuli was assessed in a perceptual experiment. A subsequent correlational analysis revealed characteristic movement features for each emotion, suggesting that the body reflects emotional qualities of music. The results show similarities to the movements of professional musicians and dancers, and to emotion-specific nonverbal behavior in general, and can furthermore be linked to notions of embodied music cognition. The valence and arousal ratings were subsequently projected onto polar coordinates to further investigate connections between the emotions of Russell's (1980) circumplex model and the movement features.
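The projection of valence and arousal ratings onto polar coordinates can be sketched as follows, assuming the ratings are centered on the neutral midpoint of their scales (an assumption; the study's exact scaling is not stated here):

```python
import math

def circumplex_polar(valence, arousal):
    """Project a (valence, arousal) rating onto polar coordinates, as when
    locating a stimulus on Russell's (1980) circumplex: the angle indicates
    the emotion region, the radius its intensity."""
    radius = math.hypot(valence, arousal)
    angle = math.degrees(math.atan2(arousal, valence)) % 360.0
    return radius, angle

# Positive valence with high arousal falls in the upper-right quadrant
# of the circumplex (e.g., happy/excited), at a 45-degree angle.
r, theta = circumplex_polar(0.5, 0.5)
```

Correlating movement features with angle and radius, rather than with raw valence and arousal, makes it possible to relate them to positions on the circumplex directly.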
Previous studies have found relationships between music-induced movement and musical characteristics at more general levels, such as tempo or pulse clarity. This study focused on synchronization to music of finely varying tempi and varying degrees of low-frequency spectral change (flux). Excerpts from six classic Motown/R&B songs at three different tempi (105, 115, and 130 BPM) were used as stimuli. Each was time-stretched by 5% relative to its original tempo, yielding a total of 12 stimuli that were presented to 30 participants. Participants were asked to move along with the stimuli while being recorded with an optical motion capture system. Synchronization was analyzed relative to the beat and bar levels of the music for four body parts. Results suggest that participants synchronized different body parts to specific metrical levels: in particular, vertical movements of the hip and feet were synchronized to the beat level when the music contained large amounts of low-frequency spectral flux and had a slower tempo, while synchronization of the head and hands was more tightly coupled to the weak-flux stimuli at the bar level. Synchronization was generally tighter for the slower version of each stimulus, while at the bar level it showed an inverted-U effect as tempo increased. These results indicate complex relationships between musical characteristics, in particular metrical and temporal structure, and our ability to synchronize and entrain to such stimuli.
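Synchronization analyses of this kind commonly quantify phase-locking with circular statistics. A minimal sketch, assuming beat-relative phase angles have already been extracted from the motion capture data:

```python
import numpy as np

def mean_resultant_length(phases):
    """Circular concentration of beat-relative phase angles (in radians):
    1.0 means the body part moves at a perfectly consistent phase relative
    to the beat; values near 0 mean no consistent phase relationship."""
    z = np.exp(1j * np.asarray(phases, dtype=float))
    return float(np.abs(z.mean()))
```

Comparing this measure across body parts and metrical levels (beat vs. bar) would yield the kind of tightness-of-coupling comparisons reported above.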
This paper reports a study of the forms and functions of head movements produced in the dimension of depth in Finnish Sign Language (FinSL). Specifically, the paper describes and analyzes the phonetic forms and prosodic, grammatical, communicative, and textual functions of nods, head thrusts, nodding, and head pulls occurring in FinSL data consisting of a continuous dialogue recorded with motion capture technology. The analysis yields a novel classification of the kinematic characteristics and functional properties of the four types of head movement. However, it also reveals that there is no perfect correspondence between form and function in the head movements investigated.
The current study explores how individuals' tendency to empathize with others (trait empathy) modulates interaction and social entrainment in dyadic dance in a free-movement context, using perceptual and computationally derived measures. Stimuli consisting of 24 point-light animations were created using motion capture data selected from a sample of 99 dyads, based on self-reported trait empathy. Individuals whose Empathy Quotient (EQ) scores were in the top or bottom quartile of all scores were considered to have high or low empathy, respectively, and twelve dyads comprising four high-high, four low-low, and four high-low empathy combinations were identified. Animations of these dyads were presented to 33 participants, who rated the degree of interaction and movement similarity for each stimulus. Results showed a significant effect of empathy combination on perceived interactivity and perceived similarity. High-low stimuli were rated as significantly more interactive than either high-high or low-low stimuli, while high-high stimuli were rated as significantly less similar than high-low and low-low. Dyads' period-locking, bodily orientation, and amount of hand movement were all significantly correlated with the rated amount of interaction, while rated similarity was significantly related only to period-locking. The results suggest that period-locking is important for social entrainment to be perceived, but that other cues, such as bodily orientation and hand movement, also signal social entrainment during free dance movement.
Musical tempo is most strongly associated with the rate of the beat or "tactus," which may be defined as the most prominent rhythmic periodicity present in the music, typically in the range of 1.67–2 Hz. However, other factors such as rhythmic density, mean rhythmic inter-onset interval, metrical (accentual) structure, and rhythmic complexity can affect perceived tempo (Drake, Gros, & Penel, 1999; London, 2011). Visual information can also give rise to a perceived beat/tempo (Iversen et al., 2015), and auditory and visual temporal cues can interact and mutually influence each other (Soto-Faraco & Kingston, 2004; Spence, 2015). A five-part experiment was performed to assess the integration of auditory and visual information in judgments of musical tempo. Participants rated the speed of six classic R&B songs on a seven-point scale while observing an animated figure dancing to them. Participants were presented with original and time-stretched (±5%) versions of each song in audio-only, audio+video (A+V), and video-only conditions. In some videos, the animations showed spontaneous movements to the different time-stretched versions of each song; in others, they showed "vigorous" versus "relaxed" interpretations of the same auditory stimulus. Two main results were observed. First, in all conditions with audio, even though participants correctly ranked the original vs. time-stretched versions of each song, a song-specific tempo-anchoring effect was observed, such that sped-up versions of slower songs were judged to be faster than slowed-down versions of faster songs, even when their objective beat rates were the same. Second, when viewing a vigorous dancing figure in the A+V condition, participants gave faster tempo ratings than from the audio alone or when viewing the same audio paired with a relaxed dancing figure. The implications of this illusory tempo percept for cross-modal sensory integration and working memory are discussed, and an "energistic" account of tempo perception is proposed.
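The tactus range and time-stretch factor quoted above can be made concrete with a small sketch (the helper names are illustrative, not from the study):

```python
def bpm_to_hz(bpm):
    """Beat rate in Hz from beats per minute."""
    return bpm / 60.0

def stretched_tempi(bpm, factor=0.05):
    """Tempi of the slowed-down and sped-up versions after a +/-5%
    time-stretch, the factor used for the experiment's stimuli."""
    return bpm * (1 - factor), bpm * (1 + factor)

# 105 BPM corresponds to 1.75 Hz, inside the 1.67-2 Hz tactus range
rate = bpm_to_hz(105)               # 1.75
slow, fast = stretched_tempi(105)   # approximately 99.75 and 110.25 BPM
```

Note that a ±5% stretch of a 105 BPM song (about 99.75 and 110.25 BPM) stays well within the typical tactus range, which is what makes the song-specific tempo-anchoring effect possible: objectively equal beat rates can arise from different source songs.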