The urge to move in response to music, combined with the positive affect that accompanies the seemingly effortless coupling of sensory and motor processes while engaging with music (referred to as sensorimotor coupling), is commonly described as the feeling of being in the groove. Here, we systematically explore this compelling phenomenon in a population of young adults. We utilize multiple levels of analysis, comprising phenomenological, behavioral, and computational techniques. Specifically, we show (a) that the concept of the groove is widely appreciated and understood in terms of a pleasurable drive toward action, (b) that a broad range of musical excerpts can be appraised reliably for the degree of perceived groove, (c) that the degree of experienced groove is inversely related to experienced difficulty of bimanual sensorimotor coupling under tapping regimes with varying levels of expressive constraint, (d) that high-groove stimuli elicit spontaneous rhythmic movements, and (e) that quantifiable measures of the quality of sensorimotor coupling predict the degree of experienced groove. Our results complement traditional discourse regarding the groove, which has tended to take the psychological phenomenon for granted and has focused instead on the musical and especially the rhythmic qualities of particular genres of music that lead to the perception of groove. We conclude that groove can be treated as a psychological construct and model system that allows for experimental exploration of the relationship between sensorimotor coupling with music and emotion.
Participants listened to randomly selected excerpts of popular music and rated how nostalgic each song made them feel. Nostalgia was stronger to the extent that a song was autobiographically salient, arousing, and familiar, and to the extent that it elicited a greater number of positive, negative, and mixed emotions. These effects were moderated by individual differences (nostalgia proneness, mood state, dimensions of the Affective Neurosciences Personality Scale, and factors of the Big Five Inventory). Nostalgia proneness predicted stronger nostalgic experiences, even after controlling for other individual difference measures. Nostalgia proneness was in turn predicted by the Sadness dimension of the Affective Neurosciences Personality Scale and by Neuroticism of the Big Five Inventory. Nostalgia was associated with both joy and sadness, whereas nonnostalgic and nonautobiographical experiences were associated with irritation.
Despite music's prominence in Western society and its importance to individuals in their daily lives, very little is known about the memories and emotions that are often evoked when hearing a piece of music from one's past. We examined the content of music-evoked autobiographical memories (MEAMs) using a novel approach for selecting stimuli from a large corpus of popular music, in both laboratory and online settings. A set of questionnaires probed the cognitive and affective properties of the evoked memories. On average, 30% of the song presentations evoked autobiographical memories, and the majority of songs also evoked various emotions, primarily positive, that were felt strongly. The third most common emotion was nostalgia. Analyses of written memory reports found both general and specific levels of autobiographical knowledge to be represented, and several social and situational contexts for memory formation were common across many memories. The findings indicate that excerpts of popular music serve as potent stimuli for studying the structure of autobiographical memories.
Western tonal music relies on a formal geometric structure that determines distance relationships within a harmonic or tonal space. In functional magnetic resonance imaging experiments, we identified an area in the rostromedial prefrontal cortex that tracks activation in tonal space. Different voxels in this area exhibited selectivity for different keys. Within the same set of consistently activated voxels, the topography of tonality selectivity rearranged itself across scanning sessions. The tonality structure was thus maintained as a dynamic topography in cortical areas known to be at a nexus of cognitive, affective, and mnemonic processing.
The medial prefrontal cortex (MPFC) is regarded as a region of the brain that supports self-referential processes, including the integration of sensory information with self-knowledge and the retrieval of autobiographical information. I used functional magnetic resonance imaging and a novel procedure for eliciting autobiographical memories with excerpts of popular music dating to one's extended childhood to test the hypothesis that music and autobiographical memories are integrated in the MPFC. Dorsal regions of the MPFC (Brodmann area 8/9) were shown to respond parametrically to the degree of autobiographical salience experienced over the course of individual 30 s excerpts. Moreover, the dorsal MPFC also responded on a second, faster timescale corresponding to the signature movements of the musical excerpts through tonal space. These results suggest that the dorsal MPFC associates music and memories when we experience emotionally salient episodic memories that are triggered by familiar songs from our personal past. MPFC acted in concert with lateral prefrontal and posterior cortices both in terms of tonality tracking and overall responsiveness to familiar and autobiographically salient songs. These findings extend the results of previous autobiographical memory research by demonstrating the spontaneous activation of an autobiographical memory network in a naturalistic task with low retrieval demands.
Music consists of precisely patterned sequences of both movement and sound that engage the mind in a multitude of experiences. We move in response to music and we move in order to make music. Because of the intimate coupling between perception and action, music provides a panoramic window through which we can examine the neural organization of complex behaviors that are at the core of human nature. Although the cognitive neuroscience of music is still in its infancy, a considerable behavioral and neuroimaging literature has amassed that pertains to neural mechanisms that underlie musical experience. Here we review neuroimaging studies of explicit sequence learning and temporal production, findings that ultimately lay the groundwork for understanding how more complex musical sequences are represented and produced by the brain. These studies are also brought into an existing framework concerning the interaction of attention and time-keeping mechanisms in perceiving complex patterns of information that are distributed in time, such as those that occur in music.
Polyphonic music combines multiple auditory streams to create complex auditory scenes, thus providing a tool for investigating the neural mechanisms that orient attention in natural auditory contexts. Across two fMRI experiments, we varied stimuli and task demands in order to identify the cortical areas that are activated during attentive listening to real music. In individual experiments and in a conjunction analysis of the two experiments, we found bilateral blood oxygen level dependent (BOLD) signal increases in temporal (the superior temporal gyrus), parietal (the intraparietal sulcus), and frontal (the precentral sulcus, the inferior frontal sulcus and gyrus, and the frontal operculum) areas during selective and global listening, as compared with passive rest without musical stimulation. Direct comparisons of the listening conditions showed significant differences between attending to single timbres (instruments) and attending across multiple instruments, although the patterns that were observed depended on the relative demands of the tasks being compared. The overall pattern of BOLD signal increases indicated that attentive listening to music recruits neural circuits underlying multiple forms of working memory, attention, semantic processing, target detection, and motor imagery. Thus, attentive listening to music appears to be enabled by areas that serve general functions, rather than by music-specific cortical modules.