Musical emotions, such as happiness and sadness, have been investigated using instrumental music devoid of linguistic content. However, pop and rock, the most common musical genres, utilize lyrics for conveying emotions. Using participants’ self-selected musical excerpts, we studied their behavior and brain responses to elucidate how lyrics interact with musical emotion processing, as reflected by emotion recognition and activation of limbic areas involved in affective experience. We extracted samples from subjects’ selections of sad and happy pieces and sorted them according to the presence of lyrics. Acoustic feature analysis showed that music with lyrics differed from music without lyrics in spectral centroid, a feature related to perceptual brightness, whereas sad music with lyrics did not diverge from happy music without lyrics, indicating the role of other factors in emotion classification. Behavioral ratings revealed that happy music without lyrics induced stronger positive emotions than happy music with lyrics. We also acquired functional magnetic resonance imaging data while subjects performed affective tasks regarding the music. First, using ecological and acoustically variable stimuli, we broadened previous findings about the brain processing of musical emotions and of songs versus instrumental music. Additionally, contrasts between sad music with versus without lyrics recruited the parahippocampal gyrus, the amygdala, the claustrum, the putamen, the precentral gyrus, the medial and inferior frontal gyri (including Broca’s area), and the auditory cortex, while the reverse contrast produced no activations. Happy music without lyrics activated structures of the limbic system and the right pars opercularis of the inferior frontal gyrus, whereas auditory regions alone responded to happy music with lyrics. These findings point to the role of acoustic cues for the experience of happiness in music and to the importance of lyrics for sad musical emotions.
Music is a common means for regulating affective states in everyday life, but little is known about the individual differences in this behaviour. We investigated affective reactions to musical stimuli as an explanatory factor. Forty-four young adults rated self-selected music regarding perceived and felt emotions, preference, pleasantness and beauty. The ratings were reduced into five factors representing affective response tendencies. The participants also completed the Music in Mood Regulation (MMR) questionnaire, which assesses seven music-related mood regulation strategies in everyday life. High beauty and pleasantness ratings for liked music correlated with the use of music for inducing strong emotional experiences, while ratings reflecting high agreement with the emotional content of preferred musical stimuli correlated with using music as a means of dealing with personal negative emotions. Regarding musical background, informal engagement through listening, but not formal musical training, correlated with increased use of music for mood regulation. The results clarify the link between affective reactivity to music and the individual ways of using music as a tool for emotional self-regulation in everyday life.
From an early age, children are attracted to the aesthetics of music. Employing a cross-sectional design including school-aged children, the present exploratory study aimed to investigate the effects of age, gender, and music education on three important aspects of the aesthetic experience of music: musical preference, musical emotion recognition, and the use of aesthetic categories for music. To this aim, we developed an experimental procedure suitable for quantifying children's musical preferences and their judgments of musical emotions and aesthetics. The musical material consisted of three short piano pieces: a piece in major mode, a piece in minor mode, and a free tonal piece. The responses of 78 children were analyzed, with the children assigned to two age groups: 6-7-year-olds (n = 38) and 8-9-year-olds (n = 40). Children preferred the piece in major mode to the one in minor. Except for 6-7-year-olds without music education, children gave the highest happiness ratings to the major piece. Only 8-9-year-olds found the minor piece sadder than the major piece, and the major piece more beautiful than the piece in minor. The ratings of the free tonal piece were mostly indifferent and probably reflect children's difficulty in judging music that does not yet belong to their short musical history. Taken together, the current data imply that school-aged children are able to make emotional and aesthetic judgments about unfamiliar musical pieces.