Highlights
- Musical pleasure depends on prospective and retrospective states of expectation
- A machine-learning model quantified the uncertainty and surprise of pop song chords
- Chords with low uncertainty and high surprise, and vice versa, evoked high pleasure
- Joint effects of uncertainty and surprise found in the amygdala and auditory cortex
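The "uncertainty" and "surprise" in the highlights above are commonly operationalised in information-theoretic terms: uncertainty as the Shannon entropy of a model's predictive distribution over the next chord (before it sounds), and surprise as the surprisal, -log2 p, of the chord actually heard. A minimal sketch in Python, using a hypothetical predictive distribution; the actual machine-learning model trained on the Billboard corpus is not reproduced here:

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a predictive distribution over next chords:
    the model's uncertainty *before* the chord sounds."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def surprisal(dist, chord):
    """Information content (bits) of the chord actually heard:
    the model's surprise *after* the chord sounds."""
    return -math.log2(dist[chord])

# Hypothetical predictive distribution over four candidate chords
predictive = {"C": 0.5, "F": 0.25, "G": 0.2, "Am": 0.05}

u = entropy(predictive)          # prospective uncertainty
s = surprisal(predictive, "Am")  # retrospective surprise at a rare chord
```

Note that the two quantities are dissociable: a listener can be quite certain about what comes next (low entropy) and still be strongly surprised when a low-probability chord arrives, which is the combination the highlights link to high pleasure.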
Complex auditory sequences known as music have often been described as hierarchically structured. This permits the existence of non-local dependencies, which relate elements of a sequence beyond their temporal sequential order. Previous studies in music have reported differential activity in the inferior frontal gyrus (IFG) when comparing regular and irregular chord-transitions based on theories in Western tonal harmony. However, it is unclear whether the observed activity reflects the interpretation of hierarchical structure, as the effects are confounded by local irregularity. Using functional magnetic resonance imaging (fMRI), we found that violations of non-local dependencies in nested sequences of three-tone musical motifs in musicians elicited increased activity in the right IFG. This is in contrast to similar studies in language, which typically report the left IFG in processing grammatical syntax. Effects of increasing auditory working memory demands are moreover reflected by distributed activity in frontal and parietal regions. Our study therefore demonstrates the role of the right IFG in processing non-local dependencies in music, and suggests that hierarchical processing in different cognitive domains relies on similar mechanisms that are subserved by domain-selective neuronal subpopulations.
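The nested sequences described above create non-local dependencies in the sense that each opening motif must be matched by its counterpart in reverse (mirror) order, so the first element depends on the last. A minimal sketch of that nesting logic in Python, with hypothetical motif labels standing in for the study's three-tone motifs:

```python
# Sketch: center-embedded sequences create non-local dependencies, where
# opening motifs must be closed by their partners in reverse order (A1 A2 B2 B1).

def is_well_nested(sequence, pairs):
    """Return True if every opening motif is matched by its partner
    in mirror order, i.e. the non-local dependencies are intact."""
    stack = []
    for motif in sequence:
        if motif in pairs:                      # opening motif: expect partner later
            stack.append(pairs[motif])
        elif not stack or stack.pop() != motif:
            return False                        # partner missing or out of order
    return not stack                            # no unclosed dependencies remain

pairs = {"A1": "B1", "A2": "B2"}               # hypothetical motif pairings
ok = is_well_nested(["A1", "A2", "B2", "B1"], pairs)         # dependency intact
violation = is_well_nested(["A1", "A2", "B1", "B2"], pairs)  # dependency violated
```

The second sequence is locally plausible (every motif has a partner somewhere) but violates the non-local, nested ordering, which is the kind of contrast that isolates hierarchical structure from mere local irregularity.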
Neurobiological models of emotion focus traditionally on limbic/paralimbic regions as neural substrates of emotion generation, and insular cortex (in conjunction with isocortical anterior cingulate cortex, ACC) as the neural substrate of feelings. An emerging view, however, highlights the importance of isocortical regions beyond insula and ACC for the subjective feeling of emotions. We used music to evoke feelings of joy and fear, and multivariate pattern analysis (MVPA) to decode representations of feeling states in functional magnetic resonance imaging (fMRI) data of n = 24 participants. Most of the brain regions providing information about feeling representations were neocortical regions. These included, in addition to granular insula and cingulate cortex, primary and secondary somatosensory cortex, premotor cortex, frontal operculum, and auditory cortex. The multivoxel activity patterns corresponding to feeling representations emerged within a few seconds, gained in strength with increasing stimulus duration, and replicated results of a hypothesis-generating decoding analysis from an independent experiment. Our results indicate that several neocortical regions (including insula, cingulate, somatosensory and premotor cortices) are important for the generation and modulation of feeling states. We propose that secondary somatosensory cortex, which covers the parietal operculum and encroaches on the posterior insula, is of particular importance for the encoding of emotion percepts, i.e., preverbal representations of subjective feeling.
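The core of the MVPA logic above is that a classifier trained on multivoxel activity patterns can predict the feeling label (joy vs. fear) of held-out trials above chance, implying that the region carries information about feeling states. A minimal sketch of that logic in Python with NumPy, using synthetic "voxel" data and a simple nearest-centroid decoder with leave-one-trial-out cross-validation; the study's actual pipeline and classifier are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multivoxel patterns: 40 trials x 100 voxels, two feeling labels.
n_trials, n_voxels = 40, 100
labels = np.array([0, 1] * (n_trials // 2))  # 0 = joy, 1 = fear (hypothetical)
signal = np.zeros((2, n_voxels))
signal[1, :20] = 1.0                         # class difference in 20 voxels
X = signal[labels] + rng.normal(0.0, 1.0, (n_trials, n_voxels))

def decode_loo(X, y):
    """Leave-one-trial-out nearest-centroid decoding accuracy."""
    hits = 0
    for i in range(len(y)):
        train = np.ones(len(y), dtype=bool)
        train[i] = False                      # hold out trial i
        centroids = np.array([X[train & (y == c)].mean(axis=0) for c in (0, 1)])
        pred = np.argmin(np.linalg.norm(centroids - X[i], axis=1))
        hits += pred == y[i]
    return hits / len(y)

accuracy = decode_loo(X, labels)  # above-chance accuracy implies pattern information
```

Decoding accuracy reliably above the 50% chance level on held-out trials is the criterion by which a region is said to "provide information" about feeling representations.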
Semantic knowledge is central to human cognition. The angular gyrus (AG) is widely considered a key brain region for semantic cognition. However, the role of the AG in semantic processing is controversial. Key controversies concern response polarity (activation vs. deactivation) and its relation to task difficulty, lateralization (left vs. right AG), and functional–anatomical subdivision (PGa vs. PGp subregions). Here, we combined the fMRI data of five studies on semantic processing (n = 172) and analyzed the response profiles from the same anatomical regions-of-interest for left and right PGa and PGp. We found that the AG was consistently deactivated during non-semantic conditions, whereas response polarity during semantic conditions was inconsistent. However, the AG consistently showed relative response differences between semantic and non-semantic conditions, and between different semantic conditions. A combined analysis across all studies revealed that AG responses could be best explained by separable effects of task difficulty and semantic processing demand. Task difficulty effects were stronger in PGa than PGp, regardless of hemisphere. Semantic effects were stronger in left than right AG, regardless of subregion. These results suggest that the AG is engaged in both domain-general task-difficulty-related processes and domain-specific semantic processes. In semantic processing, we propose that left AG acts as a “multimodal convergence zone” that binds different semantic features associated with the same concept, enabling efficient access to task-relevant features.
Expectation is crucial for our enjoyment of music, yet the underlying generative mechanism remains contested. While sensory–acoustic models derive predictions based on the short-term auditory input alone, cognitive models assume the use of abstract knowledge of music structure acquired over the long-term. To evaluate these two contrasting mechanisms, we compared simulations from computational models of musical expectancy against subjective surprise ratings of chords sampled from US Billboard pop songs in musicians and non-musicians. Bayesian model comparison revealed that probabilistic knowledge of music structure and auditory short-term memory both explained unique behavioural variance without mediation. However, probabilistic knowledge accounted for nearly four times as much variance in musicians, and over twice as much in non-musicians. Moreover, incorporating probabilistic knowledge and auditory short-term memory together improved predictive accuracy over either model alone. Our findings thus motivate an alternative to the current debate by emphasising the distinct, albeit complementary, roles of cognitive and sensory information in forming expectations during music listening in humans.
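The argument above rests on a variance-partitioning logic: the cognitive and sensory models each explain unique variance in surprise ratings, and a combined model outperforms either alone. A minimal sketch of that logic with ordinary least squares in NumPy, on synthetic data; the paper used Bayesian model comparison, which this does not reproduce:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-chord predictors and ratings: ratings reflect both sources.
n = 200
cognitive = rng.normal(size=n)  # e.g. surprisal from a statistical model of harmony
sensory = rng.normal(size=n)    # e.g. dissimilarity to a short-term auditory trace
ratings = 0.7 * cognitive + 0.3 * sensory + rng.normal(0.0, 0.5, n)

def r_squared(predictors, y):
    """Proportion of variance in y explained by an OLS fit (with intercept)."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_cog = r_squared([cognitive], ratings)
r2_sen = r_squared([sensory], ratings)
r2_both = r_squared([cognitive, sensory], ratings)
unique_cog = r2_both - r2_sen  # variance only the cognitive predictor explains
unique_sen = r2_both - r2_cog  # variance only the sensory predictor explains
```

Because the single-predictor models are nested within the combined model, the combined R² can never be lower; the interesting empirical facts are that both unique contributions are positive and that the cognitive share is several times larger, especially in musicians.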