Abstract: Musicians are highly trained to discriminate fine pitch changes, but the neural bases of this ability are poorly understood. It is unclear whether such training-dependent differences in pitch processing arise already in the subcortical auditory system or are linked to more central stages. To address this question, we combined psychoacoustic testing with functional MRI to measure cortical and subcortical responses in musicians and non-musicians during a pitch-discrimination task. First, we estimated behavioral p…
“…For example, when playing piano, pressing a key to produce a certain pitch will, over time with practice, develop a key-to-pitch mapping (Maes et al., 2014). Years of musical training enhance auditory mechanisms in subcortical and cortical areas, not only in response to music, such as pitch perception (Kishon-Rabin et al., 2001; Schön et al., 2004; Zatorre et al., 2007; Barnett et al., 2017; Bianchi et al., 2017), but also to other audiovisual (AV) events such as speech (Patel, 2011). On this basis, the present study examines the role of previous musical experience in the N1 and P2 responses during AV music perception, in which visual cues from finger and hand movements can offer a prediction of the corresponding sound.…”
In audiovisual music perception, visual information from a musical instrument being played is available before the onset of the corresponding musical sound and therefore allows a perceiver to form a prediction about the upcoming music. Compared to auditory-only music perception, this prediction leads to lower N1 and P2 amplitudes and shorter latencies. Although previous research suggests that audiovisual experience, such as prior musical training, may enhance this prediction, it remains unclear to what extent musical experience modifies N1 and P2 amplitudes and latencies. Furthermore, the corresponding event-related phase modulations, quantified as inter-trial phase coherence (ITPC), have not previously been reported for audiovisual music perception. In the current study, audio-video recordings of a keyboard key being played were presented to musicians and non-musicians in audio-only (AO), video-only (VO), and audiovisual (AV) conditions. With the predictive movements of playing the keyboard isolated from AV music perception (AV-VO), the findings showed that, compared to the AO condition, both groups had a similar decrease in N1 amplitude and latency and in P2 amplitude, along with correspondingly lower ITPC values in the delta, theta, and alpha frequency bands. However, while musicians showed lower ITPC values in the beta band in AV-VO compared to AO, non-musicians did not show this pattern. These findings indicate that AV perception may be broadly correlated with auditory perception, and the differences between musicians and non-musicians further suggest that musical experience is a specific factor influencing AV perception. Predicting an upcoming sound in AV music perception may involve visual predictive processes as well as beta-band oscillations, which may be shaped by years of musical training. This study highlights possible interconnectivity in AV perception as well as its potential modulation by experience.
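The inter-trial phase coherence measure named above has a simple definition: the magnitude of the mean unit phase vector across trials at each time point and frequency. A minimal sketch, assuming phase angles have already been extracted per trial (e.g., via a wavelet or Hilbert transform in one frequency band); the function name and the toy data are illustrative, not taken from the study:

```python
# Minimal sketch of inter-trial phase coherence (ITPC).
# Assumes phases were already extracted per trial in a given band;
# the toy data below are illustrative, not from the study.
import numpy as np

def itpc(phases):
    """ITPC = magnitude of the mean unit phase vector across trials.

    phases: array of shape (n_trials, n_times), phase angles in radians.
    Returns values in [0, 1]: 1 = perfect phase alignment across trials,
    values near 0 = phases uniformly scattered across trials.
    """
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

# Toy check: identical phases across trials give ITPC = 1;
# uniformly random phases give ITPC near 0 as trial count grows.
rng = np.random.default_rng(0)
aligned = np.tile(rng.uniform(-np.pi, np.pi, size=50), (20, 1))  # 20 trials
random_ph = rng.uniform(-np.pi, np.pi, size=(2000, 50))
print(itpc(aligned).min())   # 1.0 up to float error
print(itpc(random_ph).max())  # small, shrinks as trials increase
```

In this framing, the study's lower ITPC values in AV-VO versus AO correspond to less consistent stimulus-locked phase alignment across trials.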
“…While it has been shown that processing effort increases with the processing demand of the listening condition for speech (Johnsrude and Rodd 2015), to the authors' knowledge, this is the first study to investigate pupil dilation during a pitch discrimination task with varying harmonic resolvability and task difficulty. While a previous study (Bianchi et al. 2014) measured pupil dilations for conditions in which harmonic resolvability and task difficulty varied concomitantly, a new experimental design was used here to disentangle the effects of resolvability and task difficulty on pupil dilation. In experiment 2, pitch discrimination thresholds were measured behaviorally at three F0s (i.e., three levels of resolvability) and at three different points of the psychometric function (i.e., three levels of task difficulty).…”
Musicians typically show enhanced pitch discrimination abilities compared to non-musicians. The present study investigated this perceptual enhancement behaviorally and objectively for resolved and unresolved complex tones to clarify whether the enhanced performance in musicians can be ascribed to increased peripheral frequency selectivity and/or to a different processing effort in performing the task. In a first experiment, pitch discrimination thresholds were obtained for harmonic complex tones with fundamental frequencies (F0s) between 100 and 500 Hz, filtered in either a low- or a high-frequency region, leading to variations in the resolvability of audible harmonics. The results showed that pitch discrimination performance in musicians was enhanced for resolved and unresolved complexes to a similar extent. Additionally, the harmonics became resolved at a similar F0 in musicians and non-musicians, suggesting similar peripheral frequency selectivity in the two groups of listeners. In a follow-up experiment, listeners’ pupil dilations were measured as an indicator of the required effort in performing the same pitch discrimination task for conditions of varying resolvability and task difficulty. Pupillometry responses indicated a lower processing effort in the musicians versus the non-musicians, although the processing demand imposed by the pitch discrimination task was individually adjusted according to the behavioral thresholds. Overall, these findings indicate that the enhanced pitch discrimination abilities in musicians are unlikely to be related to higher peripheral frequency selectivity and may suggest an enhanced pitch representation at more central stages of the auditory system in musically trained listeners.
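The "points of the psychometric function" used to set task difficulty can be made concrete with a standard logistic psychometric function for a two-alternative forced-choice task: each target proportion correct maps to a stimulus level (e.g., an F0 difference). A sketch under assumed parameters; the function names, logistic form, and all parameter values here are illustrative, not the study's actual fitting procedure:

```python
# Illustrative sketch: mapping target performance levels to stimulus
# levels via a logistic psychometric function (2AFC, guess rate 0.5).
# All parameter values are made up for illustration.
import math

def psychometric(x, threshold, slope, guess=0.5, lapse=0.0):
    """Proportion correct for stimulus level x (e.g., F0 difference in %).

    Performance rises from the guess rate toward 1 - lapse
    as x increases past the threshold.
    """
    return guess + (1 - guess - lapse) / (1 + math.exp(-slope * (x - threshold)))

def level_for_performance(p, threshold, slope, guess=0.5, lapse=0.0):
    """Invert the function: stimulus level yielding proportion correct p."""
    return threshold - math.log((1 - guess - lapse) / (p - guess) - 1) / slope

# Three "points of the psychometric function" = three difficulty levels:
for p in (0.60, 0.75, 0.90):
    x = level_for_performance(p, threshold=1.0, slope=3.0)
    print(f"target {p:.0%} correct -> stimulus level {x:.3f}")
```

With a 0.5 guess rate and no lapse, the 75%-correct point falls exactly at the threshold parameter, which is why individually measured thresholds can anchor the per-listener difficulty levels described above.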
“…Although there are dedicated pitch perception areas in the cerebral cortex (Patterson et al., 2002; Puschmann et al., 2010; De Angelis et al., 2018), the IC has also been associated with the representation of pitch (Chandrasekaran et al., 2012; Bianchi et al., 2017; reviewed in Gruters and Groh, 2012; Pannese et al., 2015). Thus, the IC might be a candidate structure for explaining altered vocal pitch perception in ASD.…”
Autism spectrum disorder (ASD) is a clinical condition associated with deficient processing of communication signals. To date, most neuroscience research on ASD has focused on explaining these symptoms at the level of the cerebral cortex or in limbic structures. This research implicitly or explicitly assumes that the subcortical sensory pathways are intact in ASD. To date, however, the integrity of subcortical sensory pathway nuclei in ASD has never been investigated in humans in vivo. Here, we assessed the functional integrity of auditory sensory pathway nuclei in ASD in three independent functional magnetic resonance imaging (fMRI) experiments. We focused on two aspects of auditory communication that are impaired in ASD: voice identity perception and speech-in-noise recognition. Adults with ASD (n = 16 in experiments 1 and 2, and n = 17 in another sample in experiment 3) and typically developed control groups were pairwise matched on sex, age, handedness, and full-scale IQ. We found reduced processing in the ASD groups as compared to the control groups in the central midbrain structure of the auditory pathway (inferior colliculus, IC; all results FWE corrected and Bonferroni corrected for the number of regions tested). The right IC responded less in the ASD group as compared to the control group for voice identity recognition, in contrast to a speech recognition task. The right IC also responded less in the ASD group as compared to the control group when passively listening to vocal sounds in contrast to non-vocal sounds. For speech-in-noise recognition, we found that within the control group, the left and right IC responded more when performing a speech-in-noise recognition task than when performing a speech recognition task without additional noise. In the ASD group, this was the case only in the left, but not the right, IC. There was no interaction between noise and group.
The results show that communication signal processing in ASD is associated with reduced subcortical sensory functioning in the midbrain. The results highlight the importance of considering sensory processing alterations in explaining social communication difficulties, which are at the core of ASD.