Summary

Data analysis workflows in many scientific domains have become increasingly complex and flexible. To assess the impact of this flexibility on functional magnetic resonance imaging (fMRI) results, the same dataset was independently analyzed by 70 teams, testing nine ex-ante hypotheses. The flexibility of analytic approaches is exemplified by the fact that no two teams chose identical workflows to analyze the data. This flexibility resulted in sizeable variation in hypothesis test results, even for teams whose statistical maps were highly correlated at intermediate stages of their analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Importantly, meta-analytic approaches that aggregated information across teams yielded significant consensus in activated regions across teams. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset. Our findings show that analytic flexibility can have substantial effects on scientific conclusions, and identify factors related to variability in fMRI data analysis. The results emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for multiple analyses of the same data. Potential approaches to mitigate issues related to analytical variability are discussed.
Although listeners use both auditory and visual cues during speech perception, the cognitive and neural bases for their integration remain a matter of debate. One common approach to measuring multisensory integration is to use McGurk tasks, in which discrepant auditory and visual cues produce auditory percepts that differ from those based solely on unimodal input. Not all listeners show the same degree of susceptibility to the McGurk illusion, and these individual differences in susceptibility are frequently used as a measure of audiovisual integration ability. However, despite their popularity, we argue that McGurk tasks are ill-suited for studying the kind of multisensory speech perception that occurs in real life: McGurk stimuli are often based on isolated syllables (which are rare in conversations) and necessarily rely on audiovisual incongruence that does not occur naturally. Furthermore, recent data show that susceptibility on McGurk tasks does not correlate with performance during natural audiovisual speech perception. Although the McGurk effect is a fascinating illusion, truly understanding the combined use of auditory and visual information during speech perception requires tasks that more closely resemble everyday communication.
Understanding spoken language requires transmission of the acoustic signal up the ascending auditory pathway. However, in many cases speech understanding also relies on cognitive processes that act on the acoustic signal. One situation in which cognitive processing is particularly striking during speech comprehension is when the acoustic signal is made more challenging, which might happen due to background noise, talker characteristics, or hearing loss. This chapter focuses on the interaction between hearing and cognition in hearing loss and aging. The chapter begins with a review of common age-related changes in hearing and cognition, followed by a summary of evidence from behavioral, pupillometric, and neuroimaging paradigms that elucidate the interplay between hearing ability and cognition. Across a variety of experimental paradigms, there is compelling evidence that when listeners process acoustically challenging speech, additional cognitive processing is required compared to acoustically clear speech. This increase in cognitive processing is associated with specific brain networks, with the clearest evidence implicating the cingulo-opercular and executive attention networks and prefrontal cortex. Individual differences in hearing and cognitive ability thus determine the cognitive demand faced by a particular listener, and the cognitive and neural resources needed to aid in speech perception.
Changes in sensory systems are common as we get older, and become more likely with increasing age. In the auditory system, age-related changes are seen in domains such as auditory sensitivity, temporal processing, and spatial localization, which have significant effects on speech understanding. In vision, age-related changes are seen in contrast sensitivity, scotopic processing, and visual processing speed, which have consequences for activities such as reading and driving. In addition to hearing and vision, aging is associated with changes in smell, taste, and balance. Beyond simple perceptual processing, age-related sensory changes can increase cognitive demands, requiring greater involvement of domain-general cognitive processes during perception that reduce resources available for other operations. Capturing individual variability in sensory changes and their consequences is an important part of understanding normal and pathological aging.