Online experimental platforms can be used as an alternative or complement to lab-based research. However, when conducting auditory experiments via online methods, the researcher has limited control over the participants’ listening environment. We offer a new method to probe one aspect of that environment: headphone use. Headphones not only provide better control of sound presentation but can also “shield” the listener from background noise. Here we present a rapid (< 3 min) headphone screening test based on Huggins Pitch (HP), a perceptual phenomenon that can only be detected when stimuli are presented dichotically. We validate this test using a cohort of “Trusted” online participants who completed the test using both headphones and loudspeakers. The same participants were also used to test an existing headphone test (AP test; Woods et al., 2017, Attention, Perception, & Psychophysics). We demonstrate that, compared to the AP test, the HP test has higher selectivity for headphone users, rendering it a compelling alternative to existing methods. Overall, the new HP test correctly detects 80% of headphone users and has a false-positive rate of 20%. Moreover, we demonstrate that combining the HP test with an additional test (either the AP test or an alternative based on a beat test, BT) can lower the false-positive rate to ~7%. This should be useful in situations where headphone use is particularly critical (e.g., dichotic or spatial manipulations). Code for implementing the new tests is publicly available in JavaScript and through Gorilla (gorilla.sc).
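The core of a Huggins Pitch stimulus is simple to describe: both ears receive the same broadband noise, except that in a narrow band around a target frequency the phase in one ear is inverted. Neither ear alone carries a pitch cue, so the faint pitch is audible only under dichotic (headphone) presentation. The following is a minimal sketch of that construction; the sample rate, target frequency, and bandwidth are illustrative values, not the parameters used in the published test.

```python
import numpy as np

def huggins_pitch(fs=44100, dur=1.0, f0=600.0, bw_frac=0.06, seed=0):
    """Generate a two-channel Huggins Pitch stimulus.

    Left channel: white noise. Right channel: the same noise with a
    180-degree phase shift applied in a narrow band around f0. The
    pitch at f0 emerges only when the channels are heard dichotically.
    """
    rng = np.random.default_rng(seed)
    n = int(fs * dur)
    noise = rng.standard_normal(n)

    spec = np.fft.rfft(noise)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    # Invert the phase in a narrow band (here ~6%) centred on f0
    band = (freqs > f0 * (1 - bw_frac / 2)) & (freqs < f0 * (1 + bw_frac / 2))
    spec_right = spec.copy()
    spec_right[band] *= -1  # multiplying by -1 flips the phase by 180 degrees

    left = noise
    right = np.fft.irfft(spec_right, n=n)

    # Normalize jointly so neither channel clips
    peak = max(np.abs(left).max(), np.abs(right).max())
    return np.stack([left, right], axis=1) / peak
```

Because only a tiny fraction of the spectrum is phase-inverted, the two channels remain almost perfectly correlated, which is why the stimulus sounds like plain noise over loudspeakers, where the channels mix acoustically.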
The ability to track the statistics of our surroundings is a key computational challenge. A prominent theory proposes that the brain monitors for unexpected uncertainty: events that deviate substantially from model predictions, indicating model failure. Norepinephrine is thought to play a key role in this process by serving as an interrupt signal, initiating model-resetting. However, the existing evidence comes from paradigms in which participants actively monitored stimulus statistics. To determine whether norepinephrine routinely reports the statistical structure of our surroundings, even when not behaviourally relevant, we used rapid tone-pip sequences that contained salient pattern changes associated with abrupt structural violations vs. the emergence of regular structure. Phasic pupil dilation responses (PDRs) were monitored to assess norepinephrine activity. We reveal a remarkable specificity: when not behaviourally relevant, only abrupt structural violations evoke a PDR. The results demonstrate that norepinephrine tracks unexpected uncertainty on the rapid time scales relevant to sensory signals.
Recent studies indicate that COVID-19 infection can lead to serious neurological consequences in a small percentage of individuals. However, in the months following acute illness, many more suffer from fatigue, low motivation, disturbed mood, poor sleep and cognitive symptoms, colloquially referred to as ‘brain fog’. But what about individuals who had asymptomatic to moderate COVID-19 and reported no lingering concerns after recovery? Here, we examined a wide range of cognitive functions critical for daily life (including sustained attention, memory, motor control, planning, semantic reasoning, mental rotation and spatial–visual attention) in people who had previously suffered from COVID-19 but were not significantly different from a control group on self-reported fatigue, forgetfulness, sleep abnormality, motivation, depression, anxiety and personality profile. Reassuringly, COVID-19 survivors performed well in most abilities tested, including working memory, executive function, planning and mental rotation. However, they displayed significantly worse episodic memory (up to 6 months post-infection) and greater decline in vigilance with time on task (for up to 9 months). Overall, the results show that specific chronic cognitive changes following COVID-19 are evident on objective testing even amongst those who do not report a greater symptom burden. Importantly, in the sample tested here, these were not significantly different from normal after 6–9 months, demonstrating evidence of recovery over time.
The ability to sustain attention on a task-relevant sound source while avoiding distraction from concurrent sounds is fundamental to listening in crowded environments. We aimed to (a) devise an experimental paradigm with which this aspect of listening can be isolated and (b) evaluate the applicability of pupillometry as an objective measure of sustained attention in young and older populations. We designed a paradigm that continuously measured behavioral responses and pupillometry during 25-s trials. Stimuli contained a number of concurrent, spectrally distinct tone streams. On each trial, participants detected gaps in one of the streams while resisting distraction from the others. Behavior demonstrated increasing difficulty with time-on-task and with number/proximity of distractor streams. In young listeners (N = 20; aged 18 to 35 years), pupil diameter (on the group and individual level) was dynamically modulated by instantaneous task difficulty: Periods where behavioral performance revealed a strain on sustained attention were accompanied by increased pupil diameter. Only trials on which participants performed successfully were included in the pupillometry analysis so that the observed effects reflect task demands as opposed to failure to attend. In line with existing reports, we observed global changes to pupil dynamics in the older group (N = 19; aged 63 to 79 years) including decreased pupil diameter, limited dilation range, and reduced temporal variability. However, despite these changes, older listeners showed similar effects of attentive tracking to those observed in the young listeners. Overall, our results demonstrate that pupillometry can be a reliable and time-sensitive measure of attentive tracking over long durations in both young and (with caveats) older listeners.
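The stimulus logic described above (several concurrent, spectrally distinct tone streams, with silent gaps inserted only in the to-be-attended stream) can be sketched in a few lines. The frequencies, trial duration, and gap timings below are illustrative placeholders, not the values used in the study.

```python
import numpy as np

def gap_stream_trial(fs=16000, dur=5.0, target_f=1000.0,
                     distractor_freqs=(350.0, 2800.0),
                     gap_times=(1.5, 3.2), gap_dur=0.15):
    """Sketch of a sustained-attention trial: a target tone stream with
    silent gaps to be detected, mixed with spectrally distant distractor
    streams. Returns the normalized mono mixture."""
    n = int(fs * dur)
    t = np.arange(n) / fs

    target = np.sin(2 * np.pi * target_f * t)
    for g in gap_times:  # carve silent gaps into the target stream only
        i0, i1 = int(g * fs), int((g + gap_dur) * fs)
        target[i0:i1] = 0.0

    # Distractor streams at spectrally distinct frequencies, no gaps
    distractors = sum(np.sin(2 * np.pi * f * t) for f in distractor_freqs)

    mix = target + distractors
    return mix / np.abs(mix).max()
```

Task difficulty can then be manipulated exactly as the abstract describes: adding more distractor streams, or moving their frequencies closer to the target, makes the gaps harder to detect.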
Despite the prevalent use of alerting sounds in alarms and human–machine interface systems and the long-hypothesized role of the auditory system as the brain's “early warning system,” we have only a rudimentary understanding of what determines auditory salience—the automatic attraction of attention by sound—and which brain mechanisms underlie this process. A major roadblock has been the lack of a robust, objective means of quantifying sound-driven attentional capture. Here we demonstrate that: (1) a reliable salience scale can be obtained from crowd-sourcing (N = 911), (2) acoustic roughness appears to be a driving feature behind this scaling, consistent with previous reports implicating roughness in the perceptual distinctiveness of sounds, and (3) crowd-sourced auditory salience correlates with objective autonomic measures. Specifically, we show that a salience ranking obtained from online raters correlated robustly with the superior colliculus-mediated ocular freezing response, microsaccadic inhibition (MSI), measured in naive, passively listening human participants (of either sex). More salient sounds evoked earlier and larger MSI, consistent with a faster orienting response. These results are consistent with the hypothesis that MSI reflects a general reorienting response that is evoked by potentially behaviorally important events regardless of their modality.

SIGNIFICANCE STATEMENT: Microsaccades are small, rapid, fixational eye movements that are measurable with sensitive eye-tracking equipment. We reveal a novel, robust link between microsaccade dynamics and the subjective salience of brief sounds (salience rankings obtained from a large number of participants in an online experiment): Within 300 ms of sound onset, the eyes of naive, passively listening participants demonstrate different microsaccade patterns as a function of the sound's crowd-sourced salience.
These results position the superior colliculus (hypothesized to underlie microsaccade generation) as an important brain area to investigate in the context of a putative multimodal salience hub. They also demonstrate an objective means for quantifying auditory salience.
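Measuring microsaccadic inhibition presupposes detecting microsaccades in the gaze trace in the first place. A standard approach is a velocity-threshold detector in the spirit of Engbert & Kliegl (2003): estimate eye velocity with a smoothed derivative, set an elliptical threshold at a multiple of the median-based velocity noise, and keep supra-threshold runs above a minimum duration. The sketch below assumes gaze in degrees sampled at 1000 Hz; it is a simplified illustration, not the analysis pipeline used in the study.

```python
import numpy as np

def detect_microsaccades(x, y, fs=1000.0, lam=6.0, min_dur=6):
    """Velocity-threshold microsaccade detector (Engbert & Kliegl style).

    x, y : horizontal/vertical gaze position in degrees.
    Returns (onset, offset) sample indices of candidate microsaccades,
    expressed in the coordinates of the velocity trace (offset by 2
    samples from the raw position trace).
    """
    # Smoothed velocity via a 5-point central difference
    vx = fs * (x[4:] - x[:-4] + x[3:-1] - x[1:-3]) / 6.0
    vy = fs * (y[4:] - y[:-4] + y[3:-1] - y[1:-3]) / 6.0

    # Median-based (robust) estimate of the velocity noise per axis
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)

    # Elliptical threshold at lam times the noise level
    above = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0

    # Group supra-threshold samples into events of minimum duration
    events, start = [], None
    for i, a in enumerate(above):
        if a and start is None:
            start = i
        elif not a and start is not None:
            if i - start >= min_dur:
                events.append((start, i))
            start = None
    if start is not None and len(above) - start >= min_dur:
        events.append((start, len(above)))
    return events
```

MSI itself then falls out of the event list: counting detected microsaccades in sliding windows time-locked to sound onset yields a rate curve whose transient dip indexes the inhibition.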