Many behavioral measures of visual perception fluctuate continually in a rhythmic manner, reflecting the influence of endogenous brain oscillations, particularly theta (∼4-7 Hz) and alpha (∼8-12 Hz) rhythms [1-3]. However, it is unclear whether these oscillations are unique to vision or whether auditory performance also oscillates [4, 5]. Several studies report no oscillatory modulation in audition [6, 7], while those with positive findings suffer from confounds relating to neural entrainment [8-10]. Here, we used a bilateral pitch-identification task to investigate rhythmic fluctuations in auditory performance separately for the two ears and applied signal detection theory (SDT) to test for oscillations of both sensitivity and criterion (changes in decision boundary) [11, 12]. Using uncorrelated dichotic white noise to induce a phase reset of oscillations, we demonstrate that, as with vision, both auditory sensitivity and criterion showed strong oscillations over time, at different frequencies: ∼6 Hz (theta range) for sensitivity and ∼8 Hz (low alpha range) for criterion, implying distinct underlying sampling mechanisms [13]. The modulation in sensitivity in left and right ears was in antiphase, suggestive of attention-like mechanisms sampling alternately from the two ears.
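The SDT decomposition referenced above separates how well an observer discriminates signal from noise (sensitivity, d′) from where they place their decision boundary (criterion, c). A minimal sketch of the standard computation from hit and false-alarm rates, using only the Python standard library (the function name and example rates are illustrative, not taken from the study):

```python
from statistics import NormalDist

def sdt_measures(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Compute SDT sensitivity (d') and criterion (c) from hit and false-alarm rates."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF (z-transform)
    z_hit, z_fa = z(hit_rate), z(fa_rate)
    d_prime = z_hit - z_fa              # sensitivity: separation of signal and noise distributions
    criterion = -0.5 * (z_hit + z_fa)   # criterion: placement of the decision boundary
    return d_prime, criterion

# Example: 84% hits and 16% false alarms give d' ≈ 2.0 and c ≈ 0 (unbiased observer)
d, c = sdt_measures(0.84, 0.16)
```

Tracking d′ and c separately as a function of time after the noise-induced phase reset is what allows sensitivity and criterion oscillations to be dissociated, as the abstract describes.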
Recent work from several groups has shown that perception of various visual attributes in human observers at a given moment is biased toward what was recently seen. This positive serial dependency is a kind of temporal averaging that exploits short-term correlations in visual scenes to reduce noise and stabilize perception. To date, this stabilizing "continuity field" has been demonstrated on stable visual attributes such as orientation and face identity, yet it would be counterproductive to apply it to dynamic attributes for which sensitivity to change is needed. Here, we tested this using motion direction discrimination and predicted a negative perceptual dependency: a contrastive relationship that enhances sensitivity to change. Surprisingly, our data showed a cubic-like pattern of dependencies with positive and negative components. By interleaving various stimulus combinations, we separated the components and isolated a positive perceptual dependency for motion and a negative dependency for orientation. A weighted linear sum of the separate dependencies described the original cubic pattern well. The positive dependency for motion shows an integrative perceptual effect and was unexpected, although it is consistent with work on motion priming. These findings suggest that a perception-stabilizing continuity field operates pervasively, even when it obscures sensitivity to dynamic stimuli.
The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms to explore the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as in entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves the rapid deconvolution of the relative change in the locations of sound sources produced by rotations and translations of the head in space (self-motion) to enable the perception of actual source motion. The fact that we perceive our auditory world to be stable despite almost continual movement of the head demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception.
Studies of spatial perception during visual saccades have demonstrated compressions of visual space around the saccade target. Here we psychophysically investigated perception of auditory space during rapid head turns, focusing on the "perisaccadic" interval. Using separate perceptual and behavioral response measures, we show that spatial compression also occurs for rapid head movements, with the auditory spatial representation compressing by up to 50%. Similar to observations in the visual system, this occurred only when spatial locations were measured by using a perceptual response; it was absent for the behavioral measure involving a nose-pointing task. These findings parallel those observed in vision during saccades and suggest that a common neural mechanism may subserve these distortions of space in each modality.
Keywords: action and perception | auditory localization | head motion | saccades | spatial perception
Thus, the LDN group showed deficits in attention switching and inhibitory control, whereas only a subset of these participants demonstrated an additional frequency resolution deficit.