2022
DOI: 10.1101/2022.11.27.518089
Preprint
Parametric information about eye movements is sent to the ears

Abstract: Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with simultaneous eye movements…

Cited by 7 publications (30 citation statements)
References 74 publications
“…The latencies of the first two peaks of the EMREOs occurred after 14.9 ± 3.4 ms and 21.0 ± 7.9 ms (mean ± SD). The vertical saccades (performed only in experiment 1) reveal a similar albeit less clear relationship between EMREO phase and saccade direction, as expected based on previous reports (Gruters et al., 2018; Murphy et al., 2020; Lovich et al., 2022). The scaling of the EMREO with saccade direction and amplitude could pertain to the instructed saccade vector that was indicated on the screen, or could directly pertain to the actually executed movement.…”
Section: EMREOs Are Shaped By Saccade Target Rather Than Saccade Vari… (supporting)
confidence: 79%
“…The eye-movement related eardrum oscillations (EMREOs) may be a signature of such an active hearing process. However, the precise neurobiological substrate generating the EMREO and the functional implications of these for hearing remain unclear (Gruters et al., 2018; Murphy et al., 2020; Lovich et al., 2022; King et al., 2023). Our data corroborate that the eardrum moves consistently and specifically in relation to saccadic eye movements.…”
Section: Discussion (mentioning)
confidence: 99%
“…In complex, naturalistic environments, eye movements could aid the auditory system in unraveling the vastly overlapping spectrotemporal information that reaches the ears. Recent evidence in humans suggests that eye movements contribute to the computation of sound locations in relation to the visual scene at the very first stages of sound processing (Lovich et al., 2022; Murphy et al., 2022). Similar studies with monkeys and cats suggest a midbrain hub of inferior and superior colliculus (IC, SC) that affects auditory processing based on eye positions (e.g.…
Section: Discussion (mentioning)
confidence: 99%
“…Bulkin & Groh, 2012; Lee & Groh, 2012; Porter et al., 2007). This circuit has recently been extended to the auditory periphery in humans (Lovich et al., 2022; Murphy et al., 2022). Accordingly, several studies in humans point towards interactions between eye movements and auditory cognition in sound localization (Getzmann, 2002), spatial discrimination (Maddox et al., 2014), and spatial attention (Pomper & Chait, 2017), with lateralized engagement of the posterior parietal cortex in unison with lateralized gaze direction (Popov et al., 2022).…”
Section: Introduction (mentioning)
confidence: 99%