2017
DOI: 10.1002/hbm.23515
Auditory and audio–visual processing in patients with cochlear, auditory brainstem, and auditory midbrain implants: An EEG study

Abstract: There is substantial variability in speech recognition ability across patients with cochlear implants (CIs), auditory brainstem implants (ABIs), and auditory midbrain implants (AMIs). To better understand how this variability is related to central processing differences, the current electroencephalography (EEG) study compared hearing abilities and auditory-cortex activation in patients with electrical stimulation at different sites of the auditory pathway. Three different groups of patients with auditory impla…

Cited by 21 publications (21 citation statements)
References 107 publications
“…In total, 10 CI users were stimulated on the left and 11 on the right side. Similar to previous studies (e.g., Sandmann et al., 2015; Schierholz et al., 2017), the participants used a 7-point loudness-rating scale, which allowed adjusting the perceived loudness of the sentences to a comfortable level, equivalent to 60–70 dB (Allen et al., 1990; Zeng, 1994).…”
Section: Methods (mentioning; confidence: 99%)
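For illustration only, the Python sketch below shows one way such a 7-point loudness-rating procedure could be scripted: the presentation level is raised or lowered until the listener's rating reaches the "comfortable" category. The category labels, 2 dB step size, starting level, and the get_rating callback are assumptions made for this sketch, not parameters reported in Sandmann et al. (2015) or Schierholz et al. (2017).

# Minimal sketch (hypothetical parameters): adjust presentation level using a
# 7-point loudness-rating scale until the listener reports "comfortable".
# The cited studies only state that a 7-point scale was used and that the
# target was a comfortable level, roughly 60-70 dB; the values below are
# illustrative assumptions.

CATEGORIES = {
    1: "inaudible", 2: "very soft", 3: "soft", 4: "comfortable but soft",
    5: "comfortable", 6: "loud but OK", 7: "too loud",
}
TARGET = 5        # rating corresponding to "comfortable"
STEP_DB = 2.0     # assumed level change per adjustment step
START_DB = 60.0   # assumed starting presentation level

def adjust_level(get_rating, level_db=START_DB, max_trials=20):
    """Raise or lower the level until the rating reaches the target category."""
    for _ in range(max_trials):
        rating = get_rating(level_db)      # present stimulus, collect 1-7 rating
        if rating == TARGET:
            return level_db                # comfortable level found
        level_db += STEP_DB if rating < TARGET else -STEP_DB
    return level_db                        # fall back to last tested level

if __name__ == "__main__":
    # Simulated listener who finds 60-70 dB comfortable.
    simulated = lambda db: 5 if 60 <= db <= 70 else (4 if db < 60 else 6)
    print(adjust_level(simulated), CATEGORIES[TARGET])

In an actual experiment the rating callback would present the sentence through the implant's audio input and log the participant's button press; here it is only a placeholder.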
“…The engagement of occipital areas might be related to the fact that although sentences are presented purely auditorily, they might be internally visualized. Furthermore, it has been previously reported that CI users show an enhanced audiovisual coupling (Schierholz et al., 2015, 2017; Strelnikov et al., 2015b) and that activation of the visual cortex by auditory stimulation is positively related to CI performance (Giraud et al., 2001b,c; Strelnikov et al., 2013; Chen et al., 2016), indicating that cross-modal reorganization in the visual cortex and enhanced audiovisual coupling support speech processing in CI users.…”
Section: ERPs and Their Relationship With Brain Activation Detected W… (mentioning; confidence: 96%)
“…First, with regard to visual speech cues, more and more auditory-visual speech tests have been developed that include virtual talking heads or video recordings of real humans (e.g., Schierholz et al., 2017; Schreitmüller et al., 2018). Also, both virtual and real visual cues have successfully been included in dual-task studies investigating effortful listening (Fraser, 2010; Gosselin & Gagné, 2011; Stevens et al., 2013; Desjardins, 2016).…”
Section: Introduction (mentioning; confidence: 99%)
“…Previous studies using unisensory auditory stimuli have reported that N1, N2 and P3a/P3b ERPs of CI users are reduced in amplitude and/or prolonged in latency when compared to NH listeners, suggesting difficulties in the sensory (N1) and the higher-level cognitive processing (N2 and P3a/P3b) of the limited CI input (Finke et al., 2016, Finke et al., 2015, Henkin et al., 2014, Henkin et al., 2009, Sandmann et al., 2009). ERP differences between CI users and NH listeners have also been reported in studies using basic audio-visual stimuli, pointing to an enhanced visual modulation of auditory ERPs in elderly CI users when compared to NH listeners (Schierholz et al., 2017, Stropahl et al., 2015). Based on these ERP results and on the observation of a multisensory facilitation effect in NH listeners, we predicted that implanted individuals show a significant audio-visual facilitation for auditory object recognition as well.…”
Section: Introduction (mentioning; confidence: 65%)
“…These adaptations may also affect interactions between the different sensory modalities (Lomber et al., 2010, Rouger et al., 2007, Schierholz et al., 2015, Schorr et al., 2005, Strelnikov et al., 2015). In particular, post-lingually deaf CI users show cross-modal reorganization in the auditory cortex (Rouger et al., 2012, Sandmann et al., 2012) and reveal stronger audio-visual interactions in the auditory cortex when compared with normal-hearing listeners (Schierholz et al., 2017). These cortical alterations in CI users may allow better lip-reading abilities and enhanced audio-visual integration skills (Stropahl et al., 2017, Stropahl et al., 2015), fostering speech-comprehension recovery after cochlear implantation (Strelnikov et al., 2013).…”
Section: Introduction (mentioning; confidence: 99%)