2012
DOI: 10.1121/1.3664101
Comparing the effects of reverberation and of noise on speech recognition in simulated electric-acoustic listening

Abstract: Cochlear implant users report difficulty understanding speech in both noisy and reverberant environments. Electric-acoustic stimulation (EAS) is known to improve speech intelligibility in noise. However, little is known about the potential benefits of EAS in reverberation, or about how such benefits relate to those observed in noise. The present study used EAS simulations to examine these questions. Sentences were convolved with impulse responses from a model of a room whose estimated reverberation times were …

Cited by 22 publications (19 citation statements)
References 71 publications
“…The lack of spectral smearing for acoustic stimuli with bimodal simulations is consistent with other studies examining bimodal hearing for normal-hearing individuals (Brown and Bacon 2009a, 2009b; Chang et al 2006; Helms Tillery et al 2012; Kong and Carlyon 2007; Qin et al 2006). For example, Kong and Carlyon (2007) tested sentence recognition at a +5 dB SNR and found an average bimodal benefit of approximately 25 percentage points.…”
Section: Discussion (supporting)
confidence: 89%
“…Syllable identification depends more on the initial than on the final. Moreover, in the time domain, reverberation smears the time gap between syllables, which also reduces speech intelligibility (Tillery et al, 2012). When reverberation and noise are combined, especially with a long RT, reverberation not only strongly masks the initial, final, and tone, but also raises the noise level, which increases the masking effect of noise on the initial, final, and tone.…”
Section: Results (mentioning)
confidence: 99%
“…It is now widely accepted that electric-acoustic stimulation (EAS) in the form of bimodal hearing [cochlear implant (CI) supplemented by low-frequency acoustic hearing in the contralateral ear] or hybrid hearing (CI supplemented by low-frequency acoustic hearing preserved postoperatively in the same ear) has the potential to enhance speech understanding relative to a CI alone (see reviews by Gifford, 2008, 2010). This is especially true when the target speech signal occurs in the presence of competing maskers or background noise (e.g., real EAS users: Kong et al, 2005;Zhang et al, 2010;Carroll et al, 2011;Visram et al, 2012a,b;Neuman and Svirsky, 2013; simulated EAS listeners: Qin and Oxenham, 2006;Li and Loizou, 2008;Tillery et al, 2012).…”
(mentioning)
confidence: 99%