Abstract: In this study, localization accuracy and sensitivity to acoustic interaural time differences (ITDs) in subjects using cochlear implants with combined electric-acoustic stimulation (EAS) were assessed and compared with the results of a normal-hearing control group. Methods: Eight CI users with EAS (2 bilaterally implanted, 6 unilaterally implanted) and symmetric binaural acoustic hearing and 24 normal-hearing subjects participated in the study. The first experiment determined mean localization error (MLE) for di…
“…We hypothesize that hearing asymmetries introduce localization biases toward the better-hearing ear (Figure 1b to d). This would suggest that the idiosyncratic localization abilities demonstrated in earlier studies (Dunn et al., 2010; Gifford Grantham et al., 2014; Dorman et al., 2016; Körtje et al., 2020), with errors ranging from near-normal (10 deg) to extremely poor (70 deg), may well depend on individual hearing asymmetries. Indeed, more symmetric hearing in bimodal EAS users correlates with the ability to process ITDs (Gifford Grantham et al., 2014) and to localize sounds (Loiselle et al., 2015).…”
Section: Introduction (mentioning)
confidence: 78%
“…Criteria for cochlear implantation have been expanded to allow adults with good low-frequency hearing thresholds and severe-to-profound high-frequency hearing loss in both ears to receive a single cochlear implant. The combined electric and acoustic stimulation (EAS) in the implanted ear and acoustic stimulation in the other provide these cochlear implant recipients with binaural low-frequency acoustic hearing, which has been shown to improve horizontal-plane sound localization (Dunn et al., 2010; Gifford et al., 2014; Plant & Babic, 2016; Körtje et al., 2020). However, these EAS listeners effectively have monaural (electrical) hearing for high frequencies and asymmetric (bimodal) hearing for mid frequencies.…”
Section: Introduction (mentioning)
confidence: 99%
“…The EAS listeners with low-frequency binaural hearing (Figure 1a, low-frequency region) can potentially localize sounds in the horizontal plane using interaural time differences (ITDs; Figure 1b; Dorman et al., 2013; Gifford et al., 2013; Gifford Grantham et al., 2014; Dorman et al., 2016; Gifford & Stecker, 2020; Körtje et al., 2020). However, typical EAS listeners have no high-frequency hearing in the non-implanted ear (Figure 1a, high frequencies), and so they lack access to the interaural level differences (ILDs) necessary for high-frequency horizontal sound localization (Blauert, 1996; Figure 1d).…”
Section: Introduction (mentioning)
confidence: 99%
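The low-frequency ITD cue invoked in the snippet above is often approximated with the classic Woodworth spherical-head model, ITD ≈ (r/c)(θ + sin θ). As a minimal illustrative sketch (not part of the cited studies; the head radius and speed of sound below are typical assumed values):

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Approximate ITD (seconds) for a rigid spherical head (Woodworth model).

    ITD = (r / c) * (theta + sin(theta)), with theta the source azimuth in
    radians. Valid for a distant source in the horizontal plane.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))

# ITD grows monotonically from 0 at the midline toward the side,
# reaching roughly 650-700 microseconds at 90 degrees for an adult head:
for az in (0, 30, 60, 90):
    print(f"{az:2d} deg -> {woodworth_itd(az) * 1e6:.0f} us")
```

This makes concrete why the cue is "low frequency": delays of a few hundred microseconds are only unambiguous when the signal period is longer than the largest possible ITD, i.e. below roughly 1.5 kHz.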
“…Furthermore, we hypothesize that, as hearing asymmetry depends on frequency (Figure 1a), sound localization by EAS listeners depends on the sound spectrum as well. Specifically, since bimodal EAS users with binaurally symmetric low-frequency hearing can rely on low-frequency ITDs (Gifford Grantham et al., 2014; Körtje et al., 2020; Figure 1a), they should be able to localize low-frequency sounds accurately (Figure 1b).…”
Many cochlear implant users with binaural residual (acoustic) hearing benefit from combining electric and acoustic stimulation (EAS) in the implanted ear with acoustic amplification in the other. These bimodal EAS listeners can potentially use low-frequency binaural cues to localize sounds. However, their hearing is generally asymmetric for mid- and high-frequency sounds, perturbing or even abolishing binaural cues. Here, we investigated the effect of a frequency-dependent binaural asymmetry in hearing thresholds on sound localization by seven bimodal EAS listeners. Frequency dependence was probed by presenting sounds with power in low-, mid-, high-, or mid-to-high-frequency bands. Frequency-dependent hearing asymmetry was present in the bimodal EAS listening condition (when using both devices) but was also induced by independently switching devices on or off. Using both devices, hearing was near symmetric for low frequencies, asymmetric for mid frequencies with better hearing thresholds in the implanted ear, and monaural for high frequencies with no hearing in the non-implanted ear. Results show that sound-localization performance was poor in general. Typically, localization was strongly biased toward the better-hearing ear, and hearing asymmetry was a good predictor of these biases. Notably, even when hearing was symmetric, a preferential bias toward the ear fitted with the hearing aid was revealed. We discuss how the frequency dependence of any hearing asymmetry may lead to binaural cues that are spatially inconsistent as the spectrum of a sound changes. We speculate that this inconsistency may prevent accurate sound localization even after long-term exposure to the hearing asymmetry.
“…The preservation of sensitivity to the binaural gap even when an interaural delay was introduced indicated that the acoustic temporal fine-structure features of the noise were maintained for the duration of the interaural interval [11,25,26], allowing a similarity computation between the binaural sound inputs. Thus, measuring the impact of interaural delay when the binaural gap is detected [27–29] can provide a way of investigating the transient memory of acoustic features.…”
Humans are able to detect an instantaneous change in interaural correlation, demonstrating an ability to temporally process extremely rapid changes in interaural configuration. This temporal dynamic is linked to listeners’ ability to store acoustic features in a transient auditory memory. The present study investigated whether this transient auditory storage of acoustic features is affected by interaural delay, assessed by measuring sensitivity for detecting an instantaneous change in correlation for both wideband and narrowband correlated noise at various interaural delays. It was further investigated whether an instantaneous change in correlation between correlated interaural narrowband or wideband noise remained detectable when the longest interaural delay was introduced. An auditory computational description model was then applied to explore the relationship between wideband and narrowband simulation noise with various center frequencies in the auditory processes underlying lower-level transient memory of acoustic features. The computational results indicate that low-frequency information dominated perception and remained distinguishable over longer delays than the high-frequency components, and that the longest interaural delay for narrowband noise signals was highly correlated with that for wideband noise signals in the dynamic process of auditory perception.
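The "similarity computation between the binaural sound inputs" that these studies probe can be illustrated with a toy normalized interaural cross-correlation: a delayed copy of a noise token remains perfectly correlated with the original at the matching lag, and essentially uncorrelated at other lags. The noise duration and delay below are arbitrary assumptions for illustration, not the stimuli of the cited study:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 44100
n = fs  # 1 s of wideband Gaussian noise

left = rng.standard_normal(n)
delay = 20  # samples (~0.45 ms interaural delay)
right = np.roll(left, delay)  # right ear receives a delayed copy

def interaural_correlation(l, r, lag):
    """Normalized correlation between the two ear signals at a given lag."""
    l_seg, r_seg = l[:len(l) - lag], r[lag:] if lag else r
    return np.corrcoef(l_seg[:len(r_seg)], r_seg[:len(l_seg)])[0, 1]

# Correlation is ~1 at the true interaural delay, ~0 at other lags:
print(interaural_correlation(left, right, delay))  # near 1.0
print(interaural_correlation(left, right, 0))      # near 0.0
```

A detector of an "instantaneous change in correlation" can then be thought of as tracking this quantity over short time windows and flagging a sudden drop, which is only possible if the fine structure of the earlier-arriving signal is still held in a transient store when the delayed signal arrives.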
The middle-ear system relies on a balance of mass and stiffness characteristics for transmitting sound from the external environment to the cochlea and auditory neural pathway. Phase is one aspect of sound that, when transmitted and encoded by both ears, contributes to binaural cue sensitivity and spatial hearing. The study aims were (i) to investigate the effects of middle-ear stiffness on the auditory brainstem neural encoding of phase in human adults with normal pure-tone thresholds and (ii) to investigate the relationships between middle-ear stiffness-induced changes in wideband acoustic immittance and neural encoding of phase. The auditory brainstem neural encoding of phase was measured using the auditory steady-state response (ASSR) with and without middle-ear stiffness elicited via contralateral activation of the middle-ear muscle reflex (MEMR). Middle-ear stiffness was quantified using a wideband acoustic immittance assay of acoustic absorbance. Statistical analyses demonstrated decreased ASSR phase lag and decreased acoustic absorbance with contralateral activation of the MEMR, consistent with increased middle-ear stiffness changing the auditory brainstem neural encoding of phase. There were no statistically significant correlations between stiffness-induced changes in wideband acoustic absorbance and ASSR phase. The findings of this study may have important implications for understanding binaural cue sensitivity and horizontal plane sound localization in audiologic and otologic clinical populations that demonstrate changes in middle-ear stiffness, including cochlear implant recipients who use combined electric and binaural acoustic hearing and otosclerosis patients.
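The ASSR phase measure discussed above amounts to reading off the phase of the recorded response at the stimulus modulation frequency. A toy sketch of that extraction on a synthetic signal (the 40 Hz rate, noise level, and `true_phase` are assumptions for illustration, not the study's recordings or analysis pipeline):

```python
import numpy as np

fs = 1000.0                      # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)    # 1 s epoch
f_mod = 40.0                     # assumed modulation rate (Hz)
true_phase = 0.6                 # hypothetical response phase (radians)

# Toy "steady-state response": sinusoid at f_mod plus background noise
rng = np.random.default_rng(1)
signal = np.cos(2 * np.pi * f_mod * t + true_phase) \
         + 0.1 * rng.standard_normal(t.size)

# Read the response phase from the FFT bin at the modulation frequency
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
bin_idx = int(np.argmin(np.abs(freqs - f_mod)))
measured_phase = np.angle(spectrum[bin_idx])  # close to true_phase
```

A stiffness-induced "decreased ASSR phase lag," as reported in the abstract, would appear in such an analysis as a shift in `measured_phase` between the conditions with and without contralateral MEMR activation.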