Introduction: Despite the substantial benefits of cochlear implantation (CI), speech recognition outcomes vary widely, for reasons that are not fully understood. The group of low-performing CI users in particular is under-researched. Because of the limited perceptual quality of the transmitted signal, top-down mechanisms play an important role in decoding the speech delivered by the CI. Differences in cognitive functioning and linguistic skills may therefore explain speech outcomes in these CI subjects. Material and Methods: Fifteen post-lingually deaf CI recipients with a maximum speech perception of 30% in the Freiburger monosyllabic test (low performers = LP) underwent visually presented neurocognitive and linguistic test batteries assessing attention, memory, inhibition, working memory, lexical access, phonological input, and automatic naming. Nineteen high performers (HP) with a speech perception of more than 70% were included as controls. Pairwise comparisons of the two extreme groups and a discriminant analysis were carried out. Results: Significant differences were found between LP and HP in phonological input lexicon and word retrieval (p = 0.0039**). HP were faster in lexical access (p = 0.017*) and distinguished more reliably between non-existing and existing words (p = 0.0021**). Furthermore, HP outperformed LP in neurocognitive subtests, most prominently in attention (p = 0.003**). LP and HP were discriminated primarily by linguistic performance and to a smaller extent by cognitive functioning (canonical r = 0.68, p = 0.0075). Poor rapid automatic naming of numbers discriminated LP from HP CI users in 91.7% of cases. Conclusion: Severe phonologically based deficits in fast automatic speech processing contribute significantly to distinguishing LP from HP CI users. Cognitive functions might partially help to overcome these difficulties.
Introduction: Hearing loss has a great impact on the people affected, their close partners, and the interaction between them, as oral communication is restricted. Nonverbal communication, which expresses emotions and carries implicit information about interpersonal relationships, has rarely been studied in people with hearing impairment (PHI). In psychological settings, nonverbal synchrony of body movements in dyads is a reliable method for studying interpersonal relationships. Material and Methods: A 10-min social interaction was video-recorded in 39 PHI (29 spouse and 10 parent-child dyads) and their significant others (SOs). Nonverbal synchrony, i.e., the coordinated nonverbal behavior of two interacting persons (referring to both general synchrony and the role of leading), and verbal interaction (percentage of speech, frequency of repetitions, and queries) were analyzed by computer algorithms and observer ratings. Hearing-related quality of life, coping mechanisms, general psychopathology, quality of relationship, and the burden of hearing loss experienced by SOs were assessed using questionnaires. Results: In the 39 dyads, true nonverbal synchrony differed from pseudosynchrony [t(43.4) = 2.41; p = 0.02] with a medium effect size (d = 0.42). Gender of PHI had a significant effect on general synchrony (p = 0.025) and on leading by SOs (p = 0.017). Age gap correlated with synchronous movements (p = 0.047). A very short duration of hearing impairment was associated with lower nonverbal synchrony in the role of leading by SOs (p = 0.031). Feeling of closeness reported by PHI correlated negatively with the role of leading by SOs (p < 0.001), and feeling of closeness reported by SOs was positively associated with leading by PHI (p = 0.015). No correlation was detected between nonverbal synchrony and the other questionnaires.
Burden experienced by SOs was higher in those who reported less closeness (p = 0.014). Discussion: A longer duration of hearing impairment leads to more nonverbal leading by SOs compared with PHI with a very short duration of hearing loss, possibly because of the long-standing imbalance in communication. If PHI felt more closeness, SOs led less, and vice versa. Burden experienced by SOs correlated negatively with closeness reported by SOs. Use of nonverbal signals and communication might help improve the benefits of auditory rehabilitation for PHI and decrease the burden experienced by SOs.