2023
DOI: 10.1016/j.neubiorev.2023.105130
Audiovisual multisensory integration in individuals with reading and language impairments: A systematic review and meta-analysis

Cited by 5 publications (4 citation statements)
References 87 publications
“…At school, audition and vision are often paired and contribute to various learning processes, in particular regarding language and reading. Indeed, poorer readers exhibit impaired audiovisual integration [46, 76, 77]. In addition, multisensory integration also interacts with memory functions [12, 78].…”
Section: Discussion
confidence: 99%
“…There is indeed evidence that individuals with DD do not always benefit from lipread speech to the same extent as typical readers do, especially when auditory speech is difficult to understand (de Gelder and Vroomen, 1998; Hahn et al., 2014; Ramirez and Mann, 2005; Rüsseler et al., 2015, 2018; Schaadt et al., 2019; van Laarhoven et al., 2018). However, the finding that dyslexics rely less on lip-read information is not always observed (for a recent review, see Pulliam et al., 2023). Here, we examined a so far untested function of lip-read speech, namely in guiding adaptation to acoustically distorted speech.…”
Section: Introduction
confidence: 99%
“…The framework suggested by their results presents two distinct systems, one for non-linguistic and another for linguistic semantic information, with the first possibly accessing modality-specific semantic representations and the second accessing amodal or multisensory semantic representations. Other attempts were limited to individuals with reading and language impairments, aiming to associate differences in audiovisual integration with neurodiversity, without, however, identifying the nature of these differences [12]. Thus, the impact of complex auditory language processing on crossmodal integration remains an open question.…”
Section: Introduction
confidence: 99%