2021 · DOI: 10.1038/s41598-021-85802-5

Convergence of heteromodal lexical retrieval in the lateral prefrontal cortex

Abstract: Lexical retrieval requires selecting and retrieving the most appropriate word from the lexicon to express a desired concept. Few studies have probed lexical retrieval with tasks other than picture naming, and when non-picture naming lexical retrieval tasks have been applied, both convergent and divergent results emerged. The presence of a single construct for auditory and visual processes of lexical retrieval would influence cognitive rehabilitation strategies for patients with aphasia. In this study, we perfo…


Cited by 10 publications (15 citation statements) · References 80 publications (38 reference statements)
“…In natural speech, visual articulations typically occur before the onset of speech-related sounds (typically within 40–200 ms of speech onset (33)). Because of this pre-articulatory visual information, visemes evoked increased beta suppression beginning before the expected phoneme onset time (−250 to 0 ms) [t(5) = 2.78, p = .039, d = 1.13]. This is consistent with past research (15) and indicative of possible lateral (34) or feedback inputs (35–37) into the auditory system.…”
Section: Results (supporting)
confidence: 86%
“…Visual speech improves auditory speech perception during face-to-face conversations (1, 2). These benefits are strongest in noisy situations (3) and in individuals with hearing loss due to healthy aging (4), intrinsic brain tumor (5), stroke (6, 7), concussion (8, 9), or cochlear implants (10). However, there is limited understanding of how the brain enables vision to facilitate hearing processes.…”
Section: Introduction (mentioning)
confidence: 99%
“…A variety of different methods have previously been used to isolate brain regions essential to naming, including lesion-deficit mapping [9–13], functional imaging [14–16], and noninvasive electroencephalography (EEG) [17]. More recently, studies using invasive electrocorticography (ECoG) have yielded a more precise delineation of the neurophysiological basis of word production.…”
Section: Introduction (mentioning)
confidence: 99%
“…The main factor contributing to this finding could be an influence of the left frontal DLGG on the WM fibers. It has been established that the IFOF, SLF-II and AF play a crucial role in language function [34,35]. Particularly, according to the ‘dual stream’ theory of language processing, the IFOF, ILF and UF integrate information within the ventral stream, where semantic and syntactic analyses predominate, whereas the SLF and AF belong to the dorsal stream, where phonological and articulatory processing occur [36].…”
Section: Discussion (mentioning)
confidence: 99%