2023
DOI: 10.1038/s41598-022-27029-6
Feature-rich multiplex lexical networks reveal mental strategies of early language learning

Abstract: Knowledge in the human mind exhibits a dualistic vector/network nature. Modelling words as vectors is key to natural language processing, whereas networks of word associations can map the nature of semantic memory. We reconcile these paradigms—fragmented across linguistics, psychology and computer science—by introducing FEature-Rich MUltiplex LEXical (FERMULEX) networks. This novel framework merges structural similarities in networks and vector features of words, which can be combined or explored independently…
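The abstract describes a network whose words carry both link structure (possibly across several layers) and vector-style features. A minimal sketch of that idea follows; this is not the authors' FERMULEX code, and every word, layer, edge and feature value below is an invented example:

```python
# Illustrative sketch (not the authors' implementation) of a tiny
# feature-rich multiplex lexical network built from plain dictionaries.

def make_layer(edges):
    """Build an undirected adjacency map (word -> set of neighbours)."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj

# Two hypothetical layers of the multiplex: free associations
# and phonological similarity.
layers = {
    "associations": make_layer([("dog", "cat"), ("dog", "bone")]),
    "phonology": make_layer([("dog", "log"), ("cat", "hat")]),
}

# Vector-style node features (frequency, length, part of speech);
# the numbers are made up for illustration.
features = {
    "dog": {"frequency": 1200, "length": 3, "pos": "noun"},
    "cat": {"frequency": 1100, "length": 3, "pos": "noun"},
    "bone": {"frequency": 300, "length": 4, "pos": "noun"},
    "log": {"frequency": 250, "length": 3, "pos": "noun"},
    "hat": {"frequency": 400, "length": 3, "pos": "noun"},
}

def multiplex_degree(word):
    """Structure alone: a word's neighbour count summed over all layers."""
    return sum(len(layer.get(word, ())) for layer in layers.values())

def frequent_neighbours(word, min_freq):
    """Structure + features combined: neighbours above a frequency cutoff."""
    neigh = set().union(*(layer.get(word, set()) for layer in layers.values()))
    return sorted(n for n in neigh if features[n]["frequency"] >= min_freq)
```

The two query functions mirror the framework's central point: the structural information (`multiplex_degree`) can be examined on its own, or crossed with the feature vectors (`frequent_neighbours`) to ask questions neither representation supports alone.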

Cited by 10 publications (4 citation statements)
References 68 publications (144 reference statements)
“…In addition to the suggestions above for future research, researchers may also consider using feature-rich networks [56], which merges the structural characteristics of networks with vector-based information about individual words (e.g., word frequency, word length, part of speech, etc.). The structural information can be examined independent of the vector-based information, or the two types of information can be combined to reveal patterns that neither approach could reveal by themselves.…”
Section: PLOS ONE
confidence: 99%
“…Using the valence labels for the key concepts and associated responses, we enriched the BFMNs, representing them as feature-rich cognitive networks [45] in which information about the sentiment of associative responses could be used to describe the properties of the cue word [27]. As in previous works, we leveraged the notion of a node's neighborhood, consisting of the set of adjacent nodes to a target node: in this case, the neighborhoods of a cue word were the sets of all the associative responses generated by the participants (the language models or humans) responding to the same set of instructions.…”
Section: Network Building and Semantic Frame Reconstruction
confidence: 99%
“…Using the valence labels for the key concepts and associated responses, we enriched the BFMNs, representing them as feature-rich cognitive networks [15] in which information about the sentiment of associative responses can be used to describe the properties of the cue word [65]. As in previous works, we leveraged the notion of a node's neighborhood, consisting of the set of adjacent nodes to a target node: In this case, the neighborhoods of a cue word are the sets of all the associative responses generated by the participants (the language models or humans) responding to the same set of instructions.…”
Section: Network Building and Semantic Frame Reconstruction
confidence: 99%
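The two statements above use a node's neighbourhood, the set of associative responses a cue word elicited, enriched with valence labels. A small sketch of that construction, with entirely invented cues, responses and labels:

```python
# Hedged sketch of the neighbourhood idea in the citation statements above:
# the semantic frame of a cue word is the set of responses it received,
# and valence labels on those responses describe the cue. All data invented.

responses = {  # cue word -> associative responses from participants
    "exam": ["stress", "grade", "study"],
    "holiday": ["beach", "rest"],
}

valence = {  # hypothetical sentiment label per response word
    "stress": "negative", "grade": "neutral", "study": "neutral",
    "beach": "positive", "rest": "positive",
}

def frame(cue):
    """Neighbourhood of `cue`: all associative responses it elicited."""
    return set(responses.get(cue, []))

def frame_valence(cue):
    """Count valence labels within the cue's neighbourhood."""
    counts = {}
    for word in frame(cue):
        label = valence[word]
        counts[label] = counts.get(label, 0) + 1
    return counts
```

For instance, `frame_valence("exam")` summarises the emotional profile of the cue through its neighbours rather than through any property of the cue word itself, which is the enrichment step both statements describe.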