2022
DOI: 10.1037/xlm0001122

Exposure to co-occurrence regularities in language drives semantic integration of new words.

Abstract: Human word learning is remarkable: We not only learn thousands of words but also form organized semantic networks in which words are interconnected according to meaningful links, such as those between apple, juicy, and pear. These links play key roles in our abilities to use language. How do words become integrated into our semantic networks? Here, we investigated whether humans integrate new words by harnessing simple statistical regularities of word use in language, including: (a) Direct co-occurrence (e.g.,…
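As a rough illustration of the first regularity named in the abstract, the sketch below counts direct co-occurrences (words appearing near one another) over a toy corpus. Everything here, including the sentences, the window size, and the variable names, is invented for illustration and is not taken from the paper.

```python
from collections import Counter

# Toy corpus; the sentences and window size are illustrative, not from the paper.
corpus = [
    "the juicy apple fell near the pear tree".split(),
    "she ate a juicy pear and an apple".split(),
]

WINDOW = 2  # count a co-occurrence when two words appear within 2 positions

cooc = Counter()
for sentence in corpus:
    for i, word in enumerate(sentence):
        for j in range(i + 1, min(i + 1 + WINDOW, len(sentence))):
            cooc[tuple(sorted((word, sentence[j])))] += 1

# Direct co-occurrence: how often "juicy" appears near "apple" and "pear".
print(cooc[("apple", "juicy")], cooc[("juicy", "pear")])
```

Shared co-occurrence (two words appearing alongside the same third words) can then be read off the resulting count table, which is the kind of regularity the abstract contrasts with direct co-occurrence.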

Cited by 8 publications (9 citation statements) | References 100 publications
“…The [12] implementation of COALS lends credence to [13]'s observation that "words are like the people with whom they are most closely associated," implying, however, that this association is limited to immediate neighbors rather than all acquaintances. As a result, a lot of meaning may be gleaned from the co-occurrence of words in a given context (after accounting for randomness).…”
Section: Comparing Semantic Models
confidence: 94%
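For context on the COALS model discussed above (the Correlated Occurrence Analogue to Lexical Semantics of Rohde and colleagues), the following is a minimal sketch of its characteristic normalization step as described in the published method, not the [12] implementation: raw co-occurrence counts are converted to pairwise correlations, negative correlations are zeroed, and positive ones are square-rooted. The count matrix and function name are made up for illustration.

```python
import numpy as np

def coals_transform(counts: np.ndarray) -> np.ndarray:
    """Sketch of the COALS normalization: counts -> correlations,
    zero the negatives, square-root the positives."""
    T = counts.sum()
    row = counts.sum(axis=1, keepdims=True)   # row marginals
    col = counts.sum(axis=0, keepdims=True)   # column marginals
    corr = (T * counts - row * col) / np.sqrt(row * (T - row) * col * (T - col))
    corr[corr < 0] = 0.0   # discard negative correlations
    return np.sqrt(corr)   # dampen large positive correlations

# Tiny invented count matrix (rows = target words, columns = context words).
counts = np.array([[8.0, 2.0, 1.0],
                   [1.0, 9.0, 3.0],
                   [2.0, 1.0, 7.0]])
print(coals_transform(counts))
```

Rows of the transformed matrix serve as word vectors; because weak and negative associations are discarded, similarity is carried by a word's strongest contexts, which matches the "immediate neighbors rather than all acquaintances" reading above.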
“…With respect to semantics, linguistic distributional models have been shown to accurately predict human semantic priming (Günther et al, 2016; Lund et al, 1995; Mandera et al, 2017). Moreover, word co-occurrence plays an important role in word learning, both for children and adults (Savic et al, 2022; Unger et al, 2020). However, previous studies have mostly focused on lexical semantics rather than meaning at the discourse level.…”
Section: Statistical Correlations
confidence: 99%
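As a minimal sketch of how such distributional models predict semantic priming, prime and target can each be represented as a co-occurrence vector and compared by cosine similarity, with related pairs expected to score higher than unrelated ones. The vectors below are invented; no real model output is shown.

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two distributional vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Invented co-occurrence vectors for one related and one unrelated pair.
apple = np.array([0.9, 0.1, 0.4, 0.0])
pear  = np.array([0.8, 0.2, 0.5, 0.1])
sofa  = np.array([0.0, 0.7, 0.1, 0.9])

# A distributional account predicts stronger priming for the related pair:
print(cosine(apple, pear))  # high similarity: apple should prime pear
print(cosine(apple, sofa))  # low similarity: apple should not prime sofa
```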
“…Statistical regularities have been shown to influence phonological, lexical, and syntactic processing (cf. Maye et al, 2002; Saffran et al, 1996; Savic et al, 2022), but it is not yet clear whether readers are also sensitive to co-occurrence at the discourse level. Beyond the sentence level, most previous research has focused on lexical cues, such as connectives (e.g., Sanders & Noordman, 2000; Xiang & Kuperberg, 2015) or readers’ background knowledge (Cozijn et al, 2011; Marchal, Scholman, & Demberg, 2022c), and how these facilitate relational inferences.…”
confidence: 99%
“…Furthermore, with regard to contemporary enhancement techniques, the majority of clustering units predominantly operate at the document level rather than at the finer-grained word level. Nonetheless, it's imperative to recognize that words, being the fundamental building blocks of any document, wield significant influence in the realm of semantic partitioning [28], [29]. Hence, it becomes imperative to explore ways to effectively integrate semantic and part-of-speech (POS) features within text for an enhanced comprehension and analysis of semantic partitioning.…”
Section: Related Work
confidence: 99%