2019
DOI: 10.1111/tops.12415

How to Make the Most out of Very Little

Abstract: I review the problem of referential ambiguity that arises when children learn the meanings of words, along with a number of models that have been proposed to solve it. I then provide a formal analysis of why a resource‐limited model that retains very few meaning hypotheses may be more effective than “big data” models that keep track of all word‐meaning associations.
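As an informal illustration of the contrast the abstract draws, the sketch below sets a learner that tracks every word-referent co-occurrence against one that retains a single conjecture per word. This is a minimal sketch under my own assumptions (the class names, update details, and random re-guess are illustrative), not the paper's formal model.

```python
import random
from collections import defaultdict

class GlobalAssociationLearner:
    """'Big data' style: a co-occurrence count is kept for every word-referent pair observed."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))  # word -> {referent: count}

    def observe(self, word, referents_in_scene):
        for referent in referents_in_scene:
            self.counts[word][referent] += 1

    def guess(self, word):
        table = self.counts.get(word)
        return max(table, key=table.get) if table else None


class SingleHypothesisLearner:
    """Resource-limited style: exactly one conjectured meaning is stored per word."""

    def __init__(self):
        self.hypothesis = {}  # word -> a single conjectured referent

    def observe(self, word, referents_in_scene):
        current = self.hypothesis.get(word)
        if current is None or current not in referents_in_scene:
            # No conjecture yet, or the old one was disconfirmed:
            # guess afresh from the current scene; nothing else is remembered.
            self.hypothesis[word] = random.choice(list(referents_in_scene))

    def guess(self, word):
        return self.hypothesis.get(word)
```

The first learner's memory grows with the number of distinct word-referent pairings it encounters; the second stores at most one entry per word, which is the storage asymmetry the abstract's comparison turns on.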

Cited by 6 publications (8 citation statements). References 56 publications (80 reference statements).
“…This result suggests that both computational mechanisms may be involved in cross‐situational learning (Romberg & Yu, 2014) and that they may work together in an integrated learning system. Relatedly, a recent variant of the PbV model called Pursuit (Stevens et al., 2017; Yang, 2020) has also demonstrated how hypothesis testing and associative learning can be integrated. Unlike PbV, in the Pursuit model, rejected hypotheses are not completely disregarded, but instead are retained for later evaluation.…”
Section: Discussion (mentioning)
confidence: 99%
“…Similar to PbV, the Pursuit model also stores only one referent per learning instance. Stevens et al. (2017) ran a series of model comparisons and found that the Pursuit model outperforms associative learning and PbV models (Yang, 2020). Although the Pursuit model is still called a localist model, in which learners store one referent at a time, the fact that rejected hypotheses can later be retrieved and evaluated suggests that past knowledge plays a role and that learning does not rely only on information available at the moment.…”
Section: An Integrated View Of Associative Learning and Hypothesis Testing (mentioning)
confidence: 99%
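The quoted passages describe the Pursuit update only informally. Below is a minimal Python sketch of a Pursuit-style learner, based on that description and on Stevens et al. (2017); the class and parameter names, the learning-rate value, and the random choice of a first or alternative referent (in place of the published model's mutual-exclusivity preference) are simplifying assumptions of this sketch, not the published implementation.

```python
import random

class PursuitLearner:
    """Pursuit-style sketch: graded scores are kept for every hypothesis ever
    entertained, but only the single strongest one is tested ("pursued") per trial."""

    def __init__(self, gamma=0.10):
        self.gamma = gamma   # learning rate; the value is an illustrative assumption
        self.assoc = {}      # word -> {referent: association score}

    def observe(self, word, referents_in_scene):
        scores = self.assoc.setdefault(word, {})
        if not scores:
            # First encounter: conjecture one referent from the scene.
            scores[random.choice(list(referents_in_scene))] = self.gamma
            return
        best = max(scores, key=scores.get)          # pursue only the strongest hypothesis
        if best in referents_in_scene:
            scores[best] += self.gamma * (1.0 - scores[best])   # confirmed: strengthen it
        else:
            # Disconfirmed: weaken it but keep it for later re-evaluation,
            # and strengthen one alternative referent from the current scene.
            scores[best] *= (1.0 - self.gamma)
            alt = random.choice(list(referents_in_scene))
            scores[alt] = scores.get(alt, 0.0) + self.gamma * (1.0 - scores.get(alt, 0.0))

    def guess(self, word):
        scores = self.assoc.get(word, {})
        return max(scores, key=scores.get) if scores else None
```

On this reading, the trial-by-trial behavior is hypothesis testing over a single candidate, while the retained, graded scores give rejected hypotheses a route back into consideration, which is the integration of the two mechanisms the citing papers point to.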
“…Quine, 1960). Both Yang (2020) and Newport (2020) tackle this problem from a computational perspective, asking what kinds of computations are most likely to be used by child learners (and, for Newport, how this differs from adult learners). The answer in both cases can be captured by an expression first coined by Newport (1990): Less is more.…”
Section: Learning Mechanisms: Computational Approaches (mentioning)
confidence: 99%
“…Yang (2020) starts with the problem of referential ambiguity also addressed in the paper by Gleitman and Trueswell. Complementing that paper, Yang provides computational arguments and evidence that big data approaches to resolving referential ambiguity are destined to fail, on account of the inevitable computational explosion needed to continually keep track of the contextual associations that are present when a word is uttered.…”
Section: Learning Mechanisms: Computational Approaches (mentioning)
confidence: 99%
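To make the "computational explosion" concrete, here is a back-of-the-envelope calculation; the vocabulary and scene sizes are illustrative assumptions, not figures from Yang (2020).

```python
# All three quantities below are assumptions chosen for the sake of illustration.
vocabulary_size = 10_000       # word types the learner must eventually handle
candidate_referents = 1_000    # distinct candidate meanings across scenes
referents_per_scene = 10       # co-present candidate meanings per utterance

potential_associations = vocabulary_size * candidate_referents   # cells in the full table
updates_per_word_exposure = referents_per_scene                  # cells touched per exposure
one_hypothesis_entries = vocabulary_size                         # one conjecture per word

print(f"Full association table: up to {potential_associations:,} cells")
print(f"Cells updated per word exposure: {updates_per_word_exposure}")
print(f"One-hypothesis lexicon: {one_hypothesis_entries:,} entries")
```

Under these assumptions, the full tracker maintains up to ten million associations and touches ten of them on every exposure of a word, whereas a one-hypothesis learner stores a single entry per word.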
“…Frank, Goldwater, & Keller, 2009; Siskind, 1996). Other evidence supports a hypothesis-testing mechanism in which learning is more discrete, with learners selecting the most likely meaning for a word in a given moment and subsequently confirming or falsifying this hypothesis as new information becomes available in subsequent word usages (Medina, Snedeker, Trueswell, & Gleitman, 2011; Trueswell, Medina, Hafri, & Gleitman, 2013; Yang, 2020). While the use of one or the other mechanism may depend on attentional and memory demands (Yurovsky & Frank, 2015), both mechanisms focus on how learners use their objective experience with the world, in and across learning exposures, to generate and evaluate word meaning hypotheses, and do not attempt to capture the influence of more subjective processes.…”
(mentioning)
confidence: 98%
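The confirm-or-falsify cycle described above can be illustrated with a toy simulation; the function name, the example scenes, and the seeded random guessing are assumptions of this sketch rather than the procedure of Medina et al. (2011) or Trueswell et al. (2013).

```python
import random

def propose_but_verify(scenes, seed=0):
    """Toy confirm-or-falsify loop for one word: `scenes` is a sequence of referent
    sets co-occurring with the word; the learner keeps exactly one conjecture and,
    when it is falsified, re-guesses from the current scene alone."""
    rng = random.Random(seed)
    hypothesis = None
    history = []
    for scene in scenes:
        if hypothesis is not None and hypothesis in scene:
            outcome = "confirmed"                      # keep the current conjecture
        else:
            outcome = "falsified" if hypothesis is not None else "initial guess"
            hypothesis = rng.choice(sorted(scene))     # nothing from earlier scenes survives
        history.append((hypothesis, outcome))
    return history

# The word's intended referent ("dog") is present in every scene; distractors vary.
example_scenes = [{"dog", "cup"}, {"dog", "ball"}, {"dog", "shoe"}, {"dog", "hat"}]
for conjecture, outcome in propose_but_verify(example_scenes):
    print(conjecture, outcome)
```

Because a falsified conjecture leaves nothing behind, the next guess depends only on the current scene, which is the discrete, moment-by-moment character of hypothesis testing that the passage contrasts with graded associative learning.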