2013
DOI: 10.1016/j.cognition.2012.09.010

Info/information theory: Speakers choose shorter words in predictive contexts

Cited by 175 publications (144 citation statements)
References 11 publications

“…For example, speakers are more likely to omit optional arguments when their semantics is inferable from the verb (Resnik, 1996; see also Brown & Dell, 1987). Similarly, speakers are more likely to choose more reduced expressions when the referent is expected in context (e.g., pronoun vs. lexical noun: Tily & Piantadosi, 2009; abbreviated vs. full nouns, like math vs. mathematics: Mahowald, Fedorenko, Piantadosi, & Gibson, 2013).…”
Section: Introduction
confidence: 99%
“…In the context of quantitative linguistics, entropic measures are used to understand laws in natural languages, such as the relationship between word frequency, predictability and the length of words [24][25][26][27], or the trade-off between word structure and sentence structure [10,13,28]. Information theory can further help to understand the complexities involved when building words from the smallest meaningful units, i.e., morphemes [29,30].…”
Section: Introduction
confidence: 99%
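The entropic measures referenced in the excerpt above are typically operationalized as per-word surprisal, -log2 P(word | context): more predictable words carry less information, and the cited work relates that quantity to word length. The following is a minimal sketch of such an analysis, assuming a toy corpus and an add-one-smoothed bigram model; the data and model choice are illustrative placeholders, not the method of any of the cited papers.

```python
import math
from collections import Counter

def bigram_surprisal(tokens):
    """Per-word surprisal -log2 P(w_i | w_{i-1}) under an add-one-smoothed bigram model."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    vocab_size = len(unigrams)
    out = []
    for prev, word in zip(tokens, tokens[1:]):
        p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)
        out.append((word, -math.log2(p)))
    return out

# Toy corpus (hypothetical): compare each word's length with its contextual surprisal.
tokens = "the cat sat on the mat and the cat saw the enormous mat".split()
for word, s in bigram_surprisal(tokens):
    print(f"{word:>8}  length={len(word)}  surprisal={s:.2f} bits")
```

On a real corpus, one would correlate each word's length (in characters or phonemes) with its average surprisal across contexts, which is the kind of frequency/predictability/length relationship the studies cited above investigate.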
“…Our analyses are motivated by recent studies that show a message's Information-Theoretic structure is influenced at a variety of linguistic levels including syntactic variation and phonetic reduction (Aylett, 1999; Genzel & Charniak, 2002; Aylett & Turk, 2006; 2004; Levy & Jaeger, 2006; Jaeger, 2010; Mahowald et al., 2013). Specifically, the amount of information present across a message is shown to increase over time, abiding by the entropy rate constancy principle, whereby a message's information density increases at a stable rate (Genzel & Charniak, 2002), perhaps in an effort to provide relevant content against channel noise.…”
Section: Current Study
confidence: 99%
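The entropy rate constancy principle mentioned in the excerpt can be probed by estimating out-of-context per-word information (e.g., under a unigram or n-gram model) and grouping it by sentence position in a discourse: if in-context information density is roughly constant, the out-of-context estimate should rise with position. Below is a hedged sketch of that measurement, assuming a toy discourse and a unigram model as stand-ins for the corpora and language models used in the cited work.

```python
import math
from collections import Counter

def mean_surprisal_by_sentence(sentences):
    """Mean per-word unigram surprisal for each sentence, in discourse order.
    Entropy rate constancy predicts this out-of-context estimate grows with position."""
    words = [w for sent in sentences for w in sent]
    counts = Counter(words)
    total = len(words)
    return [sum(-math.log2(counts[w] / total) for w in sent) / len(sent)
            for sent in sentences]

# Toy discourse (hypothetical data only).
discourse = [
    "the speakers produced short words in context".split(),
    "later sentences introduced rarer and longer vocabulary".split(),
    "morphological complexity further increased the estimated information".split(),
]
for i, m in enumerate(mean_surprisal_by_sentence(discourse), start=1):
    print(f"sentence {i}: {m:.2f} bits per word")
```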