2020
DOI: 10.1037/xge0000715

The use of number words in natural language obeys Weber’s law.

Abstract: It has been suggested that the origins of number words can be traced back to an evolutionarily ancient approximate number system, which represents quantities on a compressed scale and complies with Weber's law. Here, we use a data-driven computational model, which learns to predict one event (a word in a text corpus) from associated events, to characterize verbal behavior relative to number words in natural language, without appeal to perception. We show that the way humans use number words in spontaneous lang…
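As a rough illustration of the ratio-dependent signature the abstract describes (not the paper's actual pipeline): Weber's law says the discriminability of two quantities depends on their ratio, not their difference, so if number-word usage carries that signature, similarity between number-word embeddings should fall off with |log(n/m)| rather than |n − m|. The GloVe model and the word list below are assumptions made for this sketch.

```python
# A minimal sketch, assuming pre-trained GloVe vectors via gensim:
# test whether number-word similarity decreases with the log-ratio
# of the quantities, as a Weber-like (compressed-scale) code predicts.
import numpy as np
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # small pre-trained embeddings
words = ["one", "two", "three", "four", "five",
         "six", "seven", "eight", "nine"]

log_ratios, sims = [], []
for i, wi in enumerate(words, start=1):
    for j, wj in enumerate(words, start=1):
        if i < j:
            vi, vj = vectors[wi], vectors[wj]
            sims.append(np.dot(vi, vj) /
                        (np.linalg.norm(vi) * np.linalg.norm(vj)))
            log_ratios.append(abs(np.log(i / j)))

# A Weber-like pattern predicts a negative correlation: the larger the
# log-ratio between two numbers, the less similar their word vectors.
print(np.corrcoef(log_ratios, sims)[0, 1])
```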

Cited by 19 publications (28 citation statements) · References 134 publications
“…Words with both high vector and neighborhood coherence, on the contrary, tend to refer to time‐related concepts, body parts, and numerals. The fact that we observe peculiar effects of language change (or better lack thereof) for number expressions in word embeddings is further evidence that language encodes important aspects of numbers and can inform the way we represent them cognitively (Rinaldi & Marelli, 2019). Our results show that number words tend to maintain coherent word representations over time, suggesting a central role in language, which should be further studied considering synchronic and diachronic patterns.…”
Section: Discussion
confidence: 54%
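The "vector coherence over time" measure this excerpt discusses can be pictured with a small sketch: train embeddings on two time slices, align the spaces with orthogonal Procrustes, and score coherence as the cosine between a word's aligned vectors. The toy corpora and hyperparameters below are illustrative placeholders, not those of the cited study.

```python
# A hedged sketch of diachronic vector coherence (Hamilton-style alignment).
import numpy as np
from scipy.linalg import orthogonal_procrustes
from gensim.models import Word2Vec

# Toy time slices; in practice these would be large decade-binned corpora.
slice_a = [["three", "ships", "sailed", "home"],
           ["two", "hands", "and", "three", "fingers"]] * 50
slice_b = [["three", "engines", "failed", "today"],
           ["two", "servers", "and", "three", "nodes"]] * 50

m_a = Word2Vec(slice_a, vector_size=20, min_count=1, seed=0, workers=1).wv
m_b = Word2Vec(slice_b, vector_size=20, min_count=1, seed=0, workers=1).wv

# Align the two spaces on their shared vocabulary so that vectors
# are comparable across slices.
shared = sorted(set(m_a.index_to_key) & set(m_b.index_to_key))
A = np.stack([m_a[w] for w in shared])
B = np.stack([m_b[w] for w in shared])
R, _ = orthogonal_procrustes(A, B)

def vector_coherence(word):
    """Cosine similarity of a word's vector across the two aligned slices."""
    va, vb = m_a[word] @ R, m_b[word]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# Numerals are predicted to stay comparatively coherent over time.
print(vector_coherence("three"))
```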
“…However, in the present article we show that there are serious limitations to this approach, demonstrating that a simple transposition between physical properties of the outside world and language statistics can break down completely (see also Rinaldi & Marelli, 2020b). We test as a prime example for such a breakdown the human body (and more specifically, the size of individual body parts), arguably one of the most salient visual stimuli we are exposed to throughout our lifetime.…”
Section: Cortical Maps Recovered From Language Statistics
confidence: 67%
“…Inspired by the work of Louwerse and colleagues (e.g., Louwerse, 2011), the researchers involved in this project have been working together for some time on the theoretical idea that natural language statistics, when properly examined through rigorous computational techniques as used in the field of natural language processing, offer a precious window into how the human brain represents experience with the external world (Günther et al., 2019; Rinaldi & Marelli, 2020a, 2020b). The present study examines as a prime example for this idea the human body, whose mental representation has long been the subject of Luca…”
Section: Context
confidence: 99%
“…This is because language is not at all independent from the physical world we live in, but is instead often used to communicate about this very world (Louwerse, 2011). This leads to statistical redundancies between the structure of the physical, directly perceivable world on the one hand, and the structure of language on the other hand, so that relations between words tend to reflect the relations between their referents (Johns & Jones, 2012; Louwerse, 2011; Rinaldi & Marelli, 2020), which in turn influences the training of distributional semantic models and the dimensional values of the resulting distributional vectors. The possibility of encoding grounded information through language usage makes it possible for our model to work, by capturing this information and systematically linking it to information from the visual domain.…”
Section: Encoding Perceptually-related Information in Distributional…
confidence: 99%
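The redundancy claim in this last excerpt, that relations between words mirror relations between their referents, has a natural second-order (representational-similarity) reading, sketched below. The body-part feature values are invented for illustration; only the logic, not the numbers, reflects the cited work.

```python
# A hedged RSA-style sketch: correlate word-vector similarities with
# similarities among (hypothetical) physical descriptors of the referents.
import numpy as np
from scipy.stats import spearmanr
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")
words = ["hand", "arm", "leg", "foot", "head"]
# Made-up referent features (e.g., relative size, vertical position);
# not data from the cited studies.
referent_feats = {
    "hand": [0.2, 0.6], "arm": [0.6, 0.7], "leg": [0.9, 0.3],
    "foot": [0.3, 0.1], "head": [0.3, 1.0],
}

def cos(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

lang_sims, world_sims = [], []
for i, wi in enumerate(words):
    for wj in words[i + 1:]:
        lang_sims.append(cos(vectors[wi], vectors[wj]))
        world_sims.append(cos(np.array(referent_feats[wi]),
                              np.array(referent_feats[wj])))

# A positive rank correlation would indicate that language statistics
# echo the structure of the referents, the redundancy the excerpt describes.
print(spearmanr(lang_sims, world_sims))
```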