2012
DOI: 10.1515/langcog-2012-0001

Language comprehenders represent object distance both visually and auditorily

Abstract: While the arbitrariness of the sign has occupied a central space in linguistic theory for a century, counter-evidence to this basic tenet has been mounting. Recent findings from cross-linguistic studies on spoken languages have suggested that, contrary to purely arbitrary distributions of phonological content, languages often exhibit systematic and regular phonological and sub-phonological patterns of form-meaning mappings. To date, studies of distributional tendencies of this kind have not been cond…

Cited by 47 publications (36 citation statements)
References 53 publications (80 reference statements)

“…Even just listening to language can trigger these activations in the associative cortex. The sentence "the alarm sounded and John jumped out of bed" will activate areas in the auditory and motor cortex related to alarms and jumping out of bed [Kaschak et al 2006;Winter and Bergen 2012]. This is the received embodiment view.…”
Section: Overview and Organization
confidence: 96%
“…Further, cross-linguistic research shows many languages make a distinction between near and far space in demonstratives (e.g., this and that) (Diessel, 1999; Levinson et al., in press), suggesting the psychological salience of this spatial distinction (see also Kemmerer, 1999). Winter and Bergen (2012) have shown spatial distance is actually simulated during sentence comprehension. They found participants were faster to respond to large pictures and loud sounds after reading sentences describing objects near in space rather than far in space (e.g., You are looking at the milk bottle in the fridge/across the supermarket), and faster to respond to small pictures and quiet sounds after sentences describing objects that are far rather than near.…”
Section: Word Meaning, Perceptual Simulation and Space
confidence: 97%
“…Previous work shows objects can be simulated in near and far space based on their described location (Winter & Bergen, 2012), but can the simulation of near and far space also be affected by the postulated differences in perceptual modalities described above? If so, then objects strongly experienced in the visual modality (e.g., traffic light) should be perceptually simulated in the distance, as should objects strongly experienced in the auditory modality (e.g., thunder).…”
confidence: 99%
“…Mental simulation studies reveal that spoken language users engage areas of the brain responsible for physical motor routines (without producing the physical action) (Glenberg & Kaschak 2002), and vision (Kosslyn, Ganis & Thompson 2001) when engaging in linguistic tasks. Winter and Bergen (2012) have shown that linguistic processing which references what something sounds like engages perceptual representations of what those relevant objects/events sound like. Work on simulation clearly shows that vision, motor routines, audition, and proprioception are all at work in dynamic construction of meaning.…”
Section: An Embodied Cognitive Phonology
confidence: 99%