2015
DOI: 10.1073/pnas.1423080112
Event representations constrain the structure of language: Sign language as a window into universally accessible linguistic biases

Abstract: According to a theoretical tradition dating back to Aristotle, verbs can be classified into two broad categories. Telic verbs (e.g., “decide,” “sell,” “die”) encode a logical endpoint, whereas atelic verbs (e.g., “think,” “negotiate,” “run”) do not, and the denoted event could therefore logically continue indefinitely. Here we show that sign languages encode telicity in a seemingly universal way and moreover that even nonsigners lacking any prior experience with sign language understand these encodings. In exp…

Cited by 78 publications (100 citation statements) | References 36 publications
“… Note that this is not specific to spoken languages; sign languages also map meanings into visual sign (see Strickland et al., 2015). …”
mentioning
confidence: 99%
“…For example, researchers are exploring whether speakers use gesture in spoken language prosody, called audio-visual prosody (Krahmer & Swerts 2005, 2007); these patterns can then be compared with the patterns found in sign language prosody to explore similarities and differences. As another example, researchers are exploring the ability of nonsigners (who have only their skills as gesturers to draw on) to understand patterns in sign language—sign language prosody (Fenlon et al 2007, Brentari et al 2011) and sign language structures that encode notions of telicity (Strickland et al 2015). Researchers are also exploring whether hearing people who have experience using cospeech gesture will have a different “accent” when they sign, not because they learned to sign as a second language but simply because they have had extensive cospeech gesture experience.…”
Section: Surprising Results and Further Implications Of Work On La…
mentioning
confidence: 99%
“…The pioneering research of Rachel Mayberry and her colleagues has shown that language experience with a sign language facilitates the acquisition of spoken language (Hall, Ferreira, & Mayberry, 2012; Mayberry, Lock, & Kazmi, 2002). Another study has shown that speakers spontaneously constrain the semantic interpretation of signs (specifically, their telicity; Strickland et al., 2015). These results, however, do not directly establish whether cross-modal transfer is due to the projection of grammatical principles, specifically.…”
Section: Cross-modal Projections
mentioning
confidence: 98%
“…The pioneering research of Rachel Mayberry and her colleagues has shown that language experience with a sign language facilitates the acquisition of spoken language (Hall, Ferreira, & Mayberry, 2012; Mayberry, Lock, & Kazmi, 2002). Indeed, early exposure to a sign language (in Hall et al., 2012; Mayberry et al., 2002) can offer social and cognitive benefits, and the constraints on the interpretation of signs (Strickland et al., 2015) could be explained not by grammatical principles but by iconicity. These results, however, do not directly establish whether cross-modal transfer is due to the projection of grammatical principles, specifically.…”
Section: Cross-modal Projections
mentioning
confidence: 99%