2008
DOI: 10.1037/0096-3445.137.2.244

Visual representation of eye gaze is coded by a nonopponent multichannel system.

Abstract: To date, there is no functional account of the visual perception of gaze in humans. Previous work has demonstrated that left gaze and right gaze are represented by separate mechanisms. However, these data are consistent with either a multichannel system comprising separate channels for distinct gaze directions (e.g., left, direct, and right) or an opponent-coding system in which all gaze directions are coded by just 2 pools of cells, one coding left gaze and the other right, with direct gaze represented as a n…
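
The abstract contrasts two candidate codes for gaze direction. As a rough illustration only (the tuning shapes and parameter values below are assumptions for the sketch, not taken from the paper), a multichannel code carries a dedicated signal for direct gaze, whereas an opponent code represents it implicitly as the balance point between a left pool and a right pool:

```python
import numpy as np

# Rough illustration of the two schemes contrasted in the abstract; tuning shapes and
# parameter values are assumptions for this sketch, not values from the paper.

def multichannel_code(angle_deg, centers=(-25.0, 0.0, 25.0), width=15.0):
    """Separate left / direct / right channels, each tuned to its own gaze direction."""
    centers = np.asarray(centers)
    return np.exp(-((angle_deg - centers) ** 2) / (2 * width ** 2))

def opponent_code(angle_deg, k=0.1):
    """Two opposing pools (left vs. right); direct gaze is their balance point."""
    left = 1.0 / (1.0 + np.exp(k * angle_deg))
    right = 1.0 - left
    return np.array([left, right])

# Direct gaze (0 deg): the dedicated channel peaks in the multichannel code,
# whereas the opponent code signals it only implicitly, as equal pool activity.
print(multichannel_code(0.0))   # e.g. [0.249..., 1.0, 0.249...]
print(opponent_code(0.0))       # [0.5, 0.5]
```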

Cited by 102 publications (154 citation statements). References 48 publications (95 reference statements).
“…More importantly, our findings, together with recent studies showing effects of adaptation to gaze direction (Calder et al., 2008; Jenkins et al., 2006), emotional expressions (e.g., Bestelmeyer et al., 2010; Hsu & Young, 2004) […] As mentioned above, previous research on recalibration of perceptions of lip speech has emphasized the effects of recalibration following exposure to auditory stimuli (Eisner & McQueen, 2005; van Linden & Vroomen, 2007; Vroomen & Baart, 2009). By contrast with this emphasis on auditory recalibration, our findings suggest that perceptions of lip speech also can be recalibrated by visual adaptation to different mouth shapes.…”
Section: Discussion (contrasting)
confidence: 53%
“…For example, adaptation to a particular expression decreases the likelihood of ambiguous expressions being labeled as showing the adapted emotion (Bestelmeyer et al., 2010; Webster et al., 2004). Similarly, adaptation to faces with left-averted gaze causes faces with subtle left-deviated gaze to be perceived as averting their gaze to the right (Calder et al., 2008; Jenkins et al., 2006). Lip-read speech plays an important role in social interactions, as was demonstrated by the McGurk effect (McGurk & MacDonald, 1976), in which the spoken syllable /ba/ presented in combination with video of a person displaying the mouth shapes associated with the syllable /ga/ induced perceptions of the syllable /da/ in the majority of participants.…”
mentioning
confidence: 99%
“…A simple opponent coding scheme predicts no change in responses after either type of adaptation. When tested on such a task, eye gaze direction (Calder et al., 2008) and head direction aftereffects (Lawson, Clifford, & Calder, 2011) both followed the pattern of changes predicted by a multichannel code. Burton and colleagues (2015) applied this paradigm to the study of facial expression aftereffects. Adapting to an expressionless face resulted in a robust narrowing of the range of faces participants placed in the middle category, while adapting alternatingly to distinct expressions produced a smaller but significant narrowing of the central category (Burton et al., 2015).…”
mentioning
confidence: 99%
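
The logic of the quoted passage can be made concrete with a small simulation. The sketch below is illustrative only: the Gaussian channels, sigmoidal pools, adaptation rule, and all parameter values are assumptions, not the model or stimuli used by Calder et al. (2008) or Lawson et al. (2011). It shows why balanced adapters (direct gaze, or alternating left and right gaze) leave a two-pool opponent code's categorization unchanged, while the same adapters change the three-channel code's "direct" band (narrowing it after direct-gaze adaptation and, in this particular sketch, widening it after alternating adaptation):

```python
import numpy as np

# Illustrative simulation only: tuning shapes, adaptation rule, and parameter values are
# assumptions for this sketch, not the model or stimuli used in the cited studies.

CENTERS = np.array([-25.0, 0.0, 25.0])   # assumed preferred gaze angles: left, direct, right
WIDTH = 15.0                             # assumed Gaussian tuning width (degrees)
LABELS = ("left", "direct", "right")

def channel_responses(angle):
    """Responses of the three direction-tuned channels to a gaze angle (degrees)."""
    return np.exp(-((angle - CENTERS) ** 2) / (2 * WIDTH ** 2))

def multichannel_label(angle, gains=(1.0, 1.0, 1.0)):
    """Categorize gaze by the most active (possibly adapted) channel."""
    return LABELS[int(np.argmax(np.asarray(gains) * channel_responses(angle)))]

def multichannel_gains(adapter_angles, strength=0.5):
    """Each channel loses gain in proportion to how strongly the adapter(s) drive it."""
    drive = np.mean([channel_responses(a) for a in adapter_angles], axis=0)
    return 1.0 - strength * drive

def opponent_label(angle, gain_left=1.0, gain_right=1.0, criterion=0.3, k=0.1):
    """Categorize gaze from the normalized difference of two opposing pools."""
    left = gain_left / (1.0 + np.exp(k * angle))      # pool preferring leftward gaze
    right = gain_right / (1.0 + np.exp(-k * angle))   # pool preferring rightward gaze
    signal = (right - left) / (right + left)          # unchanged if both gains scale equally
    if signal < -criterion:
        return "left"
    if signal > criterion:
        return "right"
    return "direct"

test_angles = np.arange(-20.0, 20.5, 0.5)

def count_direct(label_fn):
    """How many test angles a decoder places in the 'direct' category."""
    return sum(label_fn(a) == "direct" for a in test_angles)

direct_gains = multichannel_gains([0.0])                # adapt to direct gaze
alternating_gains = multichannel_gains([-25.0, 25.0])   # adapt alternately to left and right

# Multichannel code: the 'direct' band narrows after direct-gaze adaptation and widens
# after alternating left/right adaptation (both are changes in categorization).
print("multichannel baseline:    ", count_direct(multichannel_label))
print("after direct adapter:     ", count_direct(lambda a: multichannel_label(a, direct_gains)))
print("after alternating adapter:", count_direct(lambda a: multichannel_label(a, alternating_gains)))

# Opponent code: both adapters drive the two pools equally on average, so both gains fall
# by the same factor and the normalized signal, hence the category counts, do not change.
print("opponent baseline:        ", count_direct(opponent_label))
print("after balanced adapters:  ", count_direct(lambda a: opponent_label(a, 0.7, 0.7)))
```

In the opponent scheme the decision depends only on the normalized difference between the two pools, so any adaptation that reduces both pools by the same factor cancels out exactly; this is the sense in which opponent coding "predicts no change in responses after either type of adaptation," whereas the multichannel scheme does not.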