2010
DOI: 10.1093/scan/nsq024

It’s in your eyes—using gaze-contingent stimuli to create truly interactive paradigms for social cognitive and affective neuroscience

Abstract: The field of social neuroscience has made remarkable progress in elucidating the neural mechanisms of social cognition. More recently, the need for new experimental approaches has been highlighted that allow studying social encounters in a truly interactive manner by establishing 'online' reciprocity in social interaction. In this article, we present a newly developed adaptation of a method which uses eyetracking data obtained from participants in real time to control visual stimulation during functional magnetic resonance imaging […]
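The gaze-contingent method the abstract describes amounts to a real-time control loop: poll the participant's gaze, test it against regions of interest, and change the display when a gaze event occurs. The Python sketch below illustrates that loop under stated assumptions; the EyeTracker class and its get_gaze() call are hypothetical stand-ins for a real tracker SDK, not the authors' implementation.

import time

class EyeTracker:
    """Hypothetical wrapper around an eye-tracker SDK (placeholder)."""
    def get_gaze(self):
        # A real SDK would return the latest (x, y) gaze sample in screen coordinates.
        return (0.0, 0.0)

def in_region(gaze, region):
    """True if the gaze sample falls inside a rectangular region of interest."""
    x, y = gaze
    left, top, right, bottom = region
    return left <= x <= right and top <= y <= bottom

def run_trial(tracker, region, dwell_threshold_s=0.5):
    """Fire a gaze event once fixation dwells on the region long enough."""
    dwell_start = None
    while True:
        if in_region(tracker.get_gaze(), region):
            if dwell_start is None:
                dwell_start = time.monotonic()
            elif time.monotonic() - dwell_start >= dwell_threshold_s:
                return "gaze_event"  # e.g., trigger the virtual character's reaction
        else:
            dwell_start = None
        time.sleep(1 / 60)  # poll roughly once per display frame

In the fMRI paradigm the paper describes, the returned event would drive the animation of a virtual character, closing the interaction loop in real time.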

Cited by 148 publications (162 citation statements)
References 40 publications
“…In social neuroscience and related fields, there has been discussion on the growing need to define the "social" in social perception and interaction more precisely (De Jaegher, 2009; Kingstone et al., 2008; Teufel et al., 2010; Zaki & Ochsner, 2009) and to take into account the complexities of real-world situations (Kingstone et al., 2008). For instance, one might ask whether the social processes are similar when investigating reciprocal, face-to-face engagement in a computer-mediated context (i.e., Wilms et al., 2010) as opposed to situations when two persons are mutually present, situated physically close to each other. Recent theoretical viewpoints have highlighted the self as an enactive being; perception and movement are closely intertwined, and people are continuously aware of their bodies in relation to external objects (McGann & De Jaegher, 2009; Zahavi, 2002).…”
Section: Discussion (mentioning)
confidence: 99%
“…facial expressions may also modulate the type and magnitude of amygdala responses (Adams et al., 2003; N'Diaye et al., 2009). Recent studies using interactive eye-tracking paradigms (Wilms et al., 2010), in which study participants could actively draw the attention of a virtual partner to a certain object in space (Schilbach et al., 2010b), demonstrated increased responses within the medial prefrontal cortex during joint attention. Furthermore, a linear dependency between MPFC activity and gaze duration has been found (Kuzmanovic et al., 2009).…”
Section: Discussion (mentioning)
confidence: 99%
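The joint-attention contingency described in this excerpt (the participant's fixation steering a virtual partner's gaze toward an object) reduces to a mapping from the fixated object to the avatar's gaze target. A minimal sketch of that logic, assuming hypothetical detect_fixated_object and set_avatar_gaze helpers rather than the published code:

def detect_fixated_object(gaze, object_regions):
    """Return the name of the object region containing the gaze sample, if any."""
    x, y = gaze
    for name, (left, top, right, bottom) in object_regions.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

def update_partner(gaze, object_regions, set_avatar_gaze):
    """Redirect the virtual partner's gaze to whatever the participant fixates."""
    target = detect_fixated_object(gaze, object_regions)
    if target is not None:
        set_avatar_gaze(target)  # hypothetical call into the avatar renderer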
“…The participant was instructed to fixate the central point and to keep his/her attention inside the fixation area at the level of the central point during the trial, avoiding eye blinks and saccades (for additional details about instructions, see Conty and Grèzes, 2012). Given the importance of an ecologically valid approach (Zaki and Ochsner, 2009; Schilbach, 2010; Wilms et al., 2010), we kept our design as naturalistic as possible. To do so, an apparent movement was created by the consecutive presentation of two photographs on the screen (Conty et al., 2007).…”
Section: Methods (mentioning)
confidence: 99%
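The two-frame apparent-motion technique mentioned in this excerpt (consecutive presentation of two photographs; Conty et al., 2007) can be sketched with a standard stimulus-presentation library. PsychoPy is assumed below purely as an illustration; the cited studies may have used different software, and the image files and durations are placeholders.

from psychopy import core, visual

win = visual.Window(size=(1024, 768), color="grey", units="pix")
frame_a = visual.ImageStim(win, image="posture_initial.jpg")  # placeholder first photograph
frame_b = visual.ImageStim(win, image="posture_final.jpg")    # placeholder second photograph

# Presenting the two photographs back to back creates the impression
# that the pictured person moves (e.g., shifts gaze toward the viewer).
frame_a.draw()
win.flip()
core.wait(0.5)  # placeholder duration for the first frame

frame_b.draw()
win.flip()
core.wait(1.0)  # placeholder duration for the second frame

win.close()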