2012
DOI: 10.3389/fnhum.2012.00083
MEG dual scanning: a procedure to study real-time auditory interaction between two persons

Abstract: Social interactions fill our everyday life and put strong demands on our brain function. However, the possibilities for studying the brain basis of social interaction are still technically limited, and even modern brain imaging studies of social cognition typically monitor just one participant at a time. We present here a method to connect and synchronize two faraway neuromagnetometers. With this method, two participants at two separate sites can interact with each other through a stable real-time audio connec…

Cited by 61 publications (63 citation statements)
References 37 publications (48 reference statements)
“…In particular, future research will have to find ways of quantifying interpersonal coordination in order to relate such measurements to neuroimaging data. In this respect, it has been suggested to extend the neuroscientific investigation of interpersonally relevant gaze behavior to dyads by using live video feeds (e.g., Baess et al., 2012; Dumas, Nadel, Soussignan, Martinerie, & Garnero, 2010; Guionnet et al., 2012; Redcay et al., 2010; Saito et al., 2010). While this is a feasible option, it does come with the disadvantage of a reduction of experimental control.…”
Section: Discussion
confidence: 96%
“…The data were collected during a two-person magnetoencephalography (MEG) experiment, using a MEG2MEG setup (Baess et al, 2012) but only the behavioral results will be reported here. Participants were seated in separate rooms and, depending on the task condition, they had either an audio-only connection (microphones and headphones), or an audiovisual connection where they could also see a video feed of the other participant in natural size on a projection screen positioned 1 m in front of them.…”
Section: Methods (Participants, Apparatus, Materials)
confidence: 99%
“…Hari et al [54] recently dissected the different levels of brain imaging of social cognition and interaction into single-person studies ('one-person neuroscience', 1PN) and two-person set-ups ('two-person neuroscience', 2PN [2,55,56]). The 1PN set-ups have evolved from presentation of well-defined artificial stimuli (such as chessboard patterns and isolated tones), to the use of complex social stimuli, such as faces or body postures, and then finally to presentation of dynamic stimuli, such as movies.…”
Section: From One-person to Two-person Neuroscience
confidence: 99%
“…Whether these two-person settings should be used despite their complexity depends on the timing of the studied interaction [54]: all two-person studies where the interaction is slow, such as text messaging or playing an economic decision game, can be replaced by clever pseudo-hyperscanning set-ups where the two persons are alternatingly subjected to brain scanning. However, during conversation, for example, the turn-taking takes such a short time that the interaction can be captured only in simultaneous time-sensitive 2PN recordings [56].…”
Section: From One-person to Two-person Neuroscience
confidence: 99%