2017
DOI: 10.1038/srep46413

Sight and sound persistently out of synch: stable individual differences in audiovisual synchronisation revealed by implicit measures of lip-voice integration

Abstract: Are sight and sound out of synch? Signs that they are have been dismissed for over two centuries as an artefact of attentional and response bias, to which traditional subjective methods are prone. To avoid such biases, we measured performance on objective tasks that depend implicitly on achieving good lip-synch. We measured the McGurk effect (in which incongruent lip-voice pairs evoke illusory phonemes), and also identification of degraded speech, while manipulating audiovisual asynchrony. Peak performance was…
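The abstract implies that each observer's implicit synchronisation point is taken as the asynchrony at which task performance peaks. As a hedged illustration only (hypothetical fusion-rate data and an assumed Gaussian tuning curve, not the authors' analysis pipeline), such a peak could be estimated like this:

```python
# Minimal sketch (not the authors' pipeline): estimate an observer's implicit
# audiovisual synchronisation point as the asynchrony at which McGurk fusion
# peaks, by fitting a Gaussian to fusion rate vs. stimulus onset asynchrony.
# The data below are hypothetical; other curve shapes could be used instead.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amplitude, peak_soa, width, baseline):
    """Gaussian tuning of fusion rate over audiovisual asynchrony (ms)."""
    return baseline + amplitude * np.exp(-((soa - peak_soa) ** 2) / (2 * width ** 2))

# Hypothetical McGurk fusion rates at each audiovisual asynchrony
# (negative = auditory leads, positive = auditory lags, in ms).
soas = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)
fusion_rate = np.array([0.15, 0.30, 0.55, 0.70, 0.80, 0.45, 0.20])

# Fit the curve; the fitted peak_soa is the implicit synchronisation estimate.
params, _ = curve_fit(
    gaussian, soas, fusion_rate,
    p0=[0.6, 50.0, 150.0, 0.1],          # initial guesses: amp, peak, width, base
    bounds=([0, -400, 10, 0], [1, 400, 1000, 1]),
)
print(f"Estimated peak asynchrony (implicit synchronisation point): {params[1]:.0f} ms")
```

The only point carried over from the abstract is that the location of peak objective performance, rather than a subjective judgment, indexes an individual's audiovisual synchronisation.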

Cited by 16 publications (25 citation statements); References 79 publications
“…Although understudied, idiosyncratic biases are ubiquitous in behavior (Matthews & Meck, 2014; Kosovicheva & Whitney, 2017; Ipser et al., 2017). Their origin is largely unknown but the persistency of these biases over time implies that they arise from structural or functional characteristics of an individual's brain (Kanai & Rees, 2011).…”
Section: Discussion (mentioning)
confidence: 99%
“…The minimum threshold for detection of cross-modal asynchrony involving visual and either auditory or somatosensory stimuli is about 22–30 ms (Spence, Baddeley, Zampini, James, & Shore, 2003; Spence et al., 2001). However, under some circumstances, asynchrony between visual and auditory input must be as great as 200 ms before it becomes noticeable (Dixon & Spitz, 1980), and unnoticed but substantial temporal mismatches between auditory and visual stimuli seem to be common (Freeman et al., 2013; Ipser, Agolli, Bajraktari, Al-Alawi, Djaafara, & Freeman, 2017). If that has any validity as a guide to asynchrony detection in general, then the perceived present could lag behind information processing in the dorsal stream by as much as 200 ms before it would be noticeable, though the evidence suggests that shorter asynchronies can be detected at least some of the time.…”
Section: An Alternative Hypothesis: the Short Lag (mentioning)
confidence: 99%
“…There is also evidence that the brain compensates for differences in arrival time of stimuli if the visual stimulus renders the arrival time of the auditory stimulus predictable (Petrini, Russell, & Pollick, 2009). However, sensitivity to asynchronous cross-modal stimuli may be quite limited (Freeman, Ipser, Palmbaha, Paunoiu, Brown, Lambert, Leff, & Driver, 2013; Ipser et al., 2017).…”
Section: Footnotes (mentioning)
confidence: 99%
“…For a small fraction of individuals, the order of events implied by the PSS values was opposite for onset and for offset conditions. This latter finding is reminiscent of the negative correlation between temporal order performance on a McGurk task and a TOJ task using visual and auditory stimuli presented asynchronously [45,46]. Another striking observation was that the average PSS (77.3 ± 14.4 ms) during onset judgments was significantly higher than the average PSS (30.2 ± 13.6 ms) for offset judgments (t(53) = 3.1, p = 0.003, and BF10 = 10.324).…”
Section: PSS Estimates (mentioning)
confidence: 83%
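The quoted comparison above pairs a t statistic with a Bayes factor. A minimal sketch of such a paired analysis, using simulated values rather than the cited study's data (and assuming SciPy and Pingouin are available), might look like this:

```python
# Illustrative sketch only: a paired comparison of onset vs. offset PSS values,
# reporting a t statistic, p value, and Bayes factor (BF10) as in the quote above.
# The simulated data are hypothetical and do not reproduce the cited results.
import numpy as np
from scipy import stats
import pingouin as pg  # assumed available; provides the BF10 column

rng = np.random.default_rng(0)
n = 54                                    # t(53) implies 54 paired observations
onset_pss = rng.normal(77.0, 100.0, n)    # hypothetical onset-judgment PSS (ms)
offset_pss = rng.normal(30.0, 100.0, n)   # hypothetical offset-judgment PSS (ms)

t_stat, p_value = stats.ttest_rel(onset_pss, offset_pss)
bf10 = pg.ttest(onset_pss, offset_pss, paired=True)["BF10"].iloc[0]

print(f"t({n - 1}) = {t_stat:.2f}, p = {p_value:.3f}, BF10 = {bf10}")
```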