2018
DOI: 10.1177/1747021817741611

Visual-auditory differences in duration discrimination depend on modality-specific, sensory-automatic temporal processing: Converging evidence for the validity of the Sensory-Automatic Timing Hypothesis

Abstract: The Sensory-Automatic Timing Hypothesis assumes visual-auditory differences in duration discrimination to originate from sensory-automatic temporal processing. Although temporal discrimination of extremely brief intervals in the range of tens of milliseconds is predicted to depend mainly on modality-specific, sensory-automatic temporal processing, duration discrimination of longer intervals is predicted to require more and more amodal, higher-order cognitive resources and decreasing input from the sensory-auto…

Cited by 16 publications (18 citation statements); references 100 publications.
“…The present Weber Fractions are considerably better than those (between 15% and 30%, with high between-participant variability) reported for the discrimination of different gravity levels (between 0.7 g and 1.3 g) for virtual spheres approaching frontally on parabolic trajectories viewed stereoscopically [37]. They are also better than those (> 20%) reported for speed change discrimination of motion in depth [28], or those (> 17%) reported for duration discrimination of static stimuli with the same duration (1 s) as the present reference stimuli [38, 39].…”
Section: Discussion (contrasting)
Confidence: 63%
“…These performance benefits were observed in both audition and vision as the first modality, consistent with previous studies using the same paradigm (Kang et al., 2018; Kang et al., 2017; Gold et al., 2014; Bale et al., 2017), and regarded as an index of learning for RefRP. A qualitatively similar performance improvement in both modalities was achieved by controlling presented temporal patterns at a low pulse rate with a fixed minimum interval (Rammsayer et al., 2015; Rammsayer & Pichelmann, 2018).…”
Section: Discussion (mentioning)
Confidence: 69%
“…Measure-level information in music occurs at a slower timescale (hundreds-of-milliseconds-to-seconds range) than the beat-level timescale (a few hundred milliseconds). There is increasing evidence that temporal processing in the tens-to-hundreds-of-milliseconds range may occur in sensory-specific areas, while longer timing windows, from hundreds to thousands of milliseconds, may engage a modality-general or sensory-overlapping timing system (such as secondary auditory cortex) with input from sensory-specific areas (Buhusi & Meck, 2005; Ivry & Schlerf, 2008; Karmarkar & Buonomano, 2007; Lewis & Miall, 2003; Merchant, Zarco, & Prado, 2008; Nani et al., 2019; Paton & Buonomano, 2018; Rammsayer & Pichelmann, 2018; Stauffer, Haldemann, Troche, & Rammsayer, 2012; van Wassenhove, 2009). Motor areas of the brain, including the supplementary motor area (SMA), pre-SMA, premotor cortex, and basal ganglia, are consistently implicated in beat perception and sensorimotor synchronization tasks (Chapin et al., 2010; Chen, Penhune, & Zatorre, 2008; Grahn & Brett, 2007; Grahn & Rowe, 2009) for both auditory and visual stimuli.…”
Section: Discussion (mentioning)
Confidence: 99%