2016
DOI: 10.1016/j.cobeha.2016.02.027
Synchronization and temporal processing

Cited by 56 publications (57 citation statements) | References 81 publications
“…Another interesting finding in our study is the presence of significantly greater power for improv over scale in the alpha frequency range in brain regions including the PMC, STG, IPL, and the TPJ (Figures 2B,C,H, Table 1), regions involved with perceptual-motor planning and control as well as feedback regulation based on external and internal states and goals. In music processing, these regions have been found to be involved in manipulations of musical structures (Zatorre et al, 2010), and motor-auditory interactions mediated through the parietal cortex have been suggested to be required for musical rhythm perception and production (Iversen and Balasubramaniam, 2016). These functions are thought to be indicative of processes involved with music improvisation.…”
Section: Brain Related Activity Differentiating Improv and Scale
confidence: 99%
“…The auditory stream may be as simple as a metronome or as complex as a highly layered and time-varying musical work, but the human brain seems to almost automatically seek a simple regularity, the beat, or pulse, which can serve to organize our movements (as in dance, or tapping your foot to music), but also can organize our perception of time (Hannon, Snyder, Eerola, & Krumhansl, 2004; Palmer & Krumhansl, 1990). Two types of timing that are involved in rhythm perception are interval-based (absolute) timing and beat-based (relative) timing (Dalla Bella et al, 2016; Grube, Lee, Griffiths, Barker, & Woodruff, 2010; Iversen & Balasubramaniam, 2016). Interval-based timing is common to humans and non-human primates (Merchant & Honing, 2014; Zarco, Merchant, Prado, & Mendez, 2009).…”
Section: Beat Perception
confidence: 99%
“…A tight relationship between movement and auditory rhythm perception is evident in human motor system response and motor involvement during music listening and rhythm tasks (Iversen & Balasubramaniam, 2016; Janata, Tomic, & Haberman, 2012; Repp, 2005a; Repp, 2005b; Ross, Warlaumont, Abney, Rigoli, & Balasubramaniam, 2016), and can be observed in neural response to music early in infant development (Kuhl, Ramirez, Bosseler, Lin, & Imada, 2014). How we move to music has by itself become a systematic subfield of inquiry (Ross et al, 2016) that often focuses on body synchronization with music.…”
Section: Introduction
confidence: 99%
“…Together, these results demonstrate how fronto-striatal networks play a crucial role in predicting structure in musical rhythm and linguistic syntax (Kotz et al, 2009). Such observations in music have led to the Action Simulation for Auditory Processing hypothesis (Iversen et al, 2009; Iversen and Balasubramaniam, 2016; Patel and Iversen, 2014), proposing that action processing is recruited during predictive coding of music and language.…”
Section: Predictive Coding
confidence: 70%