2017
DOI: 10.1073/pnas.1620350114

Visual cortex entrains to sign language

Abstract: Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism…
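To make the envelope-entrainment idea concrete, here is a minimal sketch, not from the paper, of how one might extract the slow (<8 Hz) amplitude envelope of a speech signal; the cutoff, filter order, and function name are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): extract the slow (<8 Hz)
# amplitude envelope of a speech signal, the quantity that auditory
# cortex is said to entrain to. Cutoff and filter order are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def slow_envelope(audio, fs, cutoff_hz=8.0):
    # Amplitude envelope = magnitude of the analytic signal.
    envelope = np.abs(hilbert(audio))
    # Low-pass filter to keep only fluctuations below the cutoff.
    b, a = butter(4, cutoff_hz / (fs / 2.0), btype="low")
    return filtfilt(b, a, envelope)
```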

Cited by 48 publications (52 citation statements) | References 57 publications

Citation statements, ordered by relevance:
“…Prior studies have demonstrated that non‐auditory sensory information such as visual and tactile inputs activates the auditory cortex in SNHL (Lomber et al., 2010; Sadato et al., 2005), suggesting substantial plasticity of the auditory cortex during brain development. It may be possible that the auditory cortex in SNHL patients receives atypical inputs from other brain regions, such as visuo‐spatial information of sign language (Nishimura et al., 1999; Brookshire et al., 2017), to compensate for the loss of auditory signals.…”
Section: Discussion (mentioning)
confidence: 99%
“…It may be possible that the auditory cortex in SNHL patients receives atypical inputs from other brain regions, such as visuo‐spatial information of sign language (Nishimura et al., 1999; Brookshire et al., 2017), to compensate for the loss of auditory signals.…”
Section: Audio-Visual Interaction (mentioning)
confidence: 99%
“…We obtained the rate of movement (velocity) of gestures using a Frame Differencing Method (FDM; current sampling rate 25 frames per second). FDM utilizes an algorithm that computes the number of pixels that change from frame to frame in a video recording (using Python code made publicly available by Brookshire et al., 2017). This method provides an indication of gross movement through time (and is reliable compared to other methods such as Polhemus or Kinect; see Romero et al., 2017), which can be used as an estimate of the velocity of hand-gesture movements.…”
Section: Exploratory: Gesture Kinematics Using Frame Differencing Methods (mentioning)
confidence: 99%
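As context for the excerpt above, the core of a frame-differencing measure is a few lines of image arithmetic. The following is a hedged sketch of the general technique, not the code released by Brookshire et al. (2017); the change threshold and function name are assumptions.

```python
# Minimal sketch of a Frame Differencing Method (FDM) for estimating
# gross movement in a video. Illustrative only: the threshold and the
# grayscale conversion are assumptions, not the cited authors' choices.
import cv2
import numpy as np

def frame_differencing(video_path, threshold=10):
    """Return one value per frame pair: the number of pixels whose
    grayscale intensity changed by more than `threshold`."""
    cap = cv2.VideoCapture(video_path)
    changes = []
    ok, prev = cap.read()
    if not ok:
        return changes
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Absolute per-pixel difference between consecutive frames.
        diff = cv2.absdiff(gray, prev)
        # Count pixels that changed more than the threshold: a proxy
        # for instantaneous visual change (gross movement).
        changes.append(int(np.count_nonzero(diff > threshold)))
        prev = gray
    cap.release()
    return changes
```

Summing or averaging these per-frame counts over a time window yields the gross-movement estimate that the excerpts use as a proxy for gesture velocity.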
“…We examined how often gestures were produced that closely mirrored the lifting motion involved in the actual task (e.g., lifting with two hands, rather than one), to see if producing such congruent gestures would be associated with a larger or smaller illusion. We also used a Frame Differencing Method (FDM; Brookshire, Lu, Nusbaum, Goldin-Meadow, & Casasanto, 2017; Romero et al., 2017) to measure the velocity of two-handed lifting gestures, to explore whether participants who report that the objects are heavier would move their hands more slowly as they gestured about lifting them. If gestures reflect such sensorimotor knowledge in their kinematics, this would provide strong evidence that gestures are based in sensorimotor know-how.…”
Section: Does Gesture Strengthen Sensorimotor Knowledge of Objects? T… (mentioning)
confidence: 99%
“…In order to exclude the possibility that the general increase in performance at increasing durations of the stimulation segments might be due to a simple increase in conveyed energy over the unfolding of the facial and vocal expressions, we investigated potential correlations between the change in energy in our stimuli and the change in participants' performance. The difference in root mean square (RMS) was measured across gates in the auditory files, while the instantaneous visual change (IVC, defined in Brookshire et al., 2017) was measured across frames of the visual stimuli for two separate subparts of the face, the eyes and the mouth. The differences in RMS across gates showed no positive correlation with the subjects' performance in the discrimination of vocal expressions across gates.…”
Section: Fig. 4 [A] (mentioning)
confidence: 99%
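The RMS comparison mentioned in this excerpt is a standard energy computation. Below is a minimal sketch under stated assumptions: the equal-length gating scheme and function names are illustrative, not the cited authors' code.

```python
# Hedged sketch: RMS energy per gate of an audio stimulus, to quantify
# how signal energy changes as successive gates are added. The gating
# scheme (equal-length segments) is an assumption for illustration.
import numpy as np

def rms(x):
    # Root mean square of a 1-D signal.
    return float(np.sqrt(np.mean(np.square(x, dtype=np.float64))))

def rms_change_across_gates(audio, n_gates):
    # Split the signal into successive gates and compute the change
    # in RMS from one gate to the next.
    gates = np.array_split(audio, n_gates)
    levels = [rms(g) for g in gates]
    return np.diff(levels)
```

Correlating these gate-to-gate RMS differences with behavioral accuracy is the kind of control analysis the excerpt describes; the IVC measure plays the analogous role for the visual stimuli.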