When we run our fingers over the surface of an object, we acquire information about its microgeometry and material properties. Texture information is widely believed to be conveyed in spatial patterns of activation evoked across one of three populations of cutaneous mechanoreceptive afferents that innervate the fingertips. Here, we record the responses evoked in individual cutaneous afferents in Rhesus macaques as we scan a diverse set of natural textures across their fingertips using a custom-made rotating drum stimulator. We show that a spatial mechanism can only account for the processing of coarse textures. Information about most natural textures, however, is conveyed through precise temporal spiking patterns in afferent responses, driven by high-frequency skin vibrations elicited during scanning. Furthermore, these texture-specific spiking patterns predictably dilate or contract in time with changes in scanning speed; the systematic effect of speed on neuronal activity suggests that it can be reversed to achieve perceptual constancy across speeds. The proposed temporal coding mechanism involves converting the fine spatial structure of the surface into a temporal spiking pattern, shaped in part by the mechanical properties of the skin, and ascribes an additional function to vibration-sensitive mechanoreceptive afferents. This temporal mechanism complements the spatial one and greatly extends the range of tangible textures. We show that a combination of spatial and temporal mechanisms, mediated by all three populations of afferents, accounts for perceptual judgments of texture.

spike timing | roughness | touch | psychophysics | neurophysiology

Our exquisite tactile sensitivity to surface texture allows us to distinguish silk from satin, or even good silk from cheap silk. However, the neural basis for our ability to identify individual textures has never been investigated.
Natural textures can comprise very fine textural features, on the order of micrometers, but also coarser ones, on the order of millimeters. Surface features spanning many orders of magnitude must then be fused to yield a unitary percept of texture. At the coarse extreme of this range, Braille dots and gratings have been shown to be encoded in the spatial pattern of activation elicited in slowly adapting type 1 (SA1) afferents (1-4), which densely innervate the primate fingertip. Specifically, the spatial layout of surface features is reflected in the spatial layout of the SA1 response across the sensory sheet, so information about texture can be read out from this neural image, a mechanism analogous to that in vision. The most compelling evidence implicating this spatial mechanism in texture perception stems from an elegant series of studies demonstrating that one of the major perceptual attributes of a textured surface, its roughness, can be predicted from the spatial pattern of activation it elicits in SA1 afferents (1-3). However, most natural textures comprise features that are too fine to be resolved through a spatially modulate...
The authors note, "For the 'hybrid' location discrimination task, we report data obtained from 27 electrodes, 16 of which were in area 1; the 11 electrodes in area 3b were divided evenly across the two animals (6 and 5). We had previously tested all of the electrodes, including those in area 3b, in the detection and discrimination tasks (as shown in Fig. 3) and found them all to yield approximately equivalent performance (see Fig. 3A). We noticed in the hybrid location discrimination task, however, that one of the animals performed much more poorly based on stimulation of area 3b than it did based on stimulation of area 1 (while the other animal performed better based on stimulation of area 1). Having no reason to question any of the arrays, we attributed this discrepancy to differences across animals and arrived at the conclusion, based on pooled data from both animals, that stimulation of the two areas yields equivalent performance in the 'hybrid location discrimination' task. The overall conclusion, then, was that stimulation of neurons in areas 3b and 1 evokes percepts that are equally localized on the skin."

Shortly after publication of the paper, we repeated detection experiments across the arrays and found that the animal could no longer detect stimulation through the array in area 3b that had yielded poor performance in the hybrid location discrimination task. It is therefore likely that this array had failed between the time we conducted the initial detection and discrimination experiments and the time we conducted the hybrid location discrimination task (which required 2-3 months of retraining).
If this is the case, and we eliminate data from that bad array, then the median performance on hybrid trials is 83% (up from the 80% that was originally reported), which is still statistically poorer than that on the location-matched mechanical trials [median difference between performance on mechanical and hybrid trials was 3.3% rather than 5.6%, t(119) = 6.1, P < 0.001] (see the corrected Fig. 2). Thus, we probably underestimated overall performance on hybrid trials, and thus the degree to which artificial percepts are localized, in the original publication. Importantly, however, performance on hybrid trials based on stimulation of area 3b was significantly better than performance based on stimulation of area 1 [median Δp = 0.028 and 0.054 for areas 3b and 1, respectively; t test: t(76) = 2.8, P < 0.01]. Thus, based on the data obtained from only one animal, it seems as though stimulation of area 3b elicits more localized percepts than does stimulation of area 1, as might be expected given that neurons in area 3b tend to have smaller receptive fields than their counterparts in area 1 (1, 2)."

As a result of this error, Fig. 2 and its legend appeared incorrectly. The corrected figure and its corresponding legend appear below. On both mechanical and hybrid trials, the relative locations of stimuli applied to widely spaced digits were more accurately discriminated than were the relative locations of stimuli applie...
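The paired comparison reported above can be sketched as a one-sample t-test on per-trial mechanical-minus-hybrid performance differences. This is a minimal illustration with simulated numbers: the sample size, means, and spreads below are chosen only to mimic the reported statistics, not the study's actual data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-trial performance (proportion correct); the simulated
# values stand in for the study's data, which are not reproduced here.
n = 120                                            # gives the reported df of 119
mechanical = rng.normal(0.86, 0.05, n)
hybrid = mechanical - rng.normal(0.033, 0.04, n)   # hybrid slightly poorer

# Paired comparison: one-sample t-test of the differences against zero.
diff = mechanical - hybrid
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))

print(f"median difference = {np.median(diff):.3f}")
print(f"t({n - 1}) = {t_stat:.1f}")   # |t| well above ~1.98 -> significant at P < 0.05
```

The same logic applies to the area-3b-versus-area-1 comparison, with the differences computed between the two stimulation sites instead.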
How specific aspects of a stimulus are encoded at different stages of neural processing is a critical question in sensory neuroscience. In the present study, we investigated the neural code underlying the perception of stimulus intensity in the somatosensory system. We first characterized the responses of SA1 (slowly adapting type 1), RA (rapidly adapting), and PC (Pacinian) afferents of macaque monkeys to sinusoidal, diharmonic, and bandpass noise stimuli. We then had human subjects rate the perceived intensity of a subset of these stimuli. On the basis of these neurophysiological and psychophysical measurements, we evaluated a series of hypotheses about which aspect(s) of the neural activity evoked at the somatosensory periphery account for perception. We evaluated three types of neural codes. The first consisted of population codes based on the firing rate of neurons located directly under the probe. The second included population codes based on the firing rate of the entire population of active neurons. The third included codes based on the number of active afferents. We found that the response evoked in the localized population is logarithmic with stimulus amplitude (given a constant frequency composition), whereas the population response across all neurons is linear with stimulus amplitude. We conclude that stimulus intensity is best accounted for by the firing rate evoked in afferents located under or near the locus of stimulation, weighted by afferent type.
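The central contrast above, a logarithmic localized-population response versus a linear whole-population response, can be illustrated with a straight-line fit before and after log-transforming stimulus amplitude. Everything in this sketch is synthetic: the coefficients, noise levels, and amplitude range are hypothetical stand-ins for the recorded firing rates.

```python
import numpy as np

rng = np.random.default_rng(1)
amplitude = np.linspace(10, 500, 50)   # stimulus amplitude at a fixed frequency

# Hypothetical responses: the localized rate grows logarithmically with
# amplitude, the whole-population rate grows linearly.
localized = 20 * np.log(amplitude) + rng.normal(0, 2, amplitude.size)
whole_pop = 0.5 * amplitude + rng.normal(0, 2, amplitude.size)

def r_squared(x, y):
    """Coefficient of determination for a straight-line fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1 - resid.var() / y.var()

# The logarithmic code is linearized by a log transform of amplitude.
print("localized vs log(amplitude): ", round(r_squared(np.log(amplitude), localized), 3))
print("localized vs amplitude:      ", round(r_squared(amplitude, localized), 3))
print("whole population vs amplitude:", round(r_squared(amplitude, whole_pop), 3))
```

A better fit on the log axis for the localized response, alongside a near-perfect linear fit for the whole population, is the signature the abstract describes.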
At an early stage of processing, a stimulus is represented as a set of contours. In the representation of form, a critical feature of these local contours is their orientation. In the present study, we investigate the representation of orientation at the somatosensory periphery and in primary somatosensory cortex. We record the responses of mechanoreceptive afferents and of neurons in areas 3b and 1 to oriented bars and edges using a variety of stimulus conditions. We find that orientation is not explicitly represented in the responses of single afferents, but a large proportion of orientation detectors (∼50%) can be found in areas 3b and 1. Many neurons in both areas exhibit orientation tuning that is preserved across modes of stimulus presentation (scanned vs indented) and is relatively insensitive to other stimulus parameters, such as amplitude and speed, and to the nature of the stimulus, bar or edge. Orientation-selective neurons tend to be more SA (slowly adapting)-like than RA (rapidly adapting)-like, and the strength of the orientation signal is strongest during the sustained portion of the response to a statically indented bar. The most orientation-selective neurons in SI are comparable in sensitivity with that measured in humans. Finally, responses of SI neurons to bars and edges can be modeled with a high degree of accuracy using Gaussian or Gabor filters. The similarity in the representations of orientation in the visual and somatosensory systems suggests that analogous neural mechanisms mediate early visual and tactile form processing.
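The Gabor-filter account of SI responses can be sketched as follows: an oriented bar drives a Gabor filter most strongly when the bar's orientation matches the filter's preferred orientation. The filter parameters (size, wavelength, envelope width) below are hypothetical illustration values, not fitted values from the study.

```python
import numpy as np

def gabor(size, theta, wavelength=8.0, sigma=4.0):
    """Gabor filter whose stripes run along orientation theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    perp = -x * np.sin(theta) + y * np.cos(theta)   # distance across the stripes
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * perp / wavelength)

def bar(size, theta, width=2):
    """Binary image of a bar through the center at orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    dist = np.abs(-x * np.sin(theta) + y * np.cos(theta))
    return (dist <= width).astype(float)

size = 31
filt = gabor(size, np.deg2rad(45))   # model neuron preferring 45 degrees

# Tuning curve: filter response to bars at a range of orientations.
orientations_deg = np.arange(0, 180, 15)
tuning = [(filt * bar(size, np.deg2rad(th))).sum() for th in orientations_deg]
best = orientations_deg[int(np.argmax(tuning))]
print(f"peak response at {best} deg")
```

The peak of the tuning curve falls at the filter's preferred orientation, mirroring the orientation selectivity described for neurons in areas 3b and 1.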
In somatosensory cortex, stimulus amplitude is represented at a relatively coarse temporal resolution, while stimulus frequency is represented by precisely timed action potentials.
Sensory systems are designed to extract behaviorally relevant information from the environment. In seeking to understand a sensory system, it is important to understand the environment within which it operates. In the present study, we seek to characterize the natural scenes of tactile texture perception. During tactile exploration, complex high-frequency vibrations are elicited in the fingertip skin, and these vibrations are thought to carry information about the surface texture of manipulated objects. How these texture-elicited vibrations depend on surface microgeometry and on the biomechanical properties of the fingertip skin itself remains to be elucidated. Here we record skin vibrations, using a laser Doppler vibrometer, as various textured surfaces are scanned across the finger. We find that the frequency composition of elicited vibrations is texture specific and highly repeatable. In fact, textures can be classified with high accuracy on the basis of the vibrations they elicit in the skin. As might be expected, some aspects of surface microgeometry are directly reflected in the skin vibrations. However, texture vibrations are also determined in part by fingerprint geometry. This mechanism enhances textural features that are too small to be resolved spatially, given the limited spatial resolution of the neural signal. We conclude that it is impossible to understand the neural basis of texture perception without first characterizing the skin vibrations that drive neural responses, given the complex dependence of skin vibrations on both surface microgeometry and fingertip biomechanics.
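The classification result above can be sketched with synthetic vibration traces: each "texture" is given a characteristic frequency composition, and recordings are assigned to the texture with the closest spectral template. The texture names, frequencies, and noise level here are all hypothetical; only the logic (spectral fingerprints matched by nearest template) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 2000                        # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)      # 1 s of recording -> 1 Hz spectral bins

# Hypothetical texture-specific frequency content of skin vibrations.
texture_freqs = {"denim": (60, 180), "corduroy": (90, 270), "vinyl": (40, 400)}

def record(name):
    """Simulate one noisy vibration recording for a given texture."""
    f1, f2 = texture_freqs[name]
    return (np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)
            + 0.3 * rng.standard_normal(t.size))

def spectrum(x):
    """Normalized power spectrum: the texture 'fingerprint'."""
    p = np.abs(np.fft.rfft(x)) ** 2
    return p / p.sum()

# Templates: mean spectrum of a few training recordings per texture.
templates = {name: np.mean([spectrum(record(name)) for _ in range(5)], axis=0)
             for name in texture_freqs}

def classify(x):
    """Assign a recording to the texture with the closest spectral template."""
    s = spectrum(x)
    return min(templates, key=lambda name: np.sum((s - templates[name]) ** 2))

correct = sum(classify(record(name)) == name
              for name in texture_freqs for _ in range(10))
print(f"accuracy: {correct}/30")
```

Because each simulated texture concentrates power at distinct frequencies, the spectra are highly repeatable and classification is nearly perfect, which is the qualitative finding the abstract reports for real textures.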
Temporal frequency is a fundamental sensory dimension in audition and touch. In audition, analysis of temporal frequency is necessary for speech and music perception [1]; in touch, the spectral analysis of vibratory signals has been implicated in texture perception [2, 3] and in sensing the environment through tools [4–7]. Environmental oscillations impinging upon the ear are generally thought to be processed independently of oscillations impinging upon the skin. Here, we show that frequency channels are perceptually linked across audition and touch. In a series of psychophysical experiments, we demonstrate that auditory stimuli interfere with tactile frequency perception in a systematic manner. Specifically, performance on a tactile frequency-discrimination task is impaired when an auditory distractor is presented with the tactile stimuli, but only if the frequencies of the auditory and tactile stimuli are similar. The frequency-dependent interference effect is observed whether the distractors are pure tones or band-pass noise, so an auditory percept of pitch is not required for the effect to be produced. Importantly, distractors that strongly impair frequency discrimination do not interfere with judgments of tactile intensity. This surprisingly specific crosstalk between different modalities reflects the importance of supramodal representations of fundamental sensory dimensions.
Because tactile perception relies on the response of large populations of receptors distributed across the skin, we seek to characterize how a mechanical deformation of the skin at one location affects the skin at another. To this end, we introduce a novel non-contact method to characterize the surface waves produced in the skin under a variety of stimulation conditions. Specifically, we deliver vibrations to the fingertip using a vibratory actuator and measure, using a laser Doppler vibrometer, the surface waves at different distances from the locus of stimulation. First, we show that a vibration applied to the fingertip travels at least the length of the finger and that the rate at which it decays is dependent on stimulus frequency. Furthermore, the resonant frequency of the skin matches the frequency at which a subpopulation of afferents, namely Pacinian afferents, is most sensitive. We show that this skin resonance can lead to a two-fold increase in the strength of the response of a simulated afferent population. Second, the rate at which vibrations propagate across the skin is dependent on the stimulus frequency and plateaus at 7 m/s. The resulting delay in neural activation across locations does not substantially blur the temporal patterning in simulated populations of afferents for frequencies less than 200 Hz, which has important implications for how vibratory frequency is encoded in the responses of somatosensory neurons. Third, we show that, despite the dependence of decay rate and propagation speed on frequency, the waveform of a complex vibration is well preserved as it travels across the skin. Our results suggest, then, that the propagation of surface waves promotes the encoding of spectrally complex vibrations as the entire neural population is exposed to essentially the same stimulus. We also discuss the implications of our results for biomechanical models of the skin.
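The delay argument can be worked through directly: at the reported plateau propagation speed of 7 m/s, the travel time across a plausible inter-afferent distance is a small fraction of one vibratory cycle for frequencies below 200 Hz. The 1 cm separation used here is an assumed example distance, not a figure from the study.

```python
# Propagation delay vs. vibratory period across the skin.
speed = 7.0       # m/s, plateau propagation speed reported above
distance = 0.01   # m, hypothetical separation between two afferents (1 cm)

delay = distance / speed   # extra latency at the farther afferent, in seconds

for freq in (50, 100, 200, 400):
    period = 1.0 / freq
    phase_shift = delay / period   # fraction of one vibratory cycle
    print(f"{freq:3d} Hz: delay = {delay * 1e3:.2f} ms, "
          f"period = {period * 1e3:.2f} ms, phase shift = {phase_shift:.2f} cycles")

# Below 200 Hz the delay stays under ~0.3 of a cycle over 1 cm, so the
# temporal patterning of the population response is largely preserved;
# well above 200 Hz the shift approaches a full cycle and blurring grows.
```

This back-of-the-envelope calculation is consistent with the abstract's claim that temporal patterning is not substantially blurred below 200 Hz.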