Orchestrating a movement towards a sensory target requires many computational processes, including a transformation between reference frames. This transformation is important because the reference frames in which sensory stimuli are encoded often differ from those of motor effectors. The posterior parietal cortex has an important role in these transformations. Recent work indicates that a significant proportion of parietal neurons in two cortical areas transforms the sensory signals that are used to guide movements into a common reference frame. This common reference frame is an eye-centred representation that is modulated by eye-, head-, body- or limb-position signals. A common reference frame might facilitate communication between different areas that are involved in coordinating the movements of different effectors. It might also be an efficient way to represent the locations of different sensory targets in the world.
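To make the gain-field idea concrete, here is a minimal Python sketch of a generic formulation (not code or parameters from the work reviewed above): a model neuron with an eye-centred receptive field whose response amplitude is scaled by eye position. The Gaussian tuning curve, the planar gain term, and all numerical values are illustrative assumptions.

```python
import numpy as np

def gain_field_response(target_head, eye_pos, pref_eye_loc, gain_slope=0.02):
    """Firing rate of a hypothetical neuron with an eye-centred receptive field.

    target_head  : target location in head-centred coordinates (deg)
    eye_pos      : current eye position (deg)
    pref_eye_loc : preferred target location in eye-centred coordinates (deg)
    """
    target_eye = target_head - eye_pos                  # reference-frame transformation
    tuning = np.exp(-0.5 * ((target_eye - pref_eye_loc) / 10.0) ** 2)
    gain = 1.0 + gain_slope * eye_pos                   # eye-position gain field
    return tuning * gain

# The receptive field moves with the eyes (eye-centred coding), while the response
# amplitude carries the eye-position signal that a downstream readout could use to
# recover a head- or body-centred target location.
for eye in (-10.0, 0.0, 10.0):
    print(eye, gain_field_response(target_head=5.0, eye_pos=eye, pref_eye_loc=5.0))
```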
The fundamental perceptual unit in hearing is the ‘auditory object’. Similar to visual objects, auditory objects are the computational result of the auditory system's capacity to detect, extract, segregate and group spectrotemporal regularities in the acoustic environment; the multitude of acoustic stimuli around us together form the auditory scene. However, unlike the visual scene, resolving the component objects within the auditory scene crucially depends on their temporal structure. Neural correlates of auditory objects are found throughout the auditory system. However, neural responses do not become correlated with a listener's perceptual reports until the level of the cortex. The roles of different neural structures and the contribution of different cognitive states to the perception of auditory objects are not yet fully understood.
The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, MIP) as a candidate for such a representation. We recorded the activity of 275 neurons in LIP and MIP of two monkeys while they performed saccades to a row of visual and auditory targets from three different eye positions. We found 45% of these neurons to be modulated by the locations of visual targets, 19% by auditory targets, and 9% by both visual and auditory targets. The reference frames of both visual and auditory receptive fields ranged along a continuum between eye- and head-centered coordinates: approximately 10% of auditory and 33% of visual neurons had receptive fields that were more consistent with an eye- than a head-centered frame of reference, 23% and 18%, respectively, had receptive fields that were more consistent with a head- than an eye-centered frame of reference, and a large fraction of both visual and auditory response patterns was inconsistent with either reference frame. The results were similar to the reference frame we have previously found for auditory stimuli in the inferior colliculus and core auditory cortex. The correspondence between the visual and auditory receptive fields of individual neurons was weak. Nevertheless, the visual and auditory responses were sufficiently well correlated that a simple one-layer network constructed to calculate target location from the activity of the neurons in our sample performed successfully for auditory targets even though the weights were fit based only on the visual responses. We interpret these results as suggesting that although the representations of space in areas LIP and MIP are not easily described within the conventional conceptual framework of reference frames, they nevertheless process visual and auditory spatial information in a similar fashion.
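The one-layer network mentioned above can be illustrated with a short Python sketch. This is not the recorded dataset or the authors' code: the population responses below are synthetic, and the assumption that each neuron's rate varies roughly linearly (and more weakly for sounds) with target azimuth is made purely for illustration. The weights are fit by least squares on visual trials only and then applied, unchanged, to auditory trials.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_trials = 275, 3000
targets = rng.uniform(-24, 24, n_trials)          # target azimuth (deg)
is_auditory = rng.integers(0, 2, n_trials).astype(bool)

# Synthetic population: linear spatial tuning, weaker and noisier for sounds.
slopes = rng.normal(0.0, 0.5, n_neurons)
rates = np.outer(targets, slopes)
rates[is_auditory] *= 0.4
rates += rng.normal(0.0, 1.0, rates.shape)

# Fit a one-layer linear readout of target location on visual trials only.
X_vis = np.column_stack([rates[~is_auditory], np.ones((~is_auditory).sum())])
w, *_ = np.linalg.lstsq(X_vis, targets[~is_auditory], rcond=None)

# Apply the visually fit weights to auditory trials.
X_aud = np.column_stack([rates[is_auditory], np.ones(is_auditory.sum())])
pred_aud = X_aud @ w
print("readout vs. auditory target, r =",
      np.corrcoef(pred_aud, targets[is_auditory])[0, 1])
```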
The neural computations that underlie the processing of auditory-stimulus identity are not well understood, especially how information is transformed across different cortical areas. Here, we compared the capacity of neurons in the superior temporal gyrus (STG) and the ventrolateral prefrontal cortex (vPFC) to code the identity of an auditory stimulus; these two areas are part of a ventral processing stream for auditory-stimulus identity. Whereas the responses of neurons in both areas are reliably modulated by different vocalizations, STG responses code significantly more vocalizations than those in the vPFC. Together, these data indicate that the STG and vPFC differentially code auditory identity, which suggests that substantial information processing takes place between these two areas. These findings are consistent with the hypothesis that the STG and the vPFC are part of a functional circuit for auditory-identity analysis.
Auditory perceptual decisions are thought to be mediated by the ventral auditory pathway. However, the specific and causal contributions of different brain regions in this pathway, including the middle-lateral (ML) and anterolateral (AL) belt regions of the auditory cortex, to auditory decisions have not been fully identified. To identify these contributions, we recorded from and microstimulated ML and AL sites while monkeys decided whether an auditory stimulus contained more low-frequency or high-frequency tone bursts. Both ML and AL neural activity was modulated by the frequency content of the stimulus. However, only the responses of the most stimulus-sensitive AL neurons were systematically modulated by the monkeys’ choices. Consistent with this observation, microstimulation of AL—but not ML—systematically biased the monkeys’ behavior toward the choice associated with the preferred frequency of the stimulated site. Together, these findings suggest that AL directly and causally contributes sensory evidence used to form this auditory decision.
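The statement that AL responses were modulated by the monkeys' choices is conventionally quantified with a choice probability: the ROC area separating a neuron's responses on trials ending in one choice versus the other, for a fixed stimulus. The sketch below shows that standard computation on hypothetical spike counts; it is an assumed, generic analysis, not the specific pipeline used in the study.

```python
import numpy as np

def choice_probability(rates_choice_a, rates_choice_b):
    """ROC area: P(rate_a > rate_b) + 0.5 * P(rate_a == rate_b) over all trial pairs.

    A value of 0.5 means the responses carry no information about the upcoming
    choice; values away from 0.5 indicate choice-related modulation.
    """
    a = np.asarray(rates_choice_a, dtype=float)[:, None]
    b = np.asarray(rates_choice_b, dtype=float)[None, :]
    return float(np.mean((a > b) + 0.5 * (a == b)))

rng = np.random.default_rng(1)
spikes_high_choice = rng.poisson(12, 60)   # hypothetical counts, "high-frequency" choices
spikes_low_choice = rng.poisson(10, 55)    # hypothetical counts, "low-frequency" choices
print("choice probability:", choice_probability(spikes_high_choice, spikes_low_choice))
```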
Tsunada J, Lee JH, Cohen YE. Representation of speech categories in the primate auditory cortex.
What is presently unclear is whether the prefrontal cortex (PFC) also plays a role in the spontaneous recognition and discrimination of natural categories. Here, we explore this possibility by recording from neurons in the PFC while rhesus monkeys listen to species-specific vocalizations that vary in their social function and acoustic morphology. We found that ventral prefrontal cortex (vPFC) activity, on average, did not differentiate between food calls that were associated with the same functional category, despite their different acoustic properties. In contrast, vPFC activity differentiated between food calls associated with different functional classes and, specifically, conveyed information about the quality and motivational value of the food. These results suggest that the vPFC is involved in the categorization of socially meaningful signals, thereby both extending its previously conceived role in the acquisition of learned categories and demonstrating the significance of using natural categorical distinctions in the study of neural mechanisms.
Communication is one of the fundamental components of both human and nonhuman animal behavior. Auditory communication signals (i.e., vocalizations) are especially important in the socioecology of several species of nonhuman primates such as rhesus monkeys. In rhesus, the ventrolateral prefrontal cortex (vPFC) is thought to be part of a circuit involved in representing vocalizations and other auditory objects. To further our understanding of the role of the vPFC in processing vocalizations, we characterized the spectrotemporal features of rhesus vocalizations, compared these features with other classes of natural stimuli, and then related the rhesus-vocalization acoustic features to neural activity. We found that the range of these spectrotemporal features was similar to that found in other ensembles of natural stimuli, including human speech, and identified the subspace of these features that would be particularly informative for discriminating between different vocalizations. In a first neural study, however, we found that the tuning properties of vPFC neurons did not emphasize these particularly informative spectrotemporal features. In a second neural study, we found that a first-order linear model (the spectrotemporal receptive field) is not a good predictor of vPFC activity. The results of these two neural studies are consistent with the hypothesis that the vPFC is not involved in coding the first-order acoustic properties of a stimulus but is involved in processing the higher-order information needed to form representations of auditory objects.
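The "first-order linear model" named above, the spectrotemporal receptive field (STRF), is typically estimated by regressing firing rate on a time-lagged stimulus spectrogram. The Python sketch below fits an STRF by ridge regression on synthetic data; the spectrogram, the ridge penalty, and the simulated response are all illustrative assumptions, not the authors' stimuli or fitting procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
n_t, n_freq, n_lags = 2000, 16, 10            # time bins, frequency channels, history length
spec = rng.normal(size=(n_t, n_freq))         # stand-in stimulus spectrogram

# Design matrix: each row holds the preceding n_lags spectrogram frames.
X = np.zeros((n_t - n_lags, n_freq * n_lags))
for lag in range(n_lags):
    X[:, lag * n_freq:(lag + 1) * n_freq] = spec[n_lags - 1 - lag:n_t - 1 - lag]

# Simulated neuron whose rate really is a linear function of the spectrogram.
true_strf = rng.normal(size=n_freq * n_lags)
rate = X @ true_strf + rng.normal(scale=2.0, size=X.shape[0])

# Ridge-regression estimate of the STRF and its predictive power.
lam = 10.0
strf = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ rate)
pred = X @ strf
print("prediction r =", np.corrcoef(pred, rate)[0, 1])
# A genuinely linear cell gives a high r here; the low predictive power reported
# for vPFC is what motivates the higher-order interpretation above.
```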