The auditory cortex is critical for perceiving a sound's location. However, there is no topographic representation of acoustic space, and individual auditory cortical neurons are often broadly tuned to stimulus location. It thus remains unclear how acoustic space is represented in the mammalian cerebral cortex and how it could contribute to sound localization. This report tests whether the firing rates of populations of neurons in different auditory cortical fields of the macaque monkey carry sufficient information to account for horizontal sound localization ability. We applied an optimal neural decoding technique, based on maximum likelihood estimation, to populations of neurons from six cortical fields encompassing core and belt areas. We found that the firing rates of neurons in the caudolateral area contain enough information to account for sound localization ability, whereas neurons in the other tested core and belt cortical areas do not. These results provide a detailed and plausible population model of how acoustic space could be represented in the primate cerebral cortex and support a dual-stream model of auditory cortical processing.

auditory cortex | macaque | population model | sound localization

Sound localization is a fundamental function of the auditory system in terrestrial vertebrates. Unilateral lesions of the auditory cortex have been shown to produce deficits in contralesional space in a variety of species (1-4), indicating a critical role for auditory cortex. However, despite considerable effort to determine how the cerebral cortex processes acoustic space, our understanding remains rudimentary (e.g., 5-17). From these studies and others, we know that (i) acoustic space is not topographically organized in the mammalian cerebral cortex and (ii) single-neuron spatial receptive fields are very broad and, in themselves, unlikely to account for localization ability.
Thus, some form of population code is likely used to represent acoustic space in the auditory cortex. Several models have been proposed (e.g., 13, 17), yet they do not illustrate how such a coding scheme could actually work, and it remains unclear which aspects of the neuronal response carry the information. Recent studies in extrastriate visual cortex have shown that a logarithmic maximum-likelihood estimator could account for direction selectivity of visual motion based on the firing rates of populations of neurons (18). The perception of azimuthal acoustic space can be similarly modeled as direction over 360°, and such an estimator may be a common cortical process for encoding secondary stimulus properties.

A recent study examining single-neuron recordings in the macaque auditory cortex (22) was consistent with the hypothesis put forth by Rauschecker and others (19-21) that acoustic space could be represented in a hierarchical fashion, starting in the core field(s) of auditory cortex and progressing to the belt and parabelt fields. That study showed that spatial tuning of single neurons was sharpest in the caudolateral (CL...
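The logarithmic maximum-likelihood decoding scheme described above can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: the von Mises tuning curves, Poisson spiking assumption, and all parameter values (number of neurons, tuning width, firing rates) are hypothetical choices made for the example. Given a vector of observed spike counts, the decoder selects the azimuth that maximizes the summed Poisson log-likelihood across the population.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of broadly tuned neurons: each neuron's mean
# firing rate follows a von Mises tuning curve over azimuth (0-360 deg).
n_neurons = 72
preferred = np.deg2rad(np.linspace(0, 360, n_neurons, endpoint=False))
kappa = 1.0        # small concentration -> broad spatial tuning
peak_rate = 30.0   # spikes/s at the preferred direction (assumed)
baseline = 5.0     # spontaneous rate (assumed)

def mean_rates(theta):
    """Expected firing rate of every neuron for a sound at azimuth theta (rad)."""
    return baseline + peak_rate * np.exp(kappa * (np.cos(theta - preferred) - 1))

def ml_decode(counts, duration=1.0):
    """Maximum-likelihood azimuth estimate assuming Poisson spike counts.

    For each candidate direction, sum the Poisson log-likelihood
    n_i * log(f_i * T) - f_i * T over the population and take the argmax.
    """
    grid_deg = np.arange(0, 360)
    log_like = []
    for t in np.deg2rad(grid_deg):
        expected = mean_rates(t) * duration
        log_like.append(np.sum(counts * np.log(expected) - expected))
    return grid_deg[int(np.argmax(log_like))]

# Simulate one trial: a sound at 90 deg azimuth, 1 s of Poisson spiking,
# then decode the direction back from the population response.
true_deg = 90
counts = rng.poisson(mean_rates(np.deg2rad(true_deg)) * 1.0)
estimate = ml_decode(counts)
print(estimate)
```

Even though each simulated neuron is broadly tuned, the population estimate lands close to the true direction, which is the sense in which a population of coarse spatial filters can still support fine localization.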