The likelihood of rupture of unruptured intracranial aneurysms that were less than 10 mm in diameter was exceedingly low among patients in group 1 and was substantially higher among those in group 2. The risk of morbidity and mortality related to surgery greatly exceeded the 7.5-year risk of rupture among patients in group 1 with unruptured intracranial aneurysms smaller than 10 mm in diameter.
Persistent neural activity is a putative mechanism for the maintenance of working memories. Such activity is thought to rely on a distributed network of areas, but the differential contribution of each area remains unclear. We recorded single neurons in the human medial frontal cortex and the medial temporal lobe while subjects held up to three items in memory. We found persistently active neurons in both areas. Persistent activity of hippocampal and amygdala neurons was stimulus-specific, formed stable attractors, and was predictive of memory content. Medial frontal cortex persistent activity, on the other hand, was modulated by memory load and task set but was not stimulus-specific. Trial-by-trial variability in persistent activity in both areas was related to memory strength, because it predicted the speed and accuracy with which stimuli were remembered. This work provides direct evidence, in humans, for a distributed network of persistently active neurons supporting working memory maintenance.
Memory-based decisions are often accompanied by an assessment of choice certainty, but the mechanisms of such confidence judgments remain unknown. We studied the responses of 1065 individual neurons in the human hippocampus and amygdala while neurosurgical patients made memory retrieval decisions together with a confidence judgment. Combining behavioral, neuronal, and computational analyses, we identified a population of memory-selective (MS) neurons whose activity signaled stimulus familiarity and confidence as assessed by subjective report. In contrast, the activity of visually selective (VS) neurons was not sensitive to memory strength. The two groups further differed in response latency, tuning, and extracellular waveforms. The information provided by MS neurons was sufficient for a race model to decide stimulus familiarity and retrieval confidence. Together, these findings demonstrate a trial-by-trial relationship between a specific group of neurons and declared memory strength in humans. We suggest that VS and MS neurons are a substrate for declarative memories.
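The race model referred to above can be illustrated with a minimal sketch: two noisy evidence accumulators (one per response option) race to a common threshold, the winner determines the choice and the crossing time the response latency, and confidence is read out as the balance of evidence between winner and loser at decision time. All parameter names and values below are illustrative assumptions, not taken from the study.

```python
import random

def race_model_trial(drift_old, drift_new, threshold=20.0,
                     noise=1.0, max_steps=10000, rng=None):
    """Race two evidence accumulators ('old' vs. 'new') to a common threshold.

    Returns (choice, rt_steps, confidence), where confidence is the
    balance of evidence (winner minus loser) at decision time.
    Illustrative sketch only; parameters are hypothetical.
    """
    rng = rng or random.Random(0)
    acc_old = acc_new = 0.0
    for step in range(1, max_steps + 1):
        # Each accumulator integrates its mean drift plus Gaussian noise.
        acc_old += drift_old + rng.gauss(0, noise)
        acc_new += drift_new + rng.gauss(0, noise)
        if acc_old >= threshold or acc_new >= threshold:
            choice = "old" if acc_old >= acc_new else "new"
            return choice, step, abs(acc_old - acc_new)
    # Time-out: guess from the current accumulator states.
    return ("old" if acc_old >= acc_new else "new"), max_steps, abs(acc_old - acc_new)

# A strong familiarity signal tends to produce fast, confident "old" decisions.
choice, rt, confidence = race_model_trial(drift_old=0.5, drift_new=0.1)
```

In this scheme a larger winner-loser gap maps onto higher declared confidence, mirroring the idea that MS neuron activity carries enough information to support both the familiarity decision and its confidence rating.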
The human amygdala is a key structure for processing emotional facial expressions, but it remains unclear what aspects of emotion are processed. We investigated this question with three different approaches: behavioural analysis of 3 amygdala lesion patients, neuroimaging of 19 healthy adults, and single-neuron recordings in 9 neurosurgical patients. The lesion patients showed a shift in behavioural sensitivity to fear, and amygdala BOLD responses were modulated by both fear and emotion ambiguity (the uncertainty with which a facial expression is categorized as fearful or happy). We found two populations of neurons, one whose response correlated with increasing degree of fear or happiness, and a second whose response primarily decreased as a linear function of emotion ambiguity. Together, our results indicate that the human amygdala processes both the degree of emotion in facial expressions and the categorical ambiguity of the emotion shown, and that these two aspects of amygdala processing can be most clearly distinguished at the level of single neurons.
The human amygdala plays a key role in recognizing facial emotions, and neurons in the monkey and human amygdala respond to the emotional expression of faces. However, it remains unknown whether these responses are driven primarily by properties of the stimulus or by the perceptual judgments of the perceiver. We investigated these questions by recording from over 200 single neurons in the amygdalae of 7 neurosurgical patients with implanted depth electrodes. We presented degraded fear and happy faces and asked subjects to discriminate their emotion by button press. During trials where subjects responded correctly, we found neurons that distinguished fear vs. happy emotions as expressed by the displayed faces. During incorrect trials, these neurons indicated the patients' subjective judgment. Additional analysis revealed that, on average, all neuronal responses were modulated most by increases or decreases in response to happy faces, and driven predominantly by judgments about the eye region of the face stimuli. Applying the same analyses, we showed that hippocampal neurons, unlike amygdala neurons, encoded only emotions but not subjective judgment. Our results suggest that the amygdala specifically encodes the subjective judgment of emotional faces, but that it plays less of a role in simply encoding aspects of the image array. The conscious percept of the emotion shown in a face may thus arise from interactions between the amygdala and its connections within a distributed cortical network, a scheme also consistent with the long response latencies observed in human amygdala recordings.
The human amygdala is critical for social cognition from faces, as borne out by impairments in recognizing facial emotion following amygdala lesions [1] and differential activation of the amygdala by faces [2–5]. Single-unit recordings in the primate amygdala have documented responses selective for faces, their identity, or emotional expression [6, 7], yet how the amygdala represents face information remains unknown. Does it encode specific features of faces that are particularly critical for recognizing emotions (such as the eyes), or does it encode the whole face, a level of representation that might be the proximal substrate for subsequent social cognition? We investigated this question by recording from over 200 single neurons in the amygdalae of seven neurosurgical patients with implanted depth electrodes [8]. We found that approximately half of all neurons responded to faces or parts of faces. Approximately 20% of all neurons responded selectively only to the whole face. Although responding most to whole faces, these neurons paradoxically responded more when only a small part of the face was shown compared to when almost the entire face was shown. We suggest that the human amygdala plays a predominant role in representing global information about faces, possibly achieved through inhibition between individual facial features.
Humans can self-monitor errors without explicit feedback, resulting in behavioral adjustments on subsequent trials such as post-error slowing (PES). The error-related negativity (ERN) is a well-established macroscopic scalp EEG correlate of error self-monitoring, but its neural origins and relationship to PES remain unknown. We recorded in the frontal cortex of patients performing a Stroop task and found neurons that track self-monitored errors and error history in dorsal anterior cingulate cortex (dACC) and pre-supplementary motor area (pre-SMA). Both the intracranial ERN (iERN) and error neuron responses appeared first in pre-SMA, and ~50 ms later in dACC. Error neuron responses were correlated with iERN amplitude on individual trials. In dACC, such error neuron-iERN synchrony and responses of error-history neurons predicted the magnitude of PES. These data reveal a human single-neuron correlate of the ERN and suggest that dACC synthesizes error information to recruit behavioral control through coordinated neural activity.