2009
DOI: 10.1073/pnas.0901077106
Neural computations underlying action-based decision making in the human brain

Abstract: Action-based decision making involves choices between different physical actions to obtain rewards. To make such decisions the brain needs to assign a value to each action and then compare them to make a choice. Using fMRI in human subjects, we found evidence for action-value signals in supplementary motor cortex. Separate brain regions, most prominently ventromedial prefrontal cortex, were involved in encoding the expected value of the action that was ultimately taken. These findings differentiate two main fo…
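The abstract's two computational steps, learning a value for each available action and then comparing those values to make a choice, map directly onto a standard reinforcement-learning scheme. The sketch below is illustrative only and is not the authors' fitted model; the learning rate, softmax inverse temperature, and reward probabilities are assumed values.

import numpy as np

rng = np.random.default_rng(0)

n_actions = 2              # e.g., a left- vs. right-hand movement
Q = np.zeros(n_actions)    # one learned value per physical action
alpha, beta = 0.2, 3.0     # assumed learning rate and softmax inverse temperature
p_reward = [0.3, 0.7]      # assumed reward probability of each action

for trial in range(200):
    # Step 1: compare the action values (softmax) to make a choice.
    p_choice = np.exp(beta * Q) / np.exp(beta * Q).sum()
    action = rng.choice(n_actions, p=p_choice)

    # Step 2: observe the outcome and update the chosen action's value
    # with a reward prediction error.
    reward = float(rng.random() < p_reward[action])
    Q[action] += alpha * (reward - Q[action])

print("learned action values:", Q)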

Cited by 266 publications (259 citation statements). References 60 publications.
“…Another interpretation of the links between spatial learning biases and hemisphere-specific reinforcement learning mechanisms, as we report here, can be derived from research on multieffector action reinforcement which posits that different actions are learned and associated to the effectors performing them (Daw, 2014; Madlon-Kay et al., 2013; Wunderlich et al., 2009). For example, PEs for actions performed by the right or left hand are predominantly represented in the contralateral ventral striatum (Gershman et al., 2009; Madlon-Kay et al., 2013; Palminteri et al., 2009).…”
Section: Biased Neural Reinforcement Learning Relates To Spatial Learning (mentioning)
confidence: 81%
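As a companion to the quoted passage, here is a minimal, hypothetical sketch of effector-specific value learning: a separate value estimate is kept for each hand, and a prediction error (PE) is generated only for the hand that acted. Variable names and reward probabilities are invented for illustration and are not taken from the cited studies.

import numpy as np

rng = np.random.default_rng(1)
alpha = 0.2                                       # assumed learning rate
values = {"left_hand": 0.0, "right_hand": 0.0}    # one value estimate per effector
p_reward = {"left_hand": 0.4, "right_hand": 0.8}  # hypothetical reward probabilities

for trial in range(100):
    hand = rng.choice(list(values))               # which effector acts on this trial
    reward = float(rng.random() < p_reward[hand])
    pe = reward - values[hand]                    # effector-specific prediction error
    values[hand] += alpha * pe                    # only that effector's value is updated

print(values)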
“…These findings, together with findings highlighting a crucial role for DA in activational aspects of behaviour (…Everitt, 1992, 2007), have led researchers to suggest that the impact of motivation on behaviour is mediated by DA (Chakravarthy et al., 2010; Salamone and Correa, 2012; Wise, 2004). Several studies have also shown that the same neural circuitry may represent the expected value of stimuli presented to one hemifield, but largely restricted to the contralateral hemisphere (Gershman et al., 2009; Palminteri et al., 2009; Wunderlich et al., 2009), suggesting the possibility of unilaterally activating the brain's motivational system. This notion was recently confirmed in a study where unilateral deep brain stimulation of the subthalamic nucleus, a procedure mimicking the impact of DA enhancers, boosted motivational processes within the stimulated hemisphere, as indicated by greater exerted force for the hand controlled by the stimulated hemisphere following increased monetary incentives (Palminteri et al., 2013).…”
Section: Biased Computational Reinforcement Learning Relates To Spatial… (mentioning)
confidence: 96%
“…Note that the prediction was that these chosen value signals would be observed before any action-related information was made available to the subjects, thus making it unlikely that they would be able to process the choice in the motor system. Based on several previous fMRI studies, we expected to see a neuronal representation of the chosen value in ventromedial prefrontal cortex (vmPFC) (12, 25, 26). If our hypothesis is correct, it will provide unique evidence that the brain can compute choices solely in goods space.…”
mentioning
confidence: 94%
“…Although this action-based model of making decisions may seem convoluted, it is in fact the predominant view among many decision neuroscientists who have found value signals in areas of the brain known to be involved in representing and planning movements, such as lateral parietal and premotor cortices (8–11). Further evidence for the view that decisions are computed by a comparison between actions comes from the finding of action value signals in several regions of the brain, including the caudate nucleus (5, 6), supplementary motor cortex (12), and action-related value signals in lateral intraparietal cortex (9, 13). Additionally, the action-based model is sometimes presented as a more general psychological model of behavior because it builds more or less directly on theories of reinforcement learning (RL) and seems to provide a flexible and adaptable unitary model for universal problem solving.…”
mentioning
confidence: 99%
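To make the contrast drawn in the last two statements concrete, the sketch below separates, trial by trial, the pre-choice values of every available action (the kind of action-value signal reported in supplementary motor cortex, caudate, and lateral intraparietal cortex) from the value of the action that is actually selected (the chosen-value signal associated with vmPFC). The simulation and the idea of stacking these quantities as regressors are illustrative assumptions, not the analysis of any cited paper.

import numpy as np

rng = np.random.default_rng(2)
alpha, beta = 0.2, 3.0                 # assumed learning rate and inverse temperature
p_reward = [0.3, 0.7]                  # assumed reward probability of each action
Q = np.zeros(2)                        # current value of action 0 and action 1

action_value_trace = []                # per-trial values of *both* available actions
chosen_value_trace = []                # per-trial value of the *selected* action only

for trial in range(100):
    p_choice = np.exp(beta * Q) / np.exp(beta * Q).sum()
    action = rng.choice(2, p=p_choice)

    action_value_trace.append(Q.copy())        # candidate "action value" quantities
    chosen_value_trace.append(Q[action])       # candidate "chosen value" quantity

    reward = float(rng.random() < p_reward[action])
    Q[action] += alpha * (reward - Q[action])

action_values = np.vstack(action_value_trace)  # shape (n_trials, n_actions)
chosen_values = np.array(chosen_value_trace)   # shape (n_trials,)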