Pupillometry, the measurement of pupil size and reactivity, has been widely used to assess cognitive processes. Changes in pupil size have been shown to correlate with arousal, locomotion, cortical state, and decision-making processes. In addition, pupillary responses have been linked to the activity of neuromodulatory systems that modulate attention and perception, such as the noradrenergic and cholinergic systems. Given the range of processes reflected by the pupil, we aimed to resolve pupillary responses in the context of behavioral state and task performance by recording pupillary transients of mice performing a vibrotactile two-alternative forced-choice (2-AFC) task. We show that pre-stimulus pupil size differentiates states of disengagement from task performance from states of active engagement. In addition, when animals are actively engaged, post-stimulus pupillary dilations are larger for correct responses than for error responses, with this difference reflecting response confidence. Importantly, in a delayed 2-AFC task version, we show that although pupillary transients mainly reflect motor output or reward anticipation following the animal's response, they also reflect the animal's decision confidence prior to its response. Finally, in a condition of passive engagement, when the stimulus has no task relevance and reward is provided automatically, pupillary dilations still reflect stimulation and reward but are reduced relative to the state of active engagement, which can be explained by shifts of attention away from task variables. Our results provide further evidence for how pupillary dilations reflect cognitive processes in a task-relevant context, showing that the pupil reflects response confidence and that baseline pupil size encodes attentiveness rather than general arousal.

Significance Statement
For the last 60 years, pupillometry has been used to study various cognitive processes, among them mental load, arousal, and various decision-related components, linking pupil dilations to underlying neuromodulatory systems. Our results provide extensive evidence that, in addition to reflecting attentiveness during task performance, pupil dilations also reflect the subject's confidence in its ensuing response. This confidence coding is overlaid on a more pronounced pupil dilation that reflects motor output or other post-decision components that are related to the response itself but not to the decision. Our results also provide evidence for how different behavioral states, imposed by task demands, modulate what the pupil reflects, presumably revealing what the underlying cognitive network is coding for.
In the last 15 years, virtual realities have revolutionized behavioral experiments, particularly for rodents. In combination with treadmills, running wheels, or air-floating balls, the implementation of a virtual reality (VR) not only provides the opportunity to simultaneously explore behavior and neuronal activity in head-fixed animals under nearly natural conditions, but also allows full control over the visual sensory input presented to the animal. Furthermore, VRs can be combined with other sensory modalities such as auditory, tactile, or olfactory stimuli. Despite the power of using VRs in animal experiments, available software packages are limited in number, often expensive, and lack the flexibility required to design appropriate behavior and neurophysiology experiments. For this reason, we have developed MazeMaster, a versatile, adaptable, and easy-to-use open-source, Python-based software package for controlling virtual-reality setups and behavior experiments. The software package includes a graphical user interface (GUI) and can be integrated into standard electrophysiology and imaging setups even by non-programmers. Ready-made behavioral experiments, such as multisensory discrimination in T-mazes, are already implemented, including full control of reward supply and bias correction. For more individual setup designs, the modularity of MazeMaster allows more programming-inclined users to extend the software with missing features. With MazeMaster, we offer a free and easy-to-use VR controller that will facilitate the implementation of VR setups in scientific laboratories. In addition, MazeMaster allows the design and control of common head-fixed rodent behavior paradigms with extensive acquisition of the metadata required for reproducible VR experiments. The MazeMaster VR package therefore offers a collaboration tool for reproducible research within and across neuroscience laboratories, in accordance with the FAIR principles.