2022
DOI: 10.1016/j.tins.2021.12.008
Informing deep neural networks by multiscale principles of neuromodulatory systems

Cited by 27 publications (21 citation statements) · References 158 publications
“…synaptic plasticity to shape decision making, learning, attention, memory, and motivation [33,36,31,11,35,41]. The multiscale organization of neuromodulatory systems provides high-level contextual control of cognition and behavior, and new approaches to inform and evaluate ANNs can be inspired by integrating them with ANNs [24].…”
Section: Biological Neuromodulation and Artificial Neural Networks (mentioning; confidence: 99%)
“…Creating such an array of complex neural networks is generally impractical, so methods like model-agnostic continual learning [12] instead train two networks: one that encodes a task identity as a non-trivial latent representation, and another that maps that meta-representation to weights that perform the actual task-specific inference. Here, we propose a novel bio-inspired neuromodulation [2,21,24,27] paradigm that reduces task-specific representational footprints so much that it is practical to store each task representation separately, eliminating the need for non-trivial task encoders for forgetting-free continual learning.…”
Section: Introduction (mentioning; confidence: 99%)
“…However, this also means that the NMC model represents only what it seeks to model closely: an isolated cortical microcircuit. As such, without the vast majority of its synaptic and neuromodulatory inputs, its activity would not resemble a plausible cortical state (Mei et al., 2022; Lee and Dan, 2012). While we were able to simulate in vivo-like (asynchronous sparse) activity through artificial depolarization and mimicking the effects of extracellular calcium (Markram et al., 2015), we cannot expect more complex emergent phenomena that would require, for example, inter-region connectivity and feedback connections (Felleman and Van Essen, 1991; Shepherd and Yamawaki, 2021).…”
Section: Introduction (mentioning; confidence: 99%)
“…To further understand how biologically plausible mechanisms may shed light on DNNs and optimization methods, implementations at the dendritic, single-neuron, or microcircuitry levels have been increasingly realized [1,2,3,4,5]. Thus far, deep learning and neurorobotics studies [6,7,8,9,10,11,12,13] have examined whether biological neuromodulation may lead to behavioral benefits. In these studies, neuromodulation was commonly defined as a mechanism that self-reconfigures network hyperparameters and connectivity based on environmental and/or behavioral states of the neural network (Table 1).…”
Section: Introduction (mentioning; confidence: 99%)
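The excerpt above defines neuromodulation, as used in the surveyed studies, as a mechanism that self-reconfigures network hyperparameters based on environmental or behavioral state. A minimal sketch of that idea is a context signal that rescales the learning rate of a gradient step; the function names and the specific linear gating rule here are hypothetical illustrations, not taken from any particular cited model:

```python
import numpy as np

def neuromodulated_lr(base_lr, context_signal, gain=2.0):
    """Scale a learning rate by a bounded neuromodulatory signal.

    context_signal in [0, 1] stands in for an estimate of the
    environmental/behavioral state; each surveyed paper defines
    its own version of this signal.
    """
    return base_lr * (1.0 + gain * context_signal)

def sgd_step(w, grad, base_lr, context_signal):
    # A toy gradient step whose effective step size is reconfigured
    # by the modulatory context rather than fixed in advance.
    return w - neuromodulated_lr(base_lr, context_signal) * grad

w = np.array([1.0, -0.5])
g = np.array([0.2, 0.2])
w_low = sgd_step(w, g, base_lr=0.1, context_signal=0.0)   # unmodulated step
w_high = sgd_step(w, g, base_lr=0.1, context_signal=1.0)  # strongly modulated step
```

The same pattern generalizes beyond the learning rate: the context signal could equally gate dropout rates, layer gains, or connectivity masks, which is the "hyperparameters and connectivity" framing used in the excerpt.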
“…[6,7,8,9] The neuromodulatory system consists of the serotonergic (5-HT), dopaminergic (DA), noradrenergic (NA), and cholinergic (ACh) systems, which modulate a spectrum of physiological and cognitive processes through highly region- and target-specific projections originating from midbrain, hindbrain, or forebrain areas [3,10]. Through neuromodulation-inspired learning, DNNs that employ adaptive learning rules, including synaptic plasticity and feedback-based hyperparameter tuning, were validated in tasks including spatial learning, cue-reward association, and image classification and recognition (Table 1). As an example, Vecoven et al. [9]…”
Section: Introduction (mentioning; confidence: 99%)
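The adaptive, plasticity-based learning rules mentioned in this excerpt are often cast as "three-factor" rules: a Hebbian pre/post term gated by a scalar neuromodulatory factor (e.g., a reward- or dopamine-like signal). The sketch below is a generic illustration of that family under assumed names; it does not reproduce the truncated Vecoven et al. model:

```python
import numpy as np

def three_factor_update(w, pre, post, modulator, eta=0.01):
    """Hebbian weight update gated by a scalar neuromodulatory factor.

    With modulator == 0 the synapses are effectively frozen; a
    positive modulator enables (reward-like) potentiation. All
    names here are illustrative, not from any specific cited model.
    """
    return w + eta * modulator * np.outer(post, pre)

w = np.zeros((2, 3))          # weights from 3 presynaptic to 2 postsynaptic units
pre = np.array([1.0, 0.0, 1.0])
post = np.array([0.5, 1.0])

w_frozen = three_factor_update(w, pre, post, modulator=0.0)   # no plasticity
w_plastic = three_factor_update(w, pre, post, modulator=1.0)  # gated Hebbian step
```

Only co-active pre/post pairs change, and only when the modulatory factor permits it, which is one concrete reading of "synaptic plasticity and feedback-based hyperparameter tuning" in the excerpt.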