2018
DOI: 10.1007/978-3-319-92007-8_11

Cognition-Based Deep Learning: Progresses and Perspectives

Abstract: The human brain is composed of multiple modular subsystems, each with its own characteristics and its own way of interacting with the others. These subsystems interact to support cognitive functions such as memory, attention, and cognitive control. Today, deep learning methods based on these functions, combined with knowledge, are widely used to design more dynamic, robust, and powerful systems. We first review and summarize the progress of cognition-based deep neural networks, and how cognitive…

Cited by 2 publications (1 citation statement)
References: 35 publications
“…Considering the speed of the model, we do not perform upsampling, deconvolution, or other related operations, but instead proceed from cognition, hoping to obtain a more accurate and efficient feature-extraction framework. Since memory and attention mechanisms are important parts of cognitive science [14], we draw on the basic ideas of attention and memory in the construction of the model, and propose a new Recurrent Attentive Neural Network, called RANN.…”
Section: B. Recurrent Attentive Neural Network
confidence: 99%
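The quoted passage describes combining recurrent attention with a memory update for feature extraction. As a minimal hedged sketch (the function names, weight shapes, and update rule below are illustrative assumptions, not the paper's actual RANN), one step of such a module could attend over spatial feature vectors and fold the attended context into a memory state:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def recurrent_attentive_step(features, memory, W_att, W_mem):
    """One hypothetical step: attend over N spatial positions, update memory.

    features: (N, D) feature vectors; memory: (D,) running state.
    """
    scores = features @ W_att @ memory            # (N,) attention logits
    alpha = softmax(scores)                       # attention weights, sum to 1
    context = alpha @ features                    # (D,) attended feature summary
    # Simple gated-style update mixing context and previous memory
    memory = np.tanh(W_mem @ np.concatenate([context, memory]))
    return memory, alpha

rng = np.random.default_rng(0)
D, N, T = 8, 16, 3                                # channels, positions, steps
feats = rng.standard_normal((N, D))
W_att = rng.standard_normal((D, D)) * 0.1
W_mem = rng.standard_normal((D, 2 * D)) * 0.1
mem = np.zeros(D)
for _ in range(T):                                # recurrence over T steps
    mem, alpha = recurrent_attentive_step(feats, mem, W_att, W_mem)
print(alpha.shape, mem.shape)  # → (16,) (8,)
```

Note that, consistent with the quoted statement, this sketch uses no upsampling or deconvolution; refinement comes only from repeated attention over the same feature set.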