2020
DOI: 10.3389/fncom.2020.00029
Attention in Psychology, Neuroscience, and Machine Learning

Abstract: Attention is the important ability to flexibly control limited computational resources. It has been studied in conjunction with many other topics in neuroscience and psychology including awareness, vigilance, saliency, executive control, and learning. It has also recently been applied in several domains in machine learning. The relationship between the study of biological attention and its use as a tool to enhance artificial neural networks is not always clear. This review starts by providing an overview of ho…

Cited by 182 publications (106 citation statements)
References 142 publications
“…In that regard, we point out that "selective attention" is often understood by researchers studying visual search as a source of performance limitations, while "attention" is usually understood by researchers in deep learning as some sort of learned weighting that results in performance improvements. Interdisciplinary studies reconciling these apparently conflicting definitions of attention might be fruitful (Lindsay, 2020;Marblestone et al, 2016), by using tasks like those we have presented here. In contrast to visual search, researchers studying object recognition assert that it occurs through an untangling mechanism (DiCarlo et al, 2012;DiCarlo & Cox, 2007).…”
Section: Discussion
confidence: 98%
“…These have also been described as models of visual selective attention (Eckstein, 1998; Geisler and Cormack, 2011; Wolfe and Horowitz, 2017; Peelen and Kastner, 2014). However, we defer usage of the term “attention” until the discussion, to place emphasis on which specific computations impose limits on search behavior (Hommel et al, 2019), and to avoid overloading the term, since “attention” has a different meaning when applied to the artificial neural network models that we test here (Lindsay, 2020; Hommel et al, 2019). The dominant models explaining limitations are built on results obtained by measuring search behavior with highly simplified stimuli like those shown in the upper left of Fig.…”
Section: Introduction
confidence: 99%
“…To address this issue, we resort to the attention mechanism and propose the A-kNN algorithm for the underwater AMC task. As a cognitive process of selectively concentrating on a few features while ignoring others, the attention mechanism can help ML models assign different weights to each part of the input, extract more critical and important information, and make more accurate judgments without incurring more costs to model computation and storage [30,31].…”
Section: Classification Algorithm for ML-based AMC
confidence: 99%
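The last excerpt describes machine-learning attention as a learned weighting that lets a model emphasize the most relevant parts of its input at little extra cost. A minimal sketch of that idea, using scaled dot-product attention, is below; the function name and the toy numbers are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention: score each key against the query,
    softmax the scores into weights that sum to 1, and return the
    weighted combination of the values."""
    d = keys.shape[-1]
    scores = keys @ query / np.sqrt(d)       # similarity of the query to each key
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()                 # weights now sum to 1
    return weights @ values, weights

# Toy input: three items; the second key matches the query best,
# so its value should receive the largest weight.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
values = np.array([[1.0], [2.0], [3.0]])
query = np.array([0.0, 1.0])

out, w = attention(query, keys, values)
```

Because the weights are computed from the input itself, the same mechanism can "concentrate on a few features while ignoring others" for every new input, without adding parameters beyond the projections that produce the queries, keys, and values in a full model.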