2014
DOI: 10.5963/ijcsai0401001

Towards a General Attention Mechanism for Embedded Intelligent Systems

Abstract: In the domain of intelligent systems the management of system resources is typically called "attention". Attention mechanisms exist because even environments of moderate complexity are a source of vastly more information than available cognitive resources of any known intelligence can handle. Cognitive resource management has not been of much concern in artificial intelligence (AI) work that builds relatively simple systems for particular targeted problems. For systems capable of a wide range of actio…

Cited by 3 publications (3 citation statements)
References 14 publications (13 reference statements)
“…In the Auto-catalytic Endogenous Reflective Architecture (AERA) attention is implemented as system-permeating control of computational/cognitive resources at very fine-grain levels of processing, bounded by goals at one end and the current situation at the other (cf. Helgason et al, 2013 ; Nivel et al, 2015 ). Studies on multitasking in humans have shown that a degree of parallelism among multiple tasks is more likely if the tasks involve different data modalities, such as linguistic and tactile.…”
Section: Related Work and Attention's Role in Cognition (mentioning)
Confidence: 99%
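
The statement above describes AERA's attention as fine-grained, system-permeating control of cognitive resources bounded by goals on one end and the current situation on the other. As a rough illustration of that idea only, the following Python sketch scores pending processes by goal relevance and situational salience and spends a fixed processing budget on the highest-priority ones; the names and scoring rule are hypothetical and are not AERA's actual mechanism (cf. Nivel et al., 2015).

# Illustrative sketch only -- not AERA's implementation. It shows attention as
# control of a fixed processing budget, bounded by goals and the current situation.
# All names (Process, goal_relevance, situational_salience) are hypothetical.
from dataclasses import dataclass

@dataclass
class Process:
    name: str
    goal_relevance: float        # how much the process serves current goals (0..1)
    situational_salience: float  # how strongly the current situation demands it (0..1)
    cost: float                  # processing units it would consume

def allocate(processes, budget):
    """Greedily spend a fixed cognitive budget on the highest-priority processes."""
    ranked = sorted(processes,
                    key=lambda p: p.goal_relevance * p.situational_salience,
                    reverse=True)
    scheduled = []
    for p in ranked:
        if p.cost <= budget:
            scheduled.append(p.name)
            budget -= p.cost
    return scheduled

if __name__ == "__main__":
    pending = [
        Process("track_obstacle", goal_relevance=0.9, situational_salience=0.8, cost=3.0),
        Process("parse_speech",   goal_relevance=0.6, situational_salience=0.4, cost=2.0),
        Process("replan_route",   goal_relevance=0.7, situational_salience=0.2, cost=4.0),
    ]
    print(allocate(pending, budget=5.0))  # -> ['track_obstacle', 'parse_speech']
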
“…Human attention is a built-in mechanism for deciding how to apply brainpower from moment to moment (e.g., deciding where to look in salient visual object detection [27]). The attention mechanism is a reasonably well-studied subject within the field of cognitive psychology and is known to be a key feature of human intelligence [28]. Nowadays, attention-based deep learning methods are active especially in dealing with problems concerning sequence prediction or control, including object detection, natural language processing, and deep reinforcement learning.…”
Section: Attention Mechanism Applied to Deep Learning (mentioning)
Confidence: 99%
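
The quoted passage refers to attention-based deep learning methods in general. The snippet below is a minimal NumPy sketch of the standard scaled dot-product attention that such methods typically build on; it is illustrative only and not taken from the cited papers.

# Standard scaled dot-product attention, shown only to make concrete what
# "attention-based deep learning methods" refers to in the quoted statement.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v). Returns (n_q, d_v)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)   # (2, 4)
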
“…3) Model Spaces: The necessity to maintain high-dimensional representations of the state space X poses a major challenge for current approaches to learning [51], [52], [22] and general problem solving [53], [54]. The method closest to ours in its formalism seems to be that of [29] and even lends itself to learning by a connectionist network [55], but it still requires an exponentially large representation for planning purposes.…”
Section: Overview and Related Literature (mentioning)
Confidence: 99%
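
The quoted passage notes that maintaining high-dimensional state representations, and in particular exponentially large representations for planning, is a major obstacle. A back-of-the-envelope illustration (not from the cited work): a flat, tabular representation over a state space X with n binary variables needs 2^n entries.

# Illustration only (not from the cited work): tabular representations of a
# state space X over n binary features grow exponentially with n.
for n in (10, 20, 40):
    print(f"{n} binary features -> {2**n:,} states in a flat representation of X")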