2013
DOI: 10.3389/fncir.2013.00037

Learning and exploration in action-perception loops

Abstract: Discovering the structure underlying observed data is a recurring problem in machine learning with important applications in neuroscience. It is also a primary function of the brain. When data can be actively collected in the context of a closed action-perception loop, behavior becomes a critical determinant of learning efficiency. Psychologists studying exploration and curiosity in humans and animals have long argued that learning itself is a primary motivator of behavior. However, the theoretical basis of le…

Cited by 72 publications (72 citation statements)
References 55 publications
“…We assume informational organisms will seek to recalibrate by seeking clean, unambiguous signals in natural environments. That is, they will find such signals attractive and palatable in some informational sense like "interestingness" (Schmidhuber, 2009) or "curiosity" (Little & Sommer, 2013). We map this appetite for clean sensory data to the concept of attractiveness or "palatability" (Lutter & Nestler, 2009; Na, Morris, Johnson, Beltz, & Johnson, 2006).…”
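The "appetite for clean, unambiguous signals" described in this snippet can be sketched as an agent that ranks candidate signal sources by Shannon entropy and prefers the lowest-entropy (least ambiguous) one. This is only a toy illustration of the idea, not a method from either cited paper; the source names and probability values below are hypothetical.

```python
# Toy sketch: an agent drawn to "clean" signals, modeled as preferring
# the observation source with the lowest Shannon entropy.
# All source names and distributions are hypothetical examples.
import math

def entropy(probs):
    """Shannon entropy in bits; lower = cleaner, less ambiguous signal."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distributions over four possible percepts per source.
sources = {
    "clean":     [0.97, 0.01, 0.01, 0.01],   # nearly unambiguous
    "noisy":     [0.25, 0.25, 0.25, 0.25],   # maximally ambiguous
    "ambiguous": [0.50, 0.50, 0.00, 0.00],   # two equally likely percepts
}

# The agent "finds attractive" the source whose output it can read off
# with the least residual uncertainty.
preferred = min(sources, key=lambda name: entropy(sources[name]))
print(preferred)  # → clean
```

Under this reading, "palatability" of a signal is simply inversely related to the uncertainty it leaves about the underlying percept.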
“…Through cognitive learning, we are able to improve the perception (e.g., quantification, categorization, spatial differentiation) of tactile information and better interact with our surroundings. [38–40] Such an ability is also expected to be an important feature in humanoid applications, as it will enable them to adapt to changes to their surroundings and tasks. [41,42] CNN has connectivity patterns between neurons that resemble the organization of the sensory cortex. Analogously to synaptic strengths and neural receptive fields of the sensory cortex, a CNN consists of i) linear filtering operations by a set of weights, [36] ii) a pointwise nonlinear operation, called activation, and iii) pooling, a nonlinear …”
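The three CNN building blocks this snippet names (linear filtering by weights, a pointwise nonlinearity, and pooling) can be shown in a few lines. This is a minimal 1-D, pure-Python sketch for illustration only; the example signal and kernel are invented, not taken from the citing paper.

```python
# Minimal sketch of the three CNN building blocks named in the snippet:
# i) linear filtering by a set of weights (convolution),
# ii) a pointwise nonlinear activation (ReLU),
# iii) pooling. 1-D and pure Python, for illustration only.

def conv1d(signal, weights):
    """Valid-mode linear filtering: slide the weight kernel over the signal."""
    k = len(weights)
    return [sum(w * signal[i + j] for j, w in enumerate(weights))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    """Pointwise nonlinear activation: clamp negative values to zero."""
    return [max(0.0, x) for x in xs]

def max_pool(xs, size=2):
    """Non-overlapping max pooling: keep the largest value per window."""
    return [max(xs[i:i + size]) for i in range(0, len(xs) - size + 1, size)]

signal = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0, -2.0, -1.0]   # hypothetical input
kernel = [1.0, 0.0, -1.0]                               # simple edge filter
features = max_pool(relu(conv1d(signal, kernel)))
print(features)  # → [0.0, 2.0, 2.0]
```

The analogy in the snippet maps the kernel weights to synaptic strengths and the sliding window to a neuron's local receptive field.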
DOI: 10.1103/PhysRevE.95.051301
“…Often, we wish to find a minimal maximally predictive model consistent with available data. Perhaps we are designing interactive agents that reap greater rewards by developing a predictive model of their environment [1–6] or, perhaps, we wish to build a predictive model of experimental data because we believe that the resultant model gives insight into the underlying mechanisms of the system [7,8]. Either way, we are almost always faced with constraints that force us to efficiently compress our data [9].…”
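The idea of a "minimal maximally predictive model" can be made concrete with a toy example: for a periodic binary sequence, a model that remembers only the previous symbol predicts perfectly, while a memoryless model cannot. This sketch is an illustration of the concept only, not the construction used in the cited paper; the data and model names are invented.

```python
# Toy sketch: a minimal predictive model for a periodic binary sequence.
# A one-symbol memory suffices to predict perfectly; a memoryless model
# (always guess the most frequent symbol) does much worse.
from collections import Counter, defaultdict

data = "01" * 50  # hypothetical observed sequence: 0,1,0,1,...

# Memoryless model: always predict the overall most frequent symbol.
most_common = Counter(data).most_common(1)[0][0]

# One-step predictive model: remember the previous symbol and predict
# whichever symbol most often followed it in the data.
follow = defaultdict(Counter)
for prev, nxt in zip(data, data[1:]):
    follow[prev][nxt] += 1
predict = {s: c.most_common(1)[0][0] for s, c in follow.items()}

def accuracy(pred_fn):
    """Fraction of next symbols predicted correctly over the sequence."""
    pairs = list(zip(data, data[1:]))
    return sum(pred_fn(p) == n for p, n in pairs) / len(pairs)

acc_memoryless = accuracy(lambda p: most_common)
acc_markov = accuracy(lambda p: predict[p])
print(acc_memoryless, acc_markov)
```

The one-symbol-memory model is "minimal" in that adding more memory states gains nothing, yet "maximally predictive" in that it achieves perfect next-symbol accuracy on this sequence; that trade-off between model size and predictive power is the compression the snippet refers to.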