2022
DOI: 10.48550/arxiv.2201.07372
Preprint

Prospective Learning: Back to the Future

Abstract: Research on both natural intelligence (NI) and artificial intelligence (AI) generally assumes that the future resembles the past: intelligent agents or systems (what we call 'intelligence') observe and act on the world, then use this experience to act on future experiences of the same kind. We call this 'retrospective learning'. For example, an intelligence may see a set of pictures of objects, along with their names, and learn to name them. A retrospective learning intelligence would merely be able to name mo…

Cited by 5 publications (3 citation statements)
References 95 publications (112 reference statements)
“…This characteristic could constrain their application to situations in which the environment is stable and the cost of sampling is low. Since adult exploration is sensitive to environmental complexity, a forward-looking metric like EIG might be particularly suitable to predict behaviors in a more dynamic learning context (Dubey & Griffiths, 2020; Vogelstein et al., 2022).…”
Section: Discussion (mentioning)
confidence: 99%
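
The statement above appeals to expected information gain (EIG) as a forward-looking metric. As a minimal sketch only, assuming a Beta belief over a single Bernoulli outcome (the function name and priors are illustrative, not code from Dubey & Griffiths, 2020 or Vogelstein et al., 2022), EIG can be computed as the expected drop in posterior entropy from one more observation:

# Minimal sketch (illustrative, not from the cited works): expected information
# gain (EIG) of one more Bernoulli observation under a Beta(a, b) belief.
from scipy.stats import beta

def expected_information_gain(a: float, b: float) -> float:
    """Expected drop in differential entropy of a Beta(a, b) belief
    after observing one more Bernoulli outcome."""
    p_success = a / (a + b)                                   # posterior predictive P(outcome = 1)
    h_now = beta(a, b).entropy()                              # current uncertainty
    h_after = (p_success * beta(a + 1, b).entropy()           # belief after observing a 1
               + (1 - p_success) * beta(a, b + 1).entropy())  # belief after observing a 0
    return float(h_now - h_after)

# A forward-looking, EIG-driven agent prefers queries where it is most uncertain:
print(expected_information_gain(1, 1))    # broad belief    -> larger expected gain
print(expected_information_gain(50, 50))  # confident belief -> smaller expected gain

Under this sketch, EIG is highest where the agent's belief is least certain, which is why it is described above as suited to dynamic learning contexts where future observations matter.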
“…In cases where this context is imagined, the agent would be able to simulate possible policies, each of which responds according to the imagined situation, rather than the current one. This ability to take future goals into account is in line with the "prospective" learning seen in natural agents, as opposed to the largely "retrospective" learning implemented in current AI systems (Vogelstein et al., 2022).…”
Section: Implementing Access Consciousness In Artificial Agents (mentioning)
confidence: 81%
“…Our focus here is the extent to which naive models can solve this sub-problem, extracting simple rules from sequences without prior training. This is an unusual setting for deep learning models, in which pretraining is considered crucial even in the context of few-shot learning (Chollet, 2019; Vogelstein et al., 2022).…”
Section: Introduction (mentioning)
confidence: 99%