2022
DOI: 10.48550/arxiv.2201.08813
Preprint
Active Predictive Coding Networks: A Neural Solution to the Problem of Learning Reference Frames and Part-Whole Hierarchies

Abstract: We introduce Active Predictive Coding Networks (APCNs), a new class of neural networks that solve a major problem posed by Hinton and others in the fields of artificial intelligence and brain modeling: how can neural networks learn intrinsic reference frames for objects and parse visual scenes into part-whole hierarchies by dynamically allocating nodes in a parse tree? APCNs address this problem by using a novel combination of ideas: (1) hypernetworks are used for dynamically generating recurrent neural networ…

Cited by 4 publications (7 citation statements)
References: 0 publications
“…The DPC model can be augmented to incorporate actions, e.g., by augmenting the higher-level state with a higher-level action input to predict the dynamics of lower-level sensory consequences of the action. Such a model also opens the door to using probabilistic inference for hierarchical planning [18,51-53], where actions are selected to minimize the sensory prediction errors with respect to preferred goal states (see also [54]). Such a model could also potentially be used to understand the neural basis of navigation and planning [10,55-57] as an emergent property of prediction error minimization.…”
Section: Discussion
confidence: 99%
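The action-selection scheme described in the statement above (pick the action whose predicted sensory consequence minimizes the error to a preferred goal state) can be sketched under simplifying assumptions. The linear dynamics `A`, `B` and the discrete candidate-action set below are toy illustrations, not the DPC/APCN implementation:

```python
import numpy as np

def predict_next_state(A, B, s, a):
    """One-step prediction under assumed linear dynamics s' = A s + B a."""
    return A @ s + B @ a

def select_action(A, B, s, goal, candidate_actions):
    """Return the candidate action minimizing squared predicted error to the goal."""
    errors = [np.sum((predict_next_state(A, B, s, a) - goal) ** 2)
              for a in candidate_actions]
    return candidate_actions[int(np.argmin(errors))]

A = np.eye(2)                      # identity dynamics: state persists
B = np.eye(2)                      # actions add directly to the state
s = np.array([0.0, 0.0])           # current state estimate
goal = np.array([1.0, 0.0])        # preferred goal state
candidates = [np.array([1.0, 0.0]),
              np.array([0.0, 1.0]),
              np.array([-1.0, 0.0])]
best = select_action(A, B, s, goal, candidates)
print(best)  # the action that steps toward the goal
```

With real dynamics models this argmin over a candidate set would be replaced by gradient-based or probabilistic inference over action sequences, as the cited planning literature describes.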
“…The lower-level state neurons (at level i) maintain an estimate of the lower-level state s^i_{t,τ}, where τ denotes a time step at the lower level within the higher-level time interval given by t. In the two-level network, this lower-level state makes a prediction of the input via a "decoder" network D. In the simplest case where D is a linear matrix U, this lowest level of APC is equivalent to the generative model used in sparse coding (I = Us, where s is sparse [49]). More generally, D can be a one-layer ReLU network [48] or a multi-layer decoder [11,12].…”
Section: Modulation Of State Network By Feedback To Model Complex Env...
confidence: 99%
“…In the simplest case where D is a linear matrix U, this lowest level of APC is equivalent to the generative model used in sparse coding (I = Us, where s is sparse [49]). More generally, D can be a one-layer ReLU network [48] or a multi-layer decoder [11,12].…”
Section: Hierarchical Active Predictive Coding
confidence: 99%
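The sparse-coding generative model I = Us named in the statements above can be sketched in a few lines; the dictionary `U`, sparse code `s`, and the one-layer ReLU decoder variant below are toy illustrations, not the paper's learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.standard_normal((8, 4))       # dictionary (linear decoder matrix)
s = np.array([0.0, 2.0, 0.0, -1.0])   # sparse code: most entries are zero
I = U @ s                              # linear generative model: reconstructed input

def decode_relu(W, s):
    """One-layer ReLU decoder, the generalization of the linear case."""
    return np.maximum(0.0, W @ s)

sparsity = np.mean(s == 0)             # fraction of zero coefficients
print(I.shape, sparsity)
```

Learning U and inferring s in real sparse coding involves an optimization (e.g., minimizing reconstruction error plus a sparsity penalty); only the generative/decoding direction is shown here.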