2019
DOI: 10.1162/neco_a_01228
A Novel Predictive-Coding-Inspired Variational RNN Model for Online Prediction and Recognition

Abstract: This study introduces PV-RNN, a novel variational RNN inspired by predictive-coding ideas. The model learns to extract the probabilistic structures hidden in fluctuating temporal patterns by dynamically changing the stochasticity of its latent states. Its architecture attempts to address two major concerns of variational Bayes RNNs: how latent variables can learn meaningful representations, and how the inference model can transfer future observations to the latent variables. PV-RNN does both by introducing a…
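For orientation, the sketch below illustrates the general flavor of a variational RNN generative step in the spirit described in the abstract: a stochastic latent variable sampled from a prior conditioned on the previous deterministic state, combined with MTRNN-style leaky integration. All layer sizes, weight names, and nonlinearities here are illustrative assumptions and do not reproduce the paper's exact equations.

```python
# Minimal sketch of one generative step of a PV-RNN-style variational RNN.
# Weight names, sizes, and nonlinearities are illustrative assumptions,
# not the equations from the paper.
import numpy as np

rng = np.random.default_rng(0)

D, Z, X = 40, 4, 2          # deterministic units, stochastic units (10:1 ratio), output dim
tau = 2.0                   # MTRNN-style time constant for this layer

# Randomly initialized illustrative weights.
W_dd = rng.normal(0, 0.1, (D, D))   # recurrent weights on the deterministic state
W_zd = rng.normal(0, 0.1, (D, Z))   # stochastic -> deterministic
W_mu = rng.normal(0, 0.1, (Z, D))   # prior mean from previous deterministic state
W_sig = rng.normal(0, 0.1, (Z, D))  # prior (log) std from previous deterministic state
W_out = rng.normal(0, 0.1, (X, D))  # readout

def step(h_prev):
    """One time step: sample z_t from a prior conditioned on d_{t-1},
    then leaky-integrate the deterministic state d_t and emit a prediction."""
    d_prev = np.tanh(h_prev)
    mu = W_mu @ d_prev                       # prior mean of z_t
    sigma = np.exp(W_sig @ d_prev)           # prior std of z_t (kept positive)
    z = mu + sigma * rng.standard_normal(Z)  # reparameterized sample
    # Leaky integration with time constant tau (MTRNN-style update).
    h = (1 - 1 / tau) * h_prev + (1 / tau) * (W_dd @ d_prev + W_zd @ z)
    x_pred = W_out @ np.tanh(h)              # predicted observation
    return h, x_pred

h = np.zeros(D)
for t in range(5):
    h, x_pred = step(h)
    print(t, np.round(x_pred, 3))
```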

Cited by 66 publications (88 citation statements)
References 52 publications
“…Neurons refer to the number of deterministic variables, while Z-units refer to the number of stochastic variables. These are kept in a 10:1 ratio, as in Reference [25]. τ is the MTRNN time constant, with shorter time constants used in the lower layers, which should be more responsive, and longer time constants in the higher layers.…”
Section: Methods
confidence: 99%
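The citation above describes a layered MTRNN-style configuration with a fixed 10:1 ratio of deterministic units ("neurons") to stochastic units ("Z-units") and time constants that grow with layer depth. The sketch below is a hypothetical configuration illustrating that scheme; the specific counts and time constants are assumed values, not the cited paper's setup.

```python
# Illustrative layer configuration: 10:1 neurons-to-Z-units ratio, with
# shorter time constants (faster dynamics) in lower layers and longer
# time constants (slower dynamics) in higher layers.
layers = [
    # (name,   neurons, z_units, time constant tau)
    ("low",    80,      8,       2),    # fast, responsive layer
    ("middle", 40,      4,       8),
    ("high",   20,      2,       32),   # slow, abstract layer
]

for name, neurons, z_units, tau in layers:
    assert neurons == 10 * z_units, "keep the 10:1 deterministic-to-stochastic ratio"
    print(f"{name:6s}: {neurons} neurons, {z_units} Z-units, tau={tau}")
```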
“…For the purpose of examining such properties of the trained networks, we conduct a test for target regeneration of the trained trajectories in a manner similar to that originally used in Reference [25]. In this test, we attempt to regenerate a particular target sequence from the training dataset by using the information of the latent state in the initial step.…”
Section: Methods
confidence: 99%
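The target-regeneration test described above amounts to fixing only the latent state inferred for the initial step of a training sequence and then running the trained network closed-loop. Below is a hedged sketch of that procedure; the model interface (model.reset, model.step, model.initial_latent) is hypothetical and stands in for whatever trained network is used.

```python
# Sketch of a target-regeneration test: regenerate a training sequence from
# the latent state at the initial step, feeding predictions back closed-loop.
import numpy as np

def regenerate(model, z_init, length):
    """Run the trained model forward from the inferred initial latent state,
    feeding its own predictions back, and return the generated trajectory."""
    h = model.reset()
    x_pred = None
    trajectory = []
    for t in range(length):
        # Only the first step uses the inferred latent; later steps draw from the prior.
        z = z_init if t == 0 else None
        h, x_pred = model.step(h, z=z, x_prev=x_pred)
        trajectory.append(x_pred)
    return np.stack(trajectory)

# Usage (assuming a trained `model` and the latent state inferred for one
# target sequence from the training dataset):
# regenerated = regenerate(model, z_init=model.initial_latent["target_03"], length=200)
# error = np.mean((regenerated - target_sequence) ** 2)
```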