2020
DOI: 10.1162/neco_a_01311

A Predictive-Coding Network That Is Both Discriminative and Generative

Abstract: Predictive coding (PC) networks are a biologically interesting class of neural networks. Their layered hierarchy mimics the reciprocal connectivity pattern observed in the mammalian cortex, and they can be trained using local learning rules that approximate backpropagation (Bogacz, 2017). However, despite having feedback connections that enable information to flow down the network hierarchy, discriminative PC networks are not typically generative. Clamping the output class and running the network to equilibrium…
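The relaxation-plus-local-update scheme the abstract alludes to can be sketched with a toy linear PC network in NumPy. This is a generic illustration, not the paper's architecture: the layer sizes, learning rates, and linear top-down predictions are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [4, 8, 3]  # data, hidden, and label layers (sizes are assumed)
# W[l] predicts layer l from layer l + 1 (top-down predictions)
W = [rng.normal(scale=0.1, size=(sizes[l], sizes[l + 1])) for l in range(2)]

def relax(x0, x2, steps=200, lr=0.1):
    """Clamp data (x0) and label (x2); relax the hidden layer to equilibrium."""
    x1 = np.zeros(sizes[1])
    for _ in range(steps):
        e0 = x0 - W[0] @ x1            # prediction error at the data layer
        e1 = x1 - W[1] @ x2            # prediction error at the hidden layer
        x1 += lr * (W[0].T @ e0 - e1)  # uses only signals local to layer 1
    return x1, e0, e1

x0 = rng.normal(size=sizes[0])         # toy input
x2 = np.eye(3)[1]                      # one-hot target class
x1, e0, e1 = relax(x0, x2)
residual = np.linalg.norm(W[0].T @ e0 - e1)  # ~0 once relaxation has settled

# Local, Hebbian-style weight updates: each connection only uses the
# pre-synaptic activity and post-synaptic error available at that layer.
eta = 0.01
W[0] += eta * np.outer(e0, x1)
W[1] += eta * np.outer(e1, x2)
```

The point of the sketch is the locality: both the hidden-state update and the weight updates depend only on quantities at adjacent layers, which is what allows PC to approximate backpropagation without a global backward pass.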

Cited by 6 publications (7 citation statements)
References 19 publications (43 reference statements)
“…PC is, however, a generative model, as suggested by its formulation as variational inference (K. Friston, 2005; Rao & Ballard, 1999). This implies that the PC models surveyed in the previous section can also be used for data generation from labels, as long as certain regularizations are applied due to the ill-posed nature of the inverse problem (Sun & Orchard, 2020). Additionally, PCNs can also be used directly as generative models due to their interpretation as probabilistic graphical models by swapping the "direction" of the network, so that the label is treated as the "input" and the data are treated as the "output".…”
Section: Classification
confidence: 99%
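The label-to-data direction described in this statement can be sketched with the same kind of toy linear PC network: clamp the label layer, leave the hidden and data layers free, and relax to equilibrium. Again this is a hypothetical minimal example (random untrained weights, assumed sizes), not the regularized scheme of Sun & Orchard (2020); in practice a trained model would be used.

```python
import numpy as np

rng = np.random.default_rng(1)
sizes = [4, 8, 3]  # data, hidden, and label layers (sizes are assumed)
W = [rng.normal(scale=0.1, size=(sizes[l], sizes[l + 1])) for l in range(2)]

def generate(label, steps=200, lr=0.1):
    """Clamp the label layer; relax the free layers to equilibrium."""
    x0 = np.zeros(sizes[0])        # data layer (free; read out at the end)
    x1 = np.zeros(sizes[1])        # hidden layer (free)
    for _ in range(steps):
        e0 = x0 - W[0] @ x1        # error at the data layer
        e1 = x1 - W[1] @ label     # error at the hidden layer
        x0 -= lr * e0              # data settles onto its top-down prediction
        x1 += lr * (W[0].T @ e0 - e1)
    return x0

label = np.eye(3)[0]               # one-hot class to generate from
sample = generate(label)
# At equilibrium the free layers satisfy their top-down predictions,
# so the sample approaches W[0] @ W[1] @ label.
```

In this linear toy the equilibrium is simply the composed top-down prediction, which illustrates why the inverse problem is ill-posed without extra constraints: many hidden configurations can explain the same label, and nothing in the plain dynamics selects among them.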
“…for implementing amortised inference (Kingma & Welling, 2013; Zhang, Bütepage, Kjellström & Mandt, 2018), ¶ While predictive coding is usually considered to be an unsupervised algorithm, it is straightforward to extend the scheme to a supervised setting (Bogacz, 2017; Millidge, Tschantz, Seth & Buckley, 2020c; Sun & Orchard, 2020; Whittington & Bogacz, 2017). This can be achieved by turning the predictive coding network on its head, so that the model tries to generate hidden states (e.g.…”
Section: Predictive Coding
confidence: 99%
“…However, predictive coding has two limitations: First, it only infers the most likely state of an environment from sensory inputs, rather than the whole posterior distribution, thereby ignoring any uncertainty information [21]. Second, it has demonstrated limited learning performance on generative tasks [26]. Recent work extended predictive coding to improve its learning performance using lateral inhibition and sparse priors [26, 27]; however, the resulting neural network is still unable to infer posterior distributions.…”
Section: Introduction
confidence: 99%
“…Second, it has demonstrated limited learning performance on generative tasks [26]. Recent work extended predictive coding to improve its learning performance using lateral inhibition and sparse priors [26, 27]; however, the resulting neural network is still unable to infer posterior distributions. In addition to predictive coding, other models have been proposed to describe learning of probabilistic models in the brain.…”
Section: Introduction
confidence: 99%