2021
DOI: 10.14293/s2199-1006.1.sor-.pps25dj.v2
Preprint

The Shallow Gibbs Network, Double Backpropagation and Differential Machine Learning

Abstract: We have built a Shallow Gibbs Network model, as a Random Gibbs Network Forest, that reaches the performance of a multilayer feedforward neural network with fewer parameters and fewer backpropagation iterations. To achieve this, we propose a novel optimization framework for our Bayesian Shallow Network, called the Double Backpropagation Scheme (DBS), which can also fit the data perfectly with an appropriate learning rate, and which is convergent and universally applicable to any Bayesian neural network pr…
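The abstract is truncated here, so the actual DBS update rule is not visible on this page. As a purely illustrative sketch, assuming (hypothetically) that "double backpropagation" means two successive backpropagation passes per training iteration on a one-hidden-layer ("shallow") network, the idea might look like the following; this is a stand-in, not the authors' method:

```python
import numpy as np

# Illustrative only: a one-hidden-layer network trained with two
# successive backpropagation steps per iteration. The truncated
# abstract does not specify the real DBS update; this is a
# hypothetical stand-in for the general shape of such a scheme.

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) plus noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

# Shallow network: one tanh hidden layer of width H.
H = 16
W1 = rng.normal(scale=0.5, size=(1, H))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1))
b2 = np.zeros(1)

def backprop_step(lr):
    """One full-batch backpropagation update; returns the MSE loss."""
    global W1, b1, W2, b2
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                      # dLoss/dpred (up to a constant)
    n = len(X)
    gW2 = h.T @ err / n                 # output-layer gradients
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)      # backprop through tanh
    gW1 = X.T @ dh / n                  # hidden-layer gradients
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
    return float((err**2).mean())

loss0 = backprop_step(0.0)  # lr = 0: measures loss without updating
for _ in range(500):
    # Hypothetical "double" scheme: two backprop passes per iteration.
    backprop_step(0.1)
    backprop_step(0.1)
final = backprop_step(0.0)
print(f"initial MSE {loss0:.3f} -> final MSE {final:.3f}")
```

The two calls per loop iteration are where a real DBS would presumably differ, e.g. by using distinct objectives or learning rates in the two passes.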

Cited by 0 publications.
References: 85 publications (100 reference statements).