2018
DOI: 10.1016/j.neunet.2018.08.018
LCD: A Fast Contrastive Divergence Based Algorithm for Restricted Boltzmann Machine

Cited by 14 publications (4 citation statements)
References 12 publications
“…SRBM: a sparse variant in which each hidden unit connects to only part of the visible units, preventing overfitting via hierarchical latent tree analysis. FRBM (Ning et al., 2018): a fast variant trained by the lean CD algorithm, in which bounds-based filtering and the delta product reduce redundant dot-product calculations. TTRBM (Ju et al., 2019): a compact variant in which the parameters between the visible and hidden layers are reduced by transformation into the tensor-train format.…”
Section: The Deep Belief Net (DBN)
confidence: 99%
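The "delta product" mentioned in the excerpt above exploits the fact that, between successive Gibbs steps, only a few units typically flip state, so the dot products feeding the next layer need not be recomputed from scratch. The paper's actual bounds-based filtering is not reproduced here; the following is only a hypothetical sketch of the delta-product idea, with all function and variable names invented for illustration:

```python
import numpy as np

def delta_product_update(act, W, v_old, v_new):
    """Correct cached activations act = v_old @ W so they equal
    v_new @ W, touching only the rows of W whose visible unit
    changed. This sketches the 'delta product' idea of skipping
    redundant dot-product work between Gibbs steps."""
    changed = np.nonzero(v_new != v_old)[0]
    for i in changed:
        # Each flipped visible unit contributes a single rank-1 correction.
        act += (v_new[i] - v_old[i]) * W[i]
    return act
```

When only a small fraction of units flip per step, this replaces a full matrix-vector product with a handful of vector additions, which is where the claimed speedup comes from.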
“…For instance, to avoid network overfitting, Chen, Zhang, Yeung, and Chen (2017) designed the sparse Boltzmann machine, which learns the network structure based on a hierarchical latent tree. Ning, Pittman, and Shen (2018) introduced fast contrastive-divergence algorithms for RBMs, where bounds-based filtering and the delta product reduce redundant dot-product calculations. To preserve the internal structure of multidimensional data, Ju et al. (2019) proposed the tensor RBM, which learns the high-level distribution hidden in multidimensional data and uses tensor decomposition to avoid the curse of dimensionality.…”
Section: Convolutional Neural Network
confidence: 99%
“…Then, the hidden layer output vector calculated from {b, c, W} can be fed into the next RBM. The Contrastive Divergence (CD) algorithm can be adopted to train the RBMs layer by layer quickly (Ning et al., 2018). The whole pretraining process can be regarded as unsupervised learning, which gives the DBN superior performance in feature extraction and automatic dimensionality reduction (Cao et al., 2022).…”
Section: P(x|H)
confidence: 99%
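The layer-by-layer CD training described above can be sketched for a single binary RBM. This is a minimal CD-1 sketch, not the paper's lean CD variant; the function name and hyperparameters are assumptions for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.1, rng=None):
    """One CD-1 parameter update for a binary RBM.

    v0: (n_vis,) binary visible vector
    W:  (n_vis, n_hid) weights; b: (n_vis,) visible bias; c: (n_hid,) hidden bias
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Positive phase: hidden probabilities and a sampled hidden state.
    h0_prob = sigmoid(v0 @ W + c)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # Negative phase: one Gibbs step back to the visibles, then hiddens again.
    v1_prob = sigmoid(h0 @ W.T + b)
    h1_prob = sigmoid(v1_prob @ W + c)
    # Update from the difference of data-driven and model-driven correlations.
    W += lr * (np.outer(v0, h0_prob) - np.outer(v1_prob, h1_prob))
    b += lr * (v0 - v1_prob)
    c += lr * (h0_prob - h1_prob)
    return W, b, c
```

For DBN pretraining, the hidden probabilities (or samples) produced by one trained RBM become the visible input of the next, repeating this update layer by layer.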
“…Fischer and Igel [21] discussed different RBM learning algorithms, such as CD and parallel tempering. A modified CD algorithm called 'Lean Contrastive Divergence' was proposed by Ning et al. [22] to speed up both learning and prediction. Pujahari et al. [23] proposed a CF model that takes the preference relationship as input to an RBM and combines it with the side information of the movies to generate a ranking of the items.…”
Section: Introduction
confidence: 99%