2011
DOI: 10.1145/2001269.2001295
Unsupervised learning of hierarchical representations with convolutional deep belief networks

Abstract: There has been much interest in unsupervised learning of hierarchical generative models such as deep belief networks (DBNs); however, scaling such models to full-sized, high-dimensional images remains a difficult problem. To address this problem, we present the convolutional deep belief network, a hierarchical generative model that scales to realistic image sizes. This model is translation-invariant and supports efficient bottom-up and top-down probabilistic inference. Key to our appro…

Cited by 677 publications (656 citation statements)
References 27 publications
“…The Convolutional DBN (Lee et al, 2009a) (with a probabilistic variant of MP, Sec. 5.11) combines ideas from CNNs and DBNs, and was successfully applied to audio classification (Lee et al, 2009b).…”
Section: First Official Competitions Won by RNNs and with MPCNNs
confidence: 99%
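The statement above refers to the probabilistic variant of max-pooling (MP) used in the convolutional DBN: detection units within a small block compete in a softmax that includes an "all off" state, so at most one unit per block fires and the pooling unit is on exactly when one does. A minimal NumPy sketch of that sampling step (function name, block size, and toy inputs are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_max_pool(activations, block=2):
    """Probabilistic max-pooling over non-overlapping block x block regions.

    Each region's detection units compete in a softmax that also includes
    an 'all off' state, so at most one detection unit per region fires and
    the pooling unit is on iff any detection unit in its region is on.
    """
    H, W = activations.shape
    detect = np.zeros_like(activations)
    pooled = np.zeros((H // block, W // block))
    for i in range(0, H, block):
        for j in range(0, W, block):
            patch = activations[i:i + block, j:j + block]
            m = patch.max()
            e = np.exp(patch - m).ravel()   # unnormalised on-state probabilities
            off = np.exp(-m)                # unnormalised 'all off' probability
            probs = np.append(e, off)
            probs /= probs.sum()
            k = rng.choice(probs.size, p=probs)  # sample one joint state
            if k < block * block:                # some detection unit fired
                detect[i + k // block, j + k % block] = 1.0
                pooled[i // block, j // block] = 1.0
    return detect, pooled

acts = rng.standard_normal((4, 4))
detect, pooled = prob_max_pool(acts)
```

The softmax competition is what keeps inference tractable: sampling a region is a single draw over `block * block + 1` mutually exclusive states rather than over all binary configurations.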
“…), future work will investigate the use of convolutional deep-belief networks (e.g., Lee et al 2011a) to boost recognition accuracy by learning the localized correlations in the observed spectra.…”
Section: Discussion
confidence: 99%
“…Figure 1 shows a schematic of the deep-belief network. The multi-layer DBN can be constructed from several Restricted Boltzmann Machines (Freund & Haussler 1992; Bishop 2006; Le Roux & Bengio 2008; Bengio 2009, 2012; Lee et al. 2011a; Hinton 2012; Montavon et al. 2012; Fischer & Igel 2014) with the addition of a logistic regression layer at the top of the network. The RBM is a two-layer neural network able to learn the underlying probability distribution over its set of input values.…”
Section: Robert
confidence: 99%
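The RBM described in the quoted passage is typically trained with contrastive divergence, and a DBN is built by stacking such RBMs greedily. A minimal NumPy sketch of a Bernoulli-Bernoulli RBM trained with one Gibbs step (CD-1); the hyperparameters and toy data are illustrative assumptions, not taken from the cited works:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM trained with one-step contrastive divergence."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0):
        # Positive phase: data-driven hidden statistics.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step from the sampled hiddens.
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # Approximate gradient of the log-likelihood.
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b += self.lr * (v0 - pv1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        return np.mean((v0 - pv1) ** 2)  # reconstruction error (monitoring only)

# Toy data: 64 samples drawn from 4 binary prototype patterns.
protos = rng.integers(0, 2, size=(4, 12)).astype(float)
data = protos[rng.integers(0, 4, size=64)]

rbm = RBM(n_visible=12, n_hidden=6)
errs = [rbm.cd1_step(data) for _ in range(200)]
```

Stacking works by freezing a trained RBM, treating its hidden probabilities as data for the next RBM, and finally adding a classifier layer on top, as the quoted passage describes.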
“…Focusing on these, there are various hierarchically organized models proposed, taking inspiration from different fields of science. There are nature inspired designs [3,4], there are models using grammars [5][6][7], there are very successful approaches using neural networks [8,9]. Learning strategies of such structures are similarly diverse, ranging from semi-automatic methods when the structure is given by human and only its parameters are learned [5] over sophisticated supervised/unsupervised methods of deep learning [10,9,8].…”
Section: Introduction
confidence: 99%
“…Noticeable difference from the deep learning is that unlike learning deep belief [10] or convolutional neural network [8] this method gives an explicit structure model similar to image grammars and requires less hyper-parameter/design choices -there is actually just one hyper-parameter that needs to be set and that is the maximal allowed portion of non-modelled data. Furthermore, its individual learned compositions can be used as features in more sophisticated classification framework such as SVM or AdaBoost.…”
Section: Introduction
confidence: 99%