Neural Networks for Signal Processing X. Proceedings of the 2000 IEEE Signal Processing Society Workshop (Cat. No.00TH8501)
DOI: 10.1109/nnsp.2000.889436

An ensemble learning approach to independent component analysis

Cited by 20 publications (15 citation statements)
References 13 publications

“…In this section we analyse theoretically how the choice of the form of the posterior approximation q(S) of the sources affects the solution which optimises the cost function (4).…”
Section: Effect of Posterior Approximation: Theory
Citation type: mentioning (confidence: 99%)
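
The cost function (4) referred to above is not reproduced on this page. For orientation, the ensemble-learning cost in this line of work is typically the variational free energy (notation introduced here: X the observations, S the sources, \theta the remaining parameters, q the approximating density):

C(q) = \int q(S, \theta) \log \frac{q(S, \theta)}{p(X, S, \theta)} \, dS \, d\theta
     = D_{\mathrm{KL}}\big( q(S, \theta) \,\|\, p(S, \theta \mid X) \big) - \log p(X)

Since \log p(X) does not depend on q, minimising C over the restricted family is equivalent to minimising the KL divergence from the true posterior, which is why the form chosen for q(S) shapes the solution.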
“…Recently several methods for variational Bayesian learning of linear ICA models and their extensions have been reported in the literature [1,2,3,4,5,6,7,8]. The basic idea in these approaches is to approximate the true posterior probability density of the unknown variables by a function which has a restricted form.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
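
The "restricted form" mentioned in this quote is usually a fully factorised (mean-field) approximation. A generic sketch, with factors q_i introduced here for illustration rather than taken from the cited papers, is

q(S, \theta) = q(S) \prod_i q(\theta_i), \qquad q(S) = \prod_t \prod_k q\big(s_k(t)\big)

i.e. the sources and parameters are assumed a posteriori independent, which makes the expectations required by the variational cost tractable.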
“…The prior over the precisions is a Gamma distribution (9), the prior over the means is a univariate normal (10), and the prior over the weights is a zero-mean Gaussian with an isotropic covariance having precision (11), where…”
Section: A. Model Priors
Citation type: mentioning (confidence: 99%)
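
The displayed equations (9)-(11) of the citing paper are not reproduced here. Purely to illustrate the prior structure being described (a Gamma prior on precisions, a univariate normal prior on means, a zero-mean isotropic Gaussian prior on weights), the following minimal Python sketch uses made-up hyperparameter values; none of the names or numbers come from the cited work.

import numpy as np
from scipy import stats

# Hypothetical hyperparameters -- illustrative only, not the cited paper's values.
shape_b, rate_c = 1e-3, 1e-3   # Gamma shape/rate for a precision
mean_m, prec_tau = 0.0, 1e-3   # mean and precision of the normal prior on a mean
prec_alpha = 1e-2              # precision of the isotropic Gaussian prior on the weights
n_weights = 4

precision_prior = stats.gamma(a=shape_b, scale=1.0 / rate_c)   # Gamma prior, cf. (9) (form assumed)
mean_prior = stats.norm(loc=mean_m, scale=prec_tau ** -0.5)    # univariate normal, cf. (10) (form assumed)
weight_prior = stats.multivariate_normal(                      # zero-mean isotropic Gaussian, cf. (11) (form assumed)
    mean=np.zeros(n_weights), cov=np.eye(n_weights) / prec_alpha)

# Draw one sample from each prior to confirm the setup runs.
print(precision_prior.rvs(), mean_prior.rvs(), weight_prior.rvs())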
“…Although the Bayesian methodology has a long history, the use of VB is relatively new; the key idea of VB is to find a tractable approximation to the true posterior density that minimizes the Kullback-Leibler (KL) divergence [8]. Notable recent applications are to principal component analysis [9] and independent component analysis [10]. We have also published short conference papers summarizing some key results for standard autoregressive models [11] and non-Gaussian AR models [12].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
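
To make the KL criterion mentioned in this quote concrete, here is a small self-contained check of the closed-form KL divergence between two univariate Gaussians; it is an illustration of the quantity VB minimises, not code from any of the cited papers.

import numpy as np

def kl_gauss(mu_q, var_q, mu_p, var_p):
    # Closed-form KL(q || p) for univariate Gaussians q = N(mu_q, var_q), p = N(mu_p, var_p).
    return 0.5 * (np.log(var_p / var_q) + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

# Zero only when q matches p, growing as q drifts away: this is the gap that
# variational Bayes drives down over its restricted family of approximations.
print(kl_gauss(0.0, 1.0, 0.0, 1.0))   # 0.0
print(kl_gauss(0.5, 1.0, 0.0, 1.0))   # 0.125
print(kl_gauss(0.0, 2.0, 0.0, 1.0))   # ~0.153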
“…It has been applied to ICA and a wide variety of other models (see e.g. Hinton and van Camp, 1993; Barber and Bishop, 1998; Attias, 1999; Miskin and MacKay, 2000; Ghahramani and Hinton, 2000; Choudrey et al., 2000; Chan et al., 2001; Valpola and Karhunen, 2002). An example of applying a variational technique other than ensemble learning to linear ICA has been given by Girolami (2001).…”
Section: Variational Bayesian Learning
Citation type: mentioning (confidence: 99%)