2018
DOI: 10.1007/s11222-018-9836-0
Laplace approximation and natural gradient for Gaussian process regression with heteroscedastic Student-t model

Abstract: We propose the Laplace method to derive approximate inference for Gaussian process (GP) regression in the location and scale parameters of the Student-t probabilistic model. This allows both the mean and the variance of the data to vary as functions of the covariates, with the attractive feature that the Student-t model has been widely used as a tool for robustifying data analysis. The challenge in approximate inference for this model lies in the analytical intractability of the posterior distribution and the lack …

Cited by 16 publications (6 citation statements)
References 50 publications
“…So it can be used to extract the edge structure of an image in image processing. However, the Laplace operator is sensitive to noise, and a Gaussian function can reduce the effect of noise by low-pass filtering the image [15]. The image is first filtered by a Gaussian low-pass filter to reduce the influence of noise, and the edges are then extracted with the Laplace operator; the combined filter is called the Laplacian of Gaussian (LoG) operator.…”
Section: Laplacian of Gaussian Operator
confidence: 99%
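The smooth-then-differentiate pipeline described in that statement can be sketched in NumPy. The separable Gaussian filter and the roll-based discrete Laplacian below are illustrative choices for the demo, not the cited paper's implementation:

```python
import numpy as np

def gaussian_1d(sigma, radius):
    """Normalized 1-D Gaussian kernel."""
    x = np.arange(-radius, radius + 1)
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

def log_filter(img, sigma=1.5):
    """Laplacian of Gaussian: Gaussian low-pass filter first,
    then the 5-point discrete Laplace operator
    (periodic boundaries via np.roll, for brevity)."""
    g = gaussian_1d(sigma, radius=int(3 * sigma))
    # Separable Gaussian smoothing: filter rows, then columns.
    s = np.apply_along_axis(lambda r: np.convolve(r, g, mode="same"), 1, img)
    s = np.apply_along_axis(lambda c: np.convolve(c, g, mode="same"), 0, s)
    # Discrete Laplacian: sum of 4 neighbors minus 4x center.
    return (np.roll(s, 1, 0) + np.roll(s, -1, 0)
            + np.roll(s, 1, 1) + np.roll(s, -1, 1) - 4 * s)

# A vertical step edge: the LoG response is large near the edge
# and exactly zero in flat regions far from it.
img = np.zeros((32, 32))
img[:, 16:] = 1.0
resp = log_filter(img)
```

Edges are then located at the zero-crossings of the response, which here straddle column 16.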
“…For q(f_m) ∼ N(f_m | µ_m, Σ_m), the updates of µ_m^(t+1) and Σ_m^(t+1) follow the foregoing steps, with the derivatives ∂F/∂µ_m and ∂F/∂Σ_m taking (22).…”
Section: Appendix A: Non-negativity of λ_nn
confidence: 99%
“…Note that, unlike the homoscedastic GP, inference in the HGP is challenging since the model evidence (marginal likelihood) p(y) and the posterior are intractable. To this end, various approximate inference methods have been used, e.g., Markov chain Monte Carlo (MCMC) [14], maximum a posteriori (MAP) estimation [15]-[17], variational inference [18], [19], expectation propagation [20], [21] and the Laplace approximation [22]. The most accurate, MCMC, is quite slow when handling large-scale datasets; MAP is a point estimate that does not integrate the latent variables out, leading to over-fitting and oscillation; variational inference and its variants, which run fast by maximizing a tractable lower bound of the evidence, provide a trade-off.…”
Section: Introduction
confidence: 99%
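As a concrete illustration of the Laplace approximation those methods compete with, the sketch below finds the posterior mode of a GP by Newton iteration. A Bernoulli-logit likelihood is used here purely for simplicity; the paper's heteroscedastic Student-t likelihood would swap in different gradient and Hessian terms. The kernel parameters and toy data are made up for the demo:

```python
import numpy as np

def rbf_kernel(x, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix with a small jitter for stability."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2) + 1e-8 * np.eye(len(x))

def laplace_mode(K, y, n_iter=100, tol=1e-10):
    """Newton iteration for the posterior mode of a GP with a
    Bernoulli-logit likelihood, labels y in {-1, +1}.
    Maximizes log p(y|f) - 0.5 f^T K^{-1} f over f."""
    n = len(y)
    f = np.zeros(n)
    t = (y + 1) / 2.0                      # labels mapped to {0, 1}
    for _ in range(n_iter):
        pi = 1.0 / (1.0 + np.exp(-f))      # sigmoid(f)
        grad = t - pi                      # d log p(y|f) / df
        W = pi * (1.0 - pi)                # -d^2 log p(y|f) / df^2 (diagonal)
        # Newton step: f_new = (K^{-1} + W)^{-1} (W f + grad)
        #                    = (I + K diag(W))^{-1} K (W f + grad)
        f_new = np.linalg.solve(np.eye(n) + K * W[None, :], K @ (W * f + grad))
        converged = np.max(np.abs(f_new - f)) < tol
        f = f_new
        if converged:
            break
    return f

x = np.linspace(-2.0, 2.0, 10)
y = np.where(x > 0, 1.0, -1.0)             # toy binary labels
K = rbf_kernel(x)
f_hat = laplace_mode(K, y)
```

At convergence the stationarity condition holds: the likelihood gradient equals K^{-1} f_hat, which is the defining property of the mode the Laplace approximation is centered on.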
“…If the likelihood is not Gaussian, the marginal likelihood needs to be approximated. Many approximate methods can be used, such as the Laplace approximation [14], variational methods [15] and sampling methods [16].…”
Section: A Gaussian Process
confidence: 99%
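For reference, the standard form of this approximation (as in Rasmussen and Williams) expands the log marginal likelihood around the posterior mode f̂, with W = −∇∇ log p(y | f̂):

log q(y | X, θ) = log p(y | f̂) − ½ f̂ᵀ K⁻¹ f̂ − ½ log |I + W^{1/2} K W^{1/2}|

The first two terms evaluate the unnormalized log posterior at its mode; the log-determinant term accounts for the local curvature of the Gaussian fitted there.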