2011
DOI: 10.1016/j.neucom.2010.09.024

An approximate inference with Gaussian process to latent functions from uncertain data

Cited by 20 publications (25 citation statements), with citing publications spanning 2012 to 2024. References 4 publications.
“…Among such tools, the Gaussian process (GP) is a powerful and commonly used regression framework, since it is generally considered to be the most flexible and provides prediction uncertainty information [12]. Two important limitations of GP are its computational complexity [13]-[16] and its sensitivity to uncertain inputs [14], [17]-[21]. To alleviate the computational complexity, various sparse GP techniques have been proposed in [13]-[15], while online and distributed GP were treated in [16], [22], [23] and [24]-[26], respectively.…”
Section: Introduction (mentioning)
confidence: 99%
“…No framework has yet been developed to mathematically characterize and understand the spatial predictability of wireless channels with location uncertainty. In this paper, we build on and adapt the framework from [17], [18] to CQM prediction in wireless networks. Our main contributions are as follows:…”
Section: Introduction (mentioning)
confidence: 99%
“…Therefore, the resulting stochastic process that represents f under random inputs is no longer Gaussian and lacks an analytic formulation [4,5]. However, as demonstrated in [3], based on the work of [5], we can still recover a Gaussian process approximation for f by using the mean and the covariance of the resulting stochastic process under the influence of input noise. In particular, in the case of a constant deterministic mean function µ_0(x) = m_0, the mean and the covariance of this noisy process are:…”
Section: Gaussian Process Priors With Uncertain Inputs (mentioning)
confidence: 99%
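The quoted statement is truncated before the expressions themselves. As a hedged sketch (the squared-exponential kernel, the signal variance σ_f², the length-scale matrix Λ, and the input-noise model x_i ∼ N(u_i, Σ_i) are assumptions for illustration, not necessarily the citing paper's exact setup), the usual closed-form moments of the noisy process with constant mean m_0 are:

```latex
% Sketch, assuming k(x_i, x_j) = \sigma_f^2 \exp(-\tfrac12 (x_i - x_j)^\top \Lambda^{-1} (x_i - x_j))
% and independent noisy inputs x_i \sim \mathcal{N}(u_i, \Sigma_i); with a constant mean m_0,
% the mean is unchanged and the covariance is the expectation of the kernel over the inputs.
\begin{align}
  \mathbb{E}[f(x_i)] &= m_0, \\
  \operatorname{Cov}[f(x_i), f(x_j)]
    &= \mathbb{E}_{x_i, x_j}\!\left[k(x_i, x_j)\right]
     = \sigma_f^2 \left|I + \Lambda^{-1}(\Sigma_i + \Sigma_j)\right|^{-1/2}
       \exp\!\left(-\tfrac12 (u_i - u_j)^\top (\Lambda + \Sigma_i + \Sigma_j)^{-1} (u_i - u_j)\right),
       \quad i \neq j.
\end{align}
% On the diagonal, E[k(x_i, x_i)] = \sigma_f^2, since the kernel is constant at coincident inputs.
```

The off-diagonal expression is the exact Gaussian expectation of the squared-exponential kernel; other kernels generally do not admit such a closed form.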
“…The posterior of the stochastic process representing f ∼ GP(µ_0, k) under input noise is not Gaussian due to the complicated forms of Equation 4 and Equation 5 with respect to x and X. Yet we can still obtain a suitable approximation [3,5] for the original f in the noisy input setting by doing inference over a GP with mean m_0 and covariance function k_p, as defined in Equation 11. This approximation is obtained as:…”
Section: Gaussian Process Priors With Uncertain Inputs (mentioning)
confidence: 99%
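Because Equation 11 itself is not reproduced in the excerpt, the following numpy sketch shows one common way such an approximation is realized in practice: build the expected squared-exponential kernel under Gaussian input noise and run the standard GP posterior equations with it standing in for k_p. The function names, the kernel choice, and all parameters are illustrative assumptions, not the citing paper's definitions.

```python
import numpy as np

def expected_se_kernel(U, S, lengthscales, sigma_f):
    """Expected squared-exponential kernel under independent Gaussian input noise.

    U: (m, n) array of input means u_i; S: (m, n, n) array of input covariances Sigma_i.
    Returns the m x m matrix E[k(x_i, x_j)] that stands in for the covariance k_p.
    """
    m, n = U.shape
    Lam = np.diag(np.asarray(lengthscales, dtype=float) ** 2)  # Lambda = diag(l_1^2, ..., l_n^2)
    K = np.full((m, m), float(sigma_f) ** 2)                   # E[k(x_i, x_i)] = sigma_f^2
    for i in range(m):
        for j in range(i + 1, m):
            Ssum = S[i] + S[j]
            d = U[i] - U[j]
            # |I + Lam^{-1} Ssum|^{-1/2} * exp(-0.5 d^T (Lam + Ssum)^{-1} d)
            det_term = np.linalg.det(np.eye(n) + np.linalg.solve(Lam, Ssum)) ** -0.5
            quad = d @ np.linalg.solve(Lam + Ssum, d)
            K[i, j] = K[j, i] = float(sigma_f) ** 2 * det_term * np.exp(-0.5 * quad)
    return K

def approx_gp_predict(U, S, y, U_star, S_star, lengthscales, sigma_f, sigma_n, m0=0.0):
    """Standard GP posterior equations evaluated with the expected kernel."""
    m = U.shape[0]
    U_all = np.vstack([U, U_star])
    S_all = np.concatenate([S, S_star], axis=0)
    K_all = expected_se_kernel(U_all, S_all, lengthscales, sigma_f)
    K, Ks, Kss = K_all[:m, :m], K_all[:m, m:], K_all[m:, m:]
    A = K + float(sigma_n) ** 2 * np.eye(m)     # add observation noise on the training block
    alpha = np.linalg.solve(A, y - m0)
    mean = m0 + Ks.T @ alpha                    # posterior mean at the noisy test inputs
    cov = Kss - Ks.T @ np.linalg.solve(A, Ks)   # posterior covariance
    return mean, cov
```

The off-diagonal entries use the exact Gaussian expectation of the squared-exponential kernel; for other kernels this expectation generally has no closed form and must be approximated numerically.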
“…A regression problem could be solved by the Gaussian process as follows [14], [19]: Suppose the training set with m data instances is Dataset = <X_training, y_training>, where the input matrix X_training = [x_1, x_2, …, x_m] consists of n-feature input instances x_i (i = 1, 2, …, m), and y_training = [y_1, y_2, …, y_m] is the output vector, which is generated by…”
Section: A Gaussian Process Regression (mentioning)
confidence: 99%
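To make the quoted setup concrete, here is a minimal numpy sketch of standard GP regression on such a Dataset = <X_training, y_training>; the squared-exponential kernel, its hyperparameters, and the noise level sigma_n are assumptions for illustration, since the excerpt breaks off before the model details.

```python
import numpy as np

def se_kernel(A, B, lengthscale=1.0, sigma_f=1.0):
    # Squared-exponential kernel between the rows of A (p, n) and B (q, n).
    sq = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2 * A @ B.T
    return sigma_f ** 2 * np.exp(-0.5 * sq / lengthscale ** 2)

def gp_regression(X_training, y_training, X_test, sigma_n=0.1):
    # Standard GP posterior: mean and variance of f(X_test) given the training data.
    K = se_kernel(X_training, X_training) + sigma_n ** 2 * np.eye(len(X_training))
    Ks = se_kernel(X_training, X_test)
    Kss = se_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_training)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

# Usage sketch with synthetic data (m = 20 instances, n = 1 feature).
rng = np.random.default_rng(0)
X_training = rng.uniform(-3, 3, (20, 1))
y_training = np.sin(X_training[:, 0]) + 0.1 * rng.standard_normal(20)
mean, var = gp_regression(X_training, y_training, np.linspace(-3, 3, 50)[:, None])
```

The returned variance is what the citing paper refers to as the prediction uncertainty information provided by the GP.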