Learning in Graphical Models 1998
DOI: 10.1007/978-94-011-5014-9_23
Prediction with Gaussian Processes: From Linear Regression to Linear Prediction and Beyond

Cited by 484 publications (404 citation statements)
References 26 publications
“…See e.g. [1] for the weight-space view of Gaussian processes which equivalently leads to Eq. (10) after marginalization over the weights.…”
Section: Training a Gaussian Process
confidence: 99%
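The weight-space view mentioned in this citation can be illustrated with a short sketch: Bayesian linear regression with a Gaussian prior on the weights gives exactly the same predictions as a GP whose kernel is the inner product of the feature vectors, once the weights are marginalized out. The feature map, prior precision, and noise level below are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative feature map phi(x) = (1, x, x^2); any finite feature map works.
def phi(x):
    return np.stack([np.ones_like(x), x, x**2], axis=-1)  # shape (n, 3)

alpha, noise = 1.0, 0.1          # weight-prior precision, observation noise variance
X = rng.uniform(-1, 1, 5)
y = np.sin(3 * X) + np.sqrt(noise) * rng.standard_normal(5)
Xs = np.array([0.3])             # a test input

# Weight-space view: posterior over w ~ N(0, alpha^{-1} I), then predict.
Phi, Phis = phi(X), phi(Xs)
A = alpha * np.eye(3) + Phi.T @ Phi / noise
mean_w = np.linalg.solve(A, Phi.T @ y / noise)
pred_w = Phis @ mean_w

# Function-space view: marginalizing w gives a GP with
# kernel k(x, x') = phi(x)^T phi(x') / alpha.
K = Phi @ Phi.T / alpha
Ks = Phis @ Phi.T / alpha
pred_f = Ks @ np.linalg.solve(K + noise * np.eye(5), y)

print(np.allclose(pred_w, pred_f))  # the two views agree
```

The agreement follows from the matrix push-through identity, which converts the d-by-d weight-space solve into the n-by-n function-space solve.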
“…Secondly, we will discuss practical matters regarding the role of hyperparameters in the covariance function, the marginal likelihood and the automatic Occam's razor. For broader introductions to Gaussian processes, consult [1], [2].…”
confidence: 99%
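The role of the marginal likelihood and the "automatic Occam's razor" described in this citation can be sketched numerically: the GP log marginal likelihood balances a data-fit term against a complexity penalty, so scoring a grid of kernel hyperparameters favors a lengthscale that neither overfits nor oversmooths. The toy data, RBF kernel, and grid values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: a noisy sine wave.
X = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(20)

def log_marginal_likelihood(lengthscale, noise=0.01):
    # RBF kernel k(x, x') = exp(-(x - x')^2 / (2 l^2)).
    K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * lengthscale**2))
    Ky = K + noise * np.eye(len(X))
    L = np.linalg.cholesky(Ky)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # data-fit term      complexity penalty        normalizing constant
    return -0.5 * y @ a - np.log(np.diag(L)).sum() - 0.5 * len(X) * np.log(2 * np.pi)

grid = [0.01, 0.05, 0.1, 0.2, 0.5, 1.0]
best = max(grid, key=log_marginal_likelihood)
print(best)  # an intermediate lengthscale wins over the extremes
```

A very short lengthscale explains the data as near-independent noise (poor fit term relative to its complexity), while a very long one cannot track the oscillation at all; the marginal likelihood penalizes both automatically.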
“…We use this method to deal with noise in our experiments. The derivation of (18) can be obtained by investigating the close relationship between Gaussian Processes (GP) and SVMs (Opper & Winther, 1999; Wahba, 1999; Williams, 1998). We give a brief description of it in the following for completeness.…”
Section: Input Noise
confidence: 99%
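One concrete facet of the GP–SVM relationship invoked in this citation is that both are kernel machines: each predicts with a function of the form f(x) = Σᵢ aᵢ k(xᵢ, x), differing only in how the coefficients aᵢ are obtained (a closed-form linear solve for GP regression, a constrained optimization for the SVM). A minimal sketch of the GP side, with illustrative data and kernel choices:

```python
import numpy as np

# Toy 1-D data with sign labels; kernel bandwidth is an illustrative assumption.
X = np.linspace(-1, 1, 10)[:, None]
y = np.sign(X[:, 0])
noise = 0.1

def k(A, B):
    # RBF kernel on 1-D inputs.
    return np.exp(-((A[:, None, 0] - B[None, :, 0]) ** 2) / 0.5)

# GP regression: coefficients a_i come from one linear solve.
a_gp = np.linalg.solve(k(X, X) + noise * np.eye(len(X)), y)

def f(x):
    # Same functional form an SVM uses: a kernel expansion over training points.
    return k(np.atleast_2d(x), X)[0] @ a_gp

print(np.sign(f([0.5])), np.sign(f([-0.5])))  # recovers the label signs
```

An SVM would instead make most aᵢ zero (support vectors) via the hinge-loss optimization, but the prediction rule has the identical kernel-expansion shape.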
“…For the moment, we do not specify the form of the covariance function and simply assume it is a valid one, generating a positive definite covariance matrix. We refer to [1,2,3,4] for a review of GPs.…”
Section: Introduction
confidence: 99%