2012
DOI: 10.48550/arxiv.1207.3649
Preprint

Nested Expectation Propagation for Gaussian Process Classification with a Multinomial Probit Likelihood

Abstract: We consider probabilistic multinomial probit classification using Gaussian process (GP) priors. The challenges with the multiclass GP classification are the integration over the non-Gaussian posterior distribution, and the increase of the number of unknown latent variables as the number of target classes grows. Expectation propagation (EP) has proven to be a very accurate method for approximate inference but the existing EP approaches for the multinomial probit GP classification rely on numerical quadratures o…
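For orientation, the multinomial probit likelihood the abstract refers to is commonly written with an auxiliary standard normal variable; the sketch below uses assumed notation (f_i^c for the latent value of class c at input i) rather than anything quoted from the truncated abstract:

$$p(y_i = c \mid \mathbf{f}_i) \;=\; \mathbb{E}_{u \sim \mathcal{N}(0,1)}\!\Bigl[\,\prod_{j \neq c} \Phi\bigl(u + f_i^{c} - f_i^{j}\bigr)\Bigr],$$

where $\Phi$ is the standard normal CDF. Because this likelihood makes the posterior over the latent values non-Gaussian, approximate inference such as EP is required, which is the setting the paper's nested EP approach targets.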

Cited by 5 publications (2 citation statements)
References 16 publications
“…Our goal is to predict the class membership for a new input tensor X* given the observed data D = {X, y}. We place Gaussian process priors on the latent function related to each class, which is the common assumption in multiclass GP classification (see Rasmussen and Williams 2006; Riihimäki, Jylänki, and Vehtari 2012). This specification results in the following zero-mean Gaussian prior for f:…”
Section: Multiclass Gaussian Processes Classification
confidence: 99%
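As a brief aside on the quoted setup, placing an independent zero-mean GP prior on each class's latent function yields a single joint Gaussian prior over all latent values; the notation below is an assumption chosen to match the quote, not taken from the citing paper:

$$\mathbf{f} = \bigl(\mathbf{f}^{1}, \ldots, \mathbf{f}^{C}\bigr) \sim \mathcal{N}(\mathbf{0}, \mathbf{K}), \qquad \mathbf{K} = \operatorname{blockdiag}\bigl(\mathbf{K}^{1}, \ldots, \mathbf{K}^{C}\bigr),$$

where $\mathbf{K}^{c}$ is the $n \times n$ covariance matrix of class $c$'s kernel evaluated at the $n$ training inputs. The stacked vector has $Cn$ entries, which is the growth in the number of latent variables with the number of classes that the abstract mentions.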
“…Gaussian processes can be extended to binary classification problems by employing logistic or probit likelihoods (Nickisch and Rasmussen 2008), while multinomial logistic or multinomial probit likelihoods are employed in multiclass Gaussian process classification (Williams and Barber 1998; Chai 2012; Girolami and Rogers 2006). Since exact inference is analytically intractable for logistic and probit likelihoods, approximate inference is widely applied, such as the Laplace approximation (Williams and Barber 1998), expectation propagation (Kim and Ghahramani 2006; Riihimäki, Jylänki, and Vehtari 2012), and variational approximation (Girolami and Rogers 2006).…”
Section: Introduction
confidence: 99%
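To make the quoted point about intractability concrete, the sketch below implements the standard Laplace approximation for binary GP classification with a probit likelihood (the Newton iteration described in Rasmussen and Williams 2006, Ch. 3). It is a minimal illustration, not the nested EP method of the indexed paper; the kernel choice, data, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch: Laplace approximation for binary GP classification with a
# probit likelihood. Kernel, data, and hyperparameters are assumed for
# illustration; this is not the nested-EP method of Riihimaki et al. (2012).
import numpy as np
from scipy.stats import norm

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def laplace_probit_gp(X, y, n_iter=20):
    """Newton iteration for the posterior mode f_hat of p(f | X, y), y in {-1, +1}."""
    n = len(X)
    K = rbf_kernel(X, X) + 1e-8 * np.eye(n)
    f = np.zeros(n)
    for _ in range(n_iter):
        z = y * f
        s = norm.pdf(z) / np.clip(norm.cdf(z), 1e-12, None)
        grad = y * s                      # d log Phi(y f) / d f
        W = s**2 + z * s                  # -d^2 log Phi(y f) / d f^2
        sqrt_W = np.sqrt(W)
        B = np.eye(n) + sqrt_W[:, None] * K * sqrt_W[None, :]
        L = np.linalg.cholesky(B)
        b = W * f + grad
        a = b - sqrt_W * np.linalg.solve(
            L.T, np.linalg.solve(L, sqrt_W * (K @ b)))
        f = K @ a                         # Newton update for the latent mode
    return f, K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=40)
    y = np.sign(np.sin(X) + 0.3 * rng.standard_normal(40))
    f_hat, K = laplace_probit_gp(X, y)
    print("training accuracy:", np.mean(np.sign(f_hat) == y))
```

EP and variational approaches target the same intractable posterior with a Gaussian approximation, but fit it by moment matching or by optimizing a lower bound rather than by a second-order expansion around the mode.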