2020
DOI: 10.1109/tnnls.2019.2957109
When Gaussian Process Meets Big Data: A Review of Scalable GPs

Abstract: The vast quantity of information brought by big data, as well as evolving computer hardware, encourages success stories in the machine learning community. Meanwhile, it poses challenges for Gaussian process (GP) regression, a well-known non-parametric and interpretable Bayesian model, which suffers from cubic complexity in the data size. To improve scalability while retaining desirable prediction quality, a variety of scalable GPs have been presented. But they have not yet been comprehensively reviewed…
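To make the cubic-complexity bottleneck concrete, below is a minimal NumPy sketch of exact GP regression with an assumed RBF kernel, fixed hyperparameters, and toy data (all illustrative, not taken from the paper). The Cholesky factorization of the N × N kernel matrix is the O(N^3) step that the scalable GPs surveyed in the paper aim to avoid.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, x') = s^2 exp(-||x - x'||^2 / (2 l^2))."""
    sqdist = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_predict(X, y, X_star, noise=1e-2):
    """Exact GP posterior mean and variance at test inputs X_star.

    Factorizing the (N x N) matrix K + noise*I costs O(N^3) time and O(N^2)
    memory, which is the scalability bottleneck the review addresses.
    """
    N = X.shape[0]
    K = rbf_kernel(X, X) + noise * np.eye(N)          # (N x N) Gram matrix
    L = np.linalg.cholesky(K)                         # O(N^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    K_s = rbf_kernel(X, X_star)
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = rbf_kernel(X_star, X_star) - v.T @ v
    return mean, np.diag(var)

# Toy usage: 200 noisy observations of sin(x).
X = np.linspace(0, 10, 200)[:, None]
y = np.sin(X).ravel() + 0.1 * np.random.randn(200)
mu, s2 = gp_predict(X, y, np.linspace(0, 10, 50)[:, None])
```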

Cited by 528 publications (361 citation statements)
References 132 publications
“…At present, efficient inference approximation methods like FITC and PITC [16,17,28,46,47,48,49,50] are not very effective for MTGPs, because in this setting, although inducing points can still be treated as free parameters, the task label should be pre-fixed. Interesting future research involves the development of sparse and efficient inference methods for MTGPs [19,51,52,53,54]. The initialization strategy [37,55,56,57] of the coupling hyperparameters C_i is also very important for multitask pattern discovery, and needs further study.…”
Section: Results (mentioning)
confidence: 99%
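The excerpt above points to inducing-point approximations such as FITC. As a rough single-output illustration (not the multi-task extension the authors call for), the sketch below computes the FITC predictive mean in NumPy under an assumed RBF kernel; the function names, hyperparameters, and toy data are hypothetical. Replacing the exact Gram matrix with a low-rank-plus-diagonal surrogate drops the dominant cost from O(N^3) to O(N M^2) for M inducing points.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def fitc_predict_mean(X, y, X_star, Z, noise=1e-2, variance=1.0):
    """FITC predictive mean with M inducing inputs Z (M << N).

    The exact (N x N) Gram matrix is replaced by the surrogate
    Q + diag(K - Q), with Q = K_nm K_mm^{-1} K_mn, so the dominant
    cost is O(N M^2) rather than O(N^3).
    """
    M = Z.shape[0]
    K_mm = rbf(Z, Z, variance=variance) + 1e-6 * np.eye(M)   # jitter for stability
    K_nm = rbf(X, Z, variance=variance)
    # diag(Q) without ever forming the full N x N matrix.
    L = np.linalg.cholesky(K_mm)
    A = np.linalg.solve(L, K_nm.T)                 # (M x N), costs O(N M^2)
    q_diag = np.sum(A**2, axis=0)
    lam = variance - q_diag + noise                # per-point FITC "noise"; k(x, x) = variance for RBF
    Q_m = K_mm + (K_nm / lam[:, None]).T @ K_nm    # (M x M)
    w = np.linalg.solve(Q_m, K_nm.T @ (y / lam))
    return rbf(X_star, Z, variance=variance) @ w

# Toy usage: 2,000 points, 30 inducing inputs on a grid.
X = np.random.uniform(0, 10, (2000, 1))
y = np.sin(X).ravel() + 0.1 * np.random.randn(2000)
Z = np.linspace(0, 10, 30)[:, None]
mu = fitc_predict_mean(X, y, np.linspace(0, 10, 50)[:, None], Z)
```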
“…Finally, in order to make our method applicable to a wide range of real projects, its scalability should be improved. A number of approaches to do this have recently been reviewed [8,41].…”
Section: Discussion (mentioning)
confidence: 99%
“…Let N be the number of samples in the training data. We require O(N^2) storage for the (N × N) Gram matrix and O(N^3) computational complexity for the inversion and determinant in (16) and (11). This restricts N to at most 10,000.…”
Section: Stochastic Variational Gaussian Process (SVGP) (mentioning)
confidence: 99%
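For the SVGP setting this excerpt refers to, minibatch training along the lines sketched below never forms the (N × N) Gram matrix, so memory stays around O(M^2) plus one minibatch and the per-step cost is independent of N. The sketch assumes the GPflow 2.x API (SVGP model, training_loss_closure); exact signatures may differ between versions, and the data and settings are toy values.

```python
import numpy as np
import tensorflow as tf
import gpflow

N, M, batch_size = 100_000, 100, 256
X = np.random.uniform(0, 10, (N, 1))
Y = np.sin(X) + 0.1 * np.random.randn(N, 1)
Z = np.linspace(0, 10, M)[:, None]            # inducing inputs (M << N)

model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Gaussian(),
    inducing_variable=Z,
    num_data=N,                               # rescales the minibatch ELBO to the full data set
)

dataset = tf.data.Dataset.from_tensor_slices((X, Y)).repeat().shuffle(N).batch(batch_size)
loss_fn = model.training_loss_closure(iter(dataset))
optimizer = tf.optimizers.Adam(0.01)

for _ in range(2000):                         # stochastic optimization of the ELBO
    with tf.GradientTape() as tape:
        loss = loss_fn()
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

mean, var = model.predict_f(np.linspace(0, 10, 50)[:, None])
```

Because the evidence lower bound decomposes over data points, each optimization step touches only a minibatch, which is what lifts the roughly 10,000-point limit noted in the excerpt.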