2012
DOI: 10.1561/9781601985590

Kernels for Vector-Valued Functions: A Review

Cited by 179 publications (275 citation statements). References 0 publications.
“…Álvarez et al. reviewed kernel methods for vector-valued functions at length, focusing especially on the regularization and Bayesian perspectives and connecting the two points of view. They provided a large collection of kernel choices, focusing on separable kernels, sums of separable kernels, and further extensions such as kernels for learning divergence-free and curl-free vector fields.…”
Section: Multi-output Regression
Confidence: 99%
“…In contrast to, e.g., [1,23], we do not model and treat each degree of freedom separately, but instead propose to address GP regression with vector outputs of dimension p directly. In odometry model correction, this makes particular sense for the pose residual, where the orientation error strongly influences the translation residual. We thus resort to a multi-output model in which correlations between output components are explicitly represented through a shared input space [31]. The hyperparameter σ² (which leads to a diagonal output noise covariance) is then replaced with a positive semidefinite (covariance) matrix Σ.…”
Section: Multi-output Gaussian Processes
Confidence: 99%
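A sketch of the replacement this excerpt describes: swapping the scalar noise σ² for a full p x p noise covariance Σ in multi-output GP regression. The separable kernel, the stacking convention (outputs stacked per data point), and all function names are assumptions for illustration, reusing the rbf helper from the sketch above:

    import numpy as np

    def mogp_posterior_mean(X, Y, Xs, B, Sigma, lengthscale=1.0):
        # Posterior mean of a multi-output GP with separable kernel
        # K(x, x') = k(x, x') * B and correlated output noise Sigma (p x p),
        # i.e. the noise term is I_n kron Sigma instead of sigma^2 * I.
        n, p = Y.shape
        Kxx = np.kron(rbf(X, X, lengthscale), B)    # (n*p, n*p)
        Ksx = np.kron(rbf(Xs, X, lengthscale), B)   # (m*p, n*p)
        noise = np.kron(np.eye(n), Sigma)           # block-diagonal noise
        alpha = np.linalg.solve(Kxx + noise, Y.reshape(-1))  # y stacked per point
        return (Ksx @ alpha).reshape(len(Xs), p)

With Sigma = sigma**2 * np.eye(p) this reduces to the usual scalar-noise model; an off-diagonal Sigma lets, e.g., orientation and translation residuals share correlated noise.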
“…However, only scalar output is modeled in these studies. Direct use of GPR with multivariate output is possible by considering a covariance function that depends on both the parameters and all components of the output [Alvarez; Conti and O'Hagan], although the resulting GPR model will be computationally expensive if we are interested in predicting a fine-resolution solution. An alternative approach is to first reduce the degrees of freedom of the multivariate output through principal component analysis [Higdon et al.; Lawrence; Liu et al.] or wavelets [Bayarri et al.; Drignei et al.; Marrel et al.].…”
Section: Introduction
Confidence: 99%
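A minimal sketch of the PCA-then-GPR alternative this excerpt outlines: project the high-dimensional output onto a few principal components, regress each component with an independent scalar GP, and lift predictions back to the original space. All names and the fixed hyperparameters are illustrative assumptions, not the method of any specific cited paper; the rbf helper is the one defined earlier:

    import numpy as np

    def pca_gp_predict(X, Y, Xs, n_components=3, lengthscale=1.0, noise=1e-4):
        # Reduce the output dimension with PCA, then compute one scalar GP
        # posterior mean per principal component.
        mu = Y.mean(axis=0)
        _, _, Vt = np.linalg.svd(Y - mu, full_matrices=False)
        V = Vt[:n_components]            # top principal directions, (k, q)
        Z = (Y - mu) @ V.T               # reduced coordinates, (n, k)
        Kxx = rbf(X, X, lengthscale) + noise * np.eye(len(X))
        Zs = rbf(Xs, X, lengthscale) @ np.linalg.solve(Kxx, Z)
        return Zs @ V + mu               # lift back to the original output space

This trades the cubic cost in (number of points x output dimension) of a full multivariate covariance for a handful of independent scalar GPs, at the price of ignoring residual correlations outside the retained components.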