2023
DOI: 10.1109/tpami.2022.3205036
Regularized Multi-Output Gaussian Convolution Process With Domain Adaptation

Abstract: Multi-output Gaussian process (MGP) has been attracting increasing attention as a transfer learning method for modeling multiple outputs. Despite its high flexibility and generality, MGP still faces two critical challenges when applied to transfer learning. The first is negative transfer, which occurs when there is no shared information among the outputs. The second is input domain inconsistency, which is commonly studied in transfer learning yet unexplored in MGP. In this paper, we propose…
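For readers unfamiliar with the construction named in the title, the following is a minimal sketch (in plain NumPy) of a Gaussian convolution process with two outputs: each output is obtained by smoothing one shared white-noise latent process with an output-specific Gaussian kernel, which gives the closed-form cross-covariance implemented below. The amplitude and length-scale values are illustrative assumptions, and the regularization and domain-adaptation components of the proposed method are not shown.

import numpy as np

def conv_cov(x1, x2, alpha1, ell1, alpha2, ell2):
    # Cross-covariance cov(f_i(x1), f_j(x2)) of a convolution process: each
    # output f_i smooths one shared white-noise latent process with a Gaussian
    # kernel of amplitude alpha_i and length-scale ell_i (1-D inputs).
    s2 = ell1 ** 2 + ell2 ** 2
    scale = alpha1 * alpha2 * np.sqrt(2.0 * np.pi * ell1 ** 2 * ell2 ** 2 / s2)
    d = x1[:, None] - x2[None, :]
    return scale * np.exp(-0.5 * d ** 2 / s2)

# Joint covariance of two outputs observed on the same grid; the off-diagonal
# block is what lets information flow between the outputs.
x = np.linspace(0.0, 1.0, 50)
K11 = conv_cov(x, x, 1.0, 0.2, 1.0, 0.2)
K22 = conv_cov(x, x, 0.8, 0.4, 0.8, 0.4)
K12 = conv_cov(x, x, 1.0, 0.2, 0.8, 0.4)
K = np.block([[K11, K12], [K12.T, K22]])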

Cited by 3 publications (3 citation statements) | References: 21 publications
“…It pools all training and testing CM signals into a large multivariate GP, through which information transfer occurs from the historical to the testing units. Based on this idea, multiple studies further improve the model by incorporating auxiliary data [3], adaptively selecting more informative units [23], and distributing model inference efforts to individual units while preserving their privacy [24], among others. They are all capable of personalized predictions using non-parametric modeling, a key benefit inherited from GPs.…”
Section: Related Work
confidence: 99%
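As a hedged illustration of this pooling idea (a simplified stand-in, not the cited papers' exact models), the sketch below places one historical unit and a partially observed test unit in a single joint GP with an intrinsic-coregionalization-style covariance B[i, j] * k(t, t'), then predicts the test unit's later values by conditioning on all pooled observations. The kernel, the cross-unit matrix B, the noise level, and the synthetic signals are assumptions of the sketch.

import numpy as np

def k_time(a, b, ell=0.3):
    # Shared squared-exponential kernel over the time/input axis.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * d ** 2 / ell ** 2)

# Cross-unit similarity: how strongly the historical unit (index 0) and the
# test unit (index 1) are assumed to share information.
B = np.array([[1.0, 0.9],
              [0.9, 1.0]])

def joint_cov(t1, u1, t2, u2):
    # Covariance between unit u1 observed at times t1 and unit u2 at times t2.
    return B[u1, u2] * k_time(t1, t2)

rng = np.random.default_rng(0)
t_hist = np.linspace(0.0, 1.0, 40)      # fully observed historical unit
t_test = np.linspace(0.0, 0.4, 15)      # test unit observed only early on
t_new = np.linspace(0.4, 1.0, 20)       # future of the test unit to predict

y_hist = np.sin(2.0 * np.pi * t_hist)
y_test = np.sin(2.0 * np.pi * t_test) + 0.05 * rng.standard_normal(t_test.size)

# Pool all observed signals into one large multivariate GP.
K = np.block([[joint_cov(t_hist, 0, t_hist, 0), joint_cov(t_hist, 0, t_test, 1)],
              [joint_cov(t_test, 1, t_hist, 0), joint_cov(t_test, 1, t_test, 1)]])
K += 1e-4 * np.eye(K.shape[0])
k_star = np.hstack([joint_cov(t_new, 1, t_hist, 0), joint_cov(t_new, 1, t_test, 1)])
y = np.concatenate([y_hist, y_test])

# Conditioning transfers information from the historical unit to the test unit.
pred_mean = k_star @ np.linalg.solve(K, y)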
“…The objective function (Equation (2)) is implemented to identify the different parameters. For a new input point $x^*$, based on multivariate normal theory, the posterior distribution of the target output $f_t(x^*)$ can be expressed as follows [34]:…”
Section: Preliminaries of MGCP
confidence: 99%
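The excerpt is cut off before the equation itself. In generic GP notation (which may differ from the regularized MGCP form in [34]), the multivariate-normal conditional it refers to is

\[
f_t(x^*) \mid \mathbf{y} \;\sim\; \mathcal{N}\!\Big( \mathbf{k}_*^{\top} (\mathbf{K} + \sigma^2 \mathbf{I})^{-1} \mathbf{y},\;
k(x^*, x^*) - \mathbf{k}_*^{\top} (\mathbf{K} + \sigma^2 \mathbf{I})^{-1} \mathbf{k}_* \Big),
\]

where \(\mathbf{y}\) stacks all observed outputs, \(\mathbf{K}\) is their joint covariance matrix, \(\mathbf{k}_*\) collects the covariances between \(f_t(x^*)\) and the observations, and \(\sigma^2\) is the noise variance.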
“…This method improves the accuracy of the model, but the computational burden also increases. Wang et al. [34] established an MGCP to deal with inconsistent input domains. The method marginalizes the inconsistent features to realize domain adaptation.…”
Section: Preliminaries of MGCP
confidence: 99%
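A minimal sketch of this marginalization idea, under assumptions that are not taken from the paper: a squared-exponential kernel over the inputs and a Gaussian prior N(m, v) on the feature that the target domain lacks, which yields the closed-form cross-covariance implemented in the hypothetical helper cross_cov_marginalized below. The paper's convolution construction and regularization are omitted.

import numpy as np

def cross_cov_marginalized(s_t, s_s, z_s, ell_s=0.3, ell_z=0.5, m=0.0, v=1.0):
    # Cross-covariance between target points that carry only the shared
    # feature s and source points that carry (s, z).  The feature z, missing
    # in the target domain, is integrated out under a Gaussian prior N(m, v),
    # which for a squared-exponential kernel has the closed form used here.
    k_shared = np.exp(-0.5 * (s_t[:, None] - s_s[None, :]) ** 2 / ell_s ** 2)
    k_marg = np.sqrt(ell_z ** 2 / (ell_z ** 2 + v)) * \
        np.exp(-0.5 * (z_s[None, :] - m) ** 2 / (ell_z ** 2 + v))
    return k_shared * k_marg

# Target points have only the shared feature; source points have both features.
s_target = np.linspace(0.0, 1.0, 10)
s_source = np.linspace(0.0, 1.0, 25)
z_source = np.random.default_rng(0).standard_normal(25)
K_ts = cross_cov_marginalized(s_target, s_source, z_source)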