2013
DOI: 10.1109/tnnls.2013.2272403

Semisupervised Multitask Learning With Gaussian Processes

Abstract: We present a probabilistic framework for transferring learning across tasks and between labeled and unlabeled data. The approach is based on Gaussian process (GP) prediction and incorporates both the geometry of the data and the similarity between tasks within a GP covariance, allowing Bayesian prediction in a natural way. We discuss the transfer of learning in a multitask scenario in the two cases where the underlying geometry is assumed to be the same across tasks and where different tasks are assumed to hav…
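
The abstract describes the construction only at a high level. As a rough illustration of the idea (a sketch, not the authors' implementation), the code below combines an assumed free-form task-similarity matrix with an input kernel deformed by a graph Laplacian built from labeled and unlabeled inputs, joined via a Kronecker product. Every name, the kNN graph, and the specific deformation K_tilde = K - K(I + aLK)^(-1) aLK are assumptions for illustration.

# Minimal sketch (not the paper's code): a multitask GP covariance that
# combines a task-similarity matrix with a geometry-aware input kernel.
# The deformation K_tilde = K - K (I + a L K)^{-1} a L K is the standard
# graph-Laplacian kernel deformation from semisupervised learning; its
# use here is an assumption for illustration.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Squared-exponential kernel between two sets of inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def graph_laplacian(X, k=5):
    """Unnormalised Laplacian of a symmetrised kNN graph over all inputs."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros_like(d2)
    for i in range(len(X)):
        nn = np.argsort(d2[i])[1:k + 1]   # nearest neighbours, skipping self
        W[i, nn] = np.exp(-d2[i, nn])     # heat-kernel edge weights
    W = np.maximum(W, W.T)                # symmetrise the graph
    return np.diag(W.sum(1)) - W

def semisup_multitask_cov(X_all, K_task, lengthscale=1.0, a=1.0):
    """Covariance over (task, input) pairs: K_task Kronecker K_tilde, where
    K_tilde warps the RBF kernel with the graph Laplacian built from
    labeled *and* unlabeled inputs pooled together."""
    K = rbf_kernel(X_all, X_all, lengthscale)
    L = graph_laplacian(X_all)
    n = len(X_all)
    K_tilde = K - K @ np.linalg.solve(np.eye(n) + a * L @ K, a * L @ K)
    return np.kron(K_task, K_tilde)

# Toy usage: 2 tasks, 30 inputs (labeled + unlabeled pooled together).
rng = np.random.default_rng(0)
X_all = rng.normal(size=(30, 2))
K_task = np.array([[1.0, 0.7], [0.7, 1.0]])   # assumed task similarity
C = semisup_multitask_cov(X_all, K_task)
print(C.shape)                                 # (60, 60)

Because the task kernel and the deformed input kernel factorize through the Kronecker product, standard GP prediction with this covariance transfers information both across tasks and from unlabeled inputs, as the abstract describes.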

Cited by 20 publications (31 citation statements)
References 25 publications
“…Supervised learning addresses the problems where, for each input x, there is a corresponding observed output y, whereas, in the case of the latter two categories, this correspondence cannot be found due to the lack of data. Unsupervised learning explores interesting patterns in the input data without a need for explicit supervision [122]. Reinforcement learning assumes that the dynamics of the system under consideration follows a particular class of model [123].…”
Section: Machine Learning Techniques (mentioning)
confidence: 99%
“…3 we include the real part of the prediction in (14) and the grey shaded area that represents the point-wise mean plus and minus two times the standard deviation. The mean of the prediction for the proper case in (20) is plotted as a dashed line. [Figure legend residue: CGPR mean (14), proper CGPR mean (20), noisy training samples, training samples.] Fig. 4: Imaginary parts of the sample function of the process f(x), the predictive CGPR mean (14), and the predictive mean for the proper CGPR case (20), versus the real part of the input x_r, for x_j = −0.1515.…”
Section: Hyperparameters Estimation (mentioning)
confidence: 99%
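
The excerpt's equations (14) and (20) are not reproduced in the snippet. For orientation only: both appear to be predictive means of a GP regressor, and the standard real-valued GP predictive mean they presumably generalize has the textbook form

\[
\bar{f}(x_*) \;=\; k(x_*, X)\,\bigl(K(X, X) + \sigma^2 I\bigr)^{-1} \mathbf{y},
\]

where \(K(X,X)\) is the kernel matrix over the training inputs, \(k(x_*,X)\) the vector of kernel evaluations between the test input and the training inputs, and \(\sigma^2\) the noise variance. In the proper complex case the pseudo-covariance vanishes, which is consistent with (20) being a strictly linear rather than widely linear estimator; this reading of (14) and (20) is an assumption based on standard CGPR formulations, not on the citing paper's text.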
“…The Gaussian process is drawing increasing attention among machine learning approaches because of its suitability for a wide range of applications, including regression, classification [26], adaptive control, multiclass learning [27], data association, and semidefinite programming solutions [28]. Grande et al [29] explain the Gaussian process regression (GPR) framework with a complete mathematical treatment of the usual complex-valued solution.…”
mentioning
confidence: 99%
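
For readers unfamiliar with the GPR framework the excerpt refers to, here is a minimal, self-contained sketch of textbook GP regression prediction (generic equations, not Grande et al.'s treatment and not the complex-valued case):

# Minimal sketch of standard GP regression prediction (textbook
# equations, not any cited author's implementation).
import numpy as np

def gpr_predict(X, y, X_star, lengthscale=1.0, noise=0.1):
    """Posterior mean and variance of a zero-mean GP with an RBF kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale**2)
    K = k(X, X) + noise**2 * np.eye(len(X))   # noisy training covariance
    Ks = k(X_star, X)                         # test/train cross-covariance
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha                         # predictive mean
    # Predictive variance: diag(Kss) - diag(Ks K^{-1} Ks^T); RBF gives
    # k(x, x) = 1 on the diagonal.
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, var

# Toy usage: noisy samples of sin(x) on [-3, 3].
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=20)
X_star = np.linspace(-3, 3, 5)[:, None]
mean, var = gpr_predict(X, y, X_star)
print(mean.round(2), var.round(3))

The same predictive equations underlie the semisupervised multitask construction above; what changes there is only the covariance matrix that is plugged in.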