Proceedings of the 23rd International Conference on Machine Learning - ICML '06 2006
DOI: 10.1145/1143844.1143934
Constructing informative priors using transfer learning

Abstract: Many applications of supervised learning require good generalization from limited labeled data. In the Bayesian setting, we can try to achieve this goal by using an informative prior over the parameters, one that encodes useful domain knowledge. Focusing on logistic regression, we present an algorithm for automatically constructing a multivariate Gaussian prior with a full covariance matrix for a given supervised learning task. This prior relaxes a commonly used but overly simplistic independence assumption, a… Show more
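The abstract's setup — logistic regression with a multivariate Gaussian prior that has a full covariance matrix, rather than the usual independence (diagonal) assumption — amounts to MAP estimation with a quadratic penalty shaped by the prior precision. A minimal sketch of that MAP objective, assuming NumPy/SciPy; this illustrates only the use of such a prior, not the paper's algorithm for constructing the covariance from related tasks (`fit_map_logreg`, `mu`, `Sigma` are illustrative names):

```python
import numpy as np
from scipy.optimize import minimize

def fit_map_logreg(X, y, mu, Sigma):
    """MAP logistic-regression weights under a N(mu, Sigma) prior.

    X: (n, d) features; y: (n,) labels in {-1, +1};
    mu: (d,) prior mean; Sigma: (d, d) full prior covariance.
    """
    P = np.linalg.inv(Sigma)  # prior precision matrix

    def neg_log_posterior(w):
        margins = y * (X @ w)
        # Logistic loss log(1 + exp(-m)), computed stably.
        nll = np.sum(np.logaddexp(0.0, -margins))
        d = w - mu
        # Full-covariance Gaussian prior couples the weights.
        return nll + 0.5 * d @ P @ d

    res = minimize(neg_log_posterior, mu.copy(), method="L-BFGS-B")
    return res.x
```

With `Sigma` diagonal this reduces to ordinary per-weight L2 regularization; the paper's contribution is learning an informative, non-diagonal `Sigma` via transfer.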


Cited by 210 publications (157 citation statements) | References 11 publications
“…Other work on transfer (e.g., Raina et al., 2006; Niculescu-Mizil and Caruana, 2007; Torrey et al., 2007; Richardson and Domingos, 2006) also shares some resemblance with our work. It focuses on improving learning performance by transferring previously acquired knowledge from another domain of interest.…”
Section: Discussion (supporting)
confidence: 82%
“…A third form of transfer learning is model adaptation, where auxiliary data is used to regularize the parameters of a target model, which can be either generative [60][61][62][63][64][65] or discriminative [66][67][68][69][70]. Although this is sometimes denoted domain adaptation, the latter usually refers to methods that regularize the target feature space, rather than the models themselves.…”
Section: Related Work (mentioning)
confidence: 99%
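Model adaptation in the sense described above — auxiliary data regularizing the parameters of a target model — has a simple closed form in the linear-regression case, where the target weights are shrunk toward source-model weights. A minimal sketch under that assumption (`adapt_ridge`, `w_src`, and `lam` are hypothetical names; this is one common instance of parameter-level adaptation, not any specific cited method):

```python
import numpy as np

def adapt_ridge(X, y, w_src, lam):
    """Fit target weights w minimizing
        0.5 * ||X w - y||^2 + 0.5 * lam * ||w - w_src||^2,
    i.e. least squares regularized toward source weights w_src.
    Setting the gradient to zero gives the normal equations
        (X^T X + lam * I) w = X^T y + lam * w_src.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d),
                           X.T @ y + lam * w_src)
```

As `lam -> 0` this recovers the unregularized target fit; as `lam` grows, the target model collapses onto the source model, interpolating between the two regimes.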
“…In image analysis, text analysis, and robotics, many methods have been devised for knowledge transfer. Related machine-learning subjects include: learning from hints [104], lifelong learning [105], multi-task learning [106], cross-domain learning [107,108], cross-category learning [109], and self-taught learning [110]. The EigenTransfer algorithm [111] tries to unify various transfer-learning ideas by representing the target task as a graph.…”
Section: Transfer Of Knowledge (mentioning)
confidence: 99%