2007
DOI: 10.1007/978-3-540-75225-7_14

On Universal Transfer Learning

Abstract: The aim of transfer learning is to reduce the sample complexity required to solve a learning task by using information gained from solving related tasks. Transfer learning has in general been motivated by the observation that when people solve problems, they almost always use information gained from solving related problems previously. Indeed, the thought of even children trying to solve problems tabula rasa seems absurd to us. Despite this fairly obvious observation, typical machine learning algorithms consider so…

Cited by: 14 publications (14 citation statements)
References: 53 publications
“…The training for each data set in every case was done using the training set, while the results reported are all for performance on the testing set. We also performed 135 other transfer experiments using other combinations of the percent of data used as training and testing sets [29]. The results of these experiments also, by and large, exhibit the same properties as the results in this paper, which we now discuss and interpret.…”
Section: Setup of the Experiments (supporting)
confidence: 69%
“…where the constant of inequality now depends only on U. Similar comments also apply to Theorem 3.9 and we refer the interested reader to [29] for a fuller development of the above.…”
Section: Competitive Optimality of Universal Priors (mentioning)
confidence: 79%
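
For context, the statement above refers to a domination inequality for universal priors. A standard form of that property is sketched below as background only; the exact constants and the precise statement of Theorem 3.9 in [29] may differ.

```latex
% Universal (Solomonoff) semimeasure induced by a reference machine U,
% where the sum ranges over programs p whose output on U begins with x:
M(x) \;=\; \sum_{p \,:\, U(p) = x*} 2^{-|p|},
% Dominance: for every lower semicomputable semimeasure \mu,
M(x) \;\ge\; 2^{-K(\mu) + O(1)}\,\mu(x),
% so, for a fixed \mu, the multiplicative constant depends on the reference
% machine U only through the length K(\mu) of the shortest program for \mu.
```

This is the usual sense in which a universal prior is called competitively optimal: it assigns every string at least a constant fraction of the probability that any computable prior assigns.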
“…Ling et al [12] developed a new spectral classification algorithm that optimized an objective function to seek for the maximal consistency between the supervised information from the source domain and the intrinsic structure of the target domain. Mahmud et al [13] studied transfer learning from the perspective of algorithmic information theory. They measured the relatedness between tasks, and then decided how much information to transfer and how to transfer.…”
Section: Computer Science and Technology (mentioning)
confidence: 99%
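
The relatedness measure described in this last statement is defined in terms of (conditional) Kolmogorov complexity, which is uncomputable; in practice it is commonly approximated with a real compressor. The minimal sketch below illustrates one such approximation, the normalized compression distance. The function names, the zlib compressor, and the toy data are illustrative assumptions, not the procedure of [13].

```python
import zlib


def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed representation, a computable
    stand-in for the Kolmogorov complexity K(data)."""
    return len(zlib.compress(data, level=9))


def task_relatedness(task_a: bytes, task_b: bytes) -> float:
    """Normalized compression distance (NCD) between two serialized tasks.

    NCD(a, b) = (C(ab) - min(C(a), C(b))) / max(C(a), C(b))

    Values near 0 suggest the tasks share much structure (stronger
    candidates for transfer); values near 1 suggest little shared
    information, so little is worth transferring.
    """
    c_a = compressed_size(task_a)
    c_b = compressed_size(task_b)
    c_ab = compressed_size(task_a + task_b)
    return (c_ab - min(c_a, c_b)) / max(c_a, c_b)


if __name__ == "__main__":
    # Hypothetical serialized training data from three tasks.
    source = b"spam ham spam spam ham " * 50
    target = b"spam ham ham spam spam " * 50
    unrelated = bytes(range(256)) * 5
    print(task_relatedness(source, target))     # relatively small
    print(task_relatedness(source, unrelated))  # closer to 1
```

The compressed sizes here play the role of complexity estimates; how much information to transfer would then be decided from such relatedness scores, for example by weighting or selecting source tasks with the smallest distances.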