2009 Ninth IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2009.32
Learning the Shared Subspace for Multi-task Clustering and Transductive Transfer Classification

Cited by 132 publications (73 citation statements)
References 24 publications
“…(18) and (19) can measure the influence of the independence information and correlation information, respectively. Furthermore, we can define an index, i.e., IND/CORR, to reflect the influence of both kinds of information.…”
Section: Further Analysis of the Independence and Correlation Information
Confidence: 99%
“…Based on such a principle, multi-task learning methods have emerged and are attracting more and more attention in the fields of machine learning, data mining, and pattern recognition. The existing multi-task learning methods can be categorized into three types: (1) multi-task classification learning [2,[5][6][7][14][15][16]21,22,33,36,43]; (2) multi-task clustering [3,18,19,23,42,44]; and (3) multi-task regression learning [8,27,34,37,45]. Although these works have demonstrated the significance of multi-task learning and a certain effectiveness in different real-world applications, current multi-task learning methods still cannot keep up with real-world requirements, particularly in regression tasks.…”
Section: Introduction
Confidence: 99%
“…Pan et al [20] proposed a dimensionality reduction approach to find out this latent feature space, in which supervised learning algorithms can be applied to train classification models. Gu et al [7] learnt the shared subspace among multiple domains for clustering and transductive transfer classification. In their problem formulation, all the domains have the same cluster centroid in the shared subspace.…”
Section: Cross-domain Learningmentioning
confidence: 99%
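The shared-subspace idea summarized in the citation above can be illustrated with a minimal sketch. This is not the authors' optimization from the cited paper: the function name, the pooling of all tasks, and the use of plain PCA as a stand-in for the learned projection are assumptions made for illustration only. The sketch shows the core idea that points from every task are projected into one low-dimensional subspace and assigned to a single set of centroids shared across tasks.

import numpy as np

def shared_subspace_clustering(task_data, k, dim, n_iter=20, seed=0):
    """Illustrative sketch (not the cited paper's exact method): cluster
    several tasks with one set of centroids shared in a low-dimensional
    subspace learned from the pooled data.

    task_data : list of (n_t, d) arrays, one per task, same feature space
    k         : number of clusters shared across all tasks
    dim       : dimensionality of the shared subspace
    """
    rng = np.random.default_rng(seed)
    X = np.vstack(task_data)                     # pool all tasks

    # Crude stand-in for subspace learning: top-`dim` principal directions
    # of the pooled, mean-centered data.
    _, _, Vt = np.linalg.svd(X - X.mean(0), full_matrices=False)
    W = Vt[:dim].T                               # (d, dim) shared projection
    Z = X @ W                                    # points in the shared subspace

    # Shared k-means: every task's points compete for the same centroids.
    centroids = Z[rng.choice(len(Z), k, replace=False)]
    for _ in range(n_iter):
        dists = ((Z[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = Z[labels == j].mean(0)

    # Split the pooled labels back into per-task label arrays.
    sizes = np.cumsum([len(t) for t in task_data])[:-1]
    return np.split(labels, sizes), W, centroids

# Toy usage: two small tasks drawn around the same two cluster centers.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t1 = np.vstack([rng.normal(0, 0.3, (30, 5)), rng.normal(3, 0.3, (30, 5))])
    t2 = np.vstack([rng.normal(0, 0.4, (20, 5)), rng.normal(3, 0.4, (20, 5))])
    (labels1, labels2), W, C = shared_subspace_clustering([t1, t2], k=2, dim=2)
    print(labels1.shape, labels2.shape, W.shape, C.shape)

Because the centroids live in the shared subspace and are updated from all tasks at once, clustering structure found in one task can inform the others, which is the intuition behind coupling the tasks through a common representation.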
“…Many papers have appeared in recent years, and they can be grouped into three types of techniques used for knowledge transfer, namely feature-selection based [11,4,29], feature-space mapping [20,22,7], and weight based [5,6,10].…”
Section: Cross-domain Learning
Confidence: 99%