Extreme learning machine based transfer learning for data classification
Year published: 2016
DOI: 10.1016/j.neucom.2015.01.096

Cited by 66 publications (26 citation statements). References 29 publications.
“…Among the parameter transfer approaches, the majority of related works incorporated the source model information into the target model by regularizing the difference between the parameters of the source and the target domain [13]-[16]. The representative method is the adaptive SVM (A-SVM) [13], which learns from the source-domain parameters by directly regularizing the distance between the learned target model and the source model.…”
Section: Related Work (mentioning, confidence: 99%)
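To make the parameter-difference regularization described in this excerpt concrete, the following is a minimal NumPy sketch, not the formulation of A-SVM or of the cited paper: the target ELM output weights beta are fit by minimizing ||H beta - T||^2 + lam * ||beta - beta_src||^2, which pulls the learned target model toward the source weights beta_src and has a closed-form ridge-style solution. The sigmoid hidden layer, the function names and the trade-off parameter lam are illustrative assumptions.

import numpy as np

def elm_hidden(X, W, b):
    # Random-feature hidden layer of an ELM: sigmoid(X @ W + b).
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def fit_output_weights(H, T, reg=1e-3):
    # Plain regularized ELM solution: (H^T H + reg*I)^-1 H^T T.
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + reg * np.eye(d), H.T @ T)

def transfer_output_weights(H_tgt, T_tgt, beta_src, lam=1.0):
    # Parameter-transfer fit on the target domain:
    # minimizes ||H_tgt beta - T_tgt||^2 + lam * ||beta - beta_src||^2,
    # i.e. the target weights are pulled toward the source weights.
    # Closed form: beta = (H^T H + lam*I)^-1 (H^T T + lam * beta_src).
    d = H_tgt.shape[1]
    A = H_tgt.T @ H_tgt + lam * np.eye(d)
    rhs = H_tgt.T @ T_tgt + lam * beta_src
    return np.linalg.solve(A, rhs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_feat, n_hidden, n_cls = 10, 50, 3
    # Hidden-layer parameters drawn once and shared by both domains.
    W = rng.normal(size=(n_feat, n_hidden))
    b = rng.normal(size=n_hidden)

    # Toy source domain with plenty of labeled data.
    X_src = rng.normal(size=(500, n_feat))
    T_src = np.eye(n_cls)[rng.integers(0, n_cls, 500)]  # one-hot targets
    beta_src = fit_output_weights(elm_hidden(X_src, W, b), T_src)

    # Toy target domain with only a few labeled samples.
    X_tgt = rng.normal(size=(20, n_feat))
    T_tgt = np.eye(n_cls)[rng.integers(0, n_cls, 20)]
    beta_tgt = transfer_output_weights(elm_hidden(X_tgt, W, b), T_tgt, beta_src, lam=1.0)
    print(beta_tgt.shape)  # (50, 3)

Setting lam = 0 reduces to an ordinary least-squares fit on the target data alone, while a large lam keeps the target weights close to the source weights.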
“…Apart from this, the parameter transfer approach is another line of work that has attracted considerable attention. It assumes that the transferred knowledge has been encoded into the hyperparameters of the classification model [13]-[16]. Therefore, the source model and the target model should share some parameters or a prior distribution over the model parameters.…”
Section: Introduction (mentioning, confidence: 99%)
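As a hedged illustration of the "shared prior distribution" reading of parameter transfer described above (the symbols H, T, beta_s and lambda are introduced here for illustration, not taken from the cited works): placing a Gaussian prior on the target output weights, centered at the source weights, and taking the MAP estimate under a unit-variance Gaussian likelihood yields exactly the parameter-difference regularization quoted in the earlier excerpt.

\[
p(\beta) = \mathcal{N}\!\left(\beta \mid \beta_s,\ \lambda^{-1} I\right),
\qquad
\hat{\beta}_{\mathrm{MAP}}
  = \arg\max_{\beta}\ \mathcal{N}\!\left(T \mid H\beta,\ I\right) p(\beta)
  = \arg\min_{\beta}\ \lVert H\beta - T\rVert^{2} + \lambda \lVert \beta - \beta_s \rVert^{2}.
\]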
“…It focuses on knowledge transfer between different but similar areas, tasks and distributions. When a task from a new domain arrives, relabeling the new-domain samples is costly, and it would be wasteful to discard all of the old-domain data (Li et al, 2015). Wang et al (2014) propose a transfer learning method for collaborative filtering, called Feature Subspace Transfer (FST), to overcome the sparsity problem in collaborative filtering.…”
Section: Related Work (mentioning, confidence: 99%)
“…Transfer learning has achieved remarkable results in addressing this challenge by transferring knowledge from source to target domains with different distributions [13]. Therefore, transfer learning has attracted increasing attention from researchers and has made great progress: Gao et al [14] proposed a locally weighted ensemble transfer learning algorithm, LWE; a feature-space-based transfer learning method, LMPROJ, was proposed by Brian et al [15]; Lu et al [16] proposed a selective transfer algorithm, STLCF, for collaborative filtering; Long et al [17] proposed ARTL, a transfer learning framework based on SVM and regularized least squares; Xie et al [18] applied transfer learning to incremental learning and proposed the STIL algorithm; Li et al [19] proposed a new transfer learning algorithm, TL-DAKELM, based on the extreme learning machine; and Li et al [20] proposed a transfer learning algorithm, RankRE-TL.…”
Section: Introduction (mentioning, confidence: 99%)