2003
DOI: 10.1007/978-3-540-45231-7_50
Distributed Regression for Heterogeneous Data Sets

Cited by 9 publications (4 citation statements); References 7 publications
“…Xing et al. [10] present such a framework for doing regression on heterogeneous datasets. However, these techniques perform poorly as the number of such data partitions increases to millions, as in typical P2P systems.…”

Section: B Related Work
confidence: 99%
“…The basic idea is to learn a model at each site locally (with no communication at all) and then, when a new sample arrives, predict the output by simply taking an average of the local outputs. Xing et al. [30] present such a framework for doing regression on heterogeneous datasets. However, these techniques perform poorly as the number of such data partitions increases to millions, as in typical P2P systems.…”

Section: Distributed Multi-variate Regression
confidence: 99%
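The scheme described in the statement above (fit a model per site with no communication, then average the local predictions at query time) can be sketched as follows. This is an illustrative example, not the algorithm from the cited paper: the synthetic data, the number of sites, and the use of ordinary least squares are all assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2*x + 1 plus noise, partitioned across 3 "sites".
def make_site(n):
    x = rng.uniform(0, 10, n)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.1, n)
    return x, y

sites = [make_site(50) for _ in range(3)]

# Each site fits an ordinary least-squares line locally (no communication).
local_models = []
for x, y in sites:
    A = np.column_stack([x, np.ones_like(x)])   # design matrix [x, 1]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    local_models.append(coef)                   # (slope, intercept)

def predict(x_new):
    # Sites are combined only at prediction time: average the local outputs.
    outputs = [m[0] * x_new + m[1] for m in local_models]
    return float(np.mean(outputs))

print(predict(4.0))  # should be close to 2*4 + 1 = 9
```

Because the averaging happens per query, each new prediction touches every partition, which illustrates why this approach degrades when the number of partitions grows to millions, as the citing works note.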
“…Therefore, CML can integrate information based on the different contexts of the local sites. The details of the formal and experimental analysis of CML can be found in our previous publications [2], [3], [5], [4].…”

Section: Context-sensitive Information Integration
confidence: 99%
“…However, existing approaches to distributed data mining (DDM) can only deal with the first feature. Therefore, we proposed a context-based meta-learning (CML) approach to address both of the characteristics [2], [3], [4].…”

Section: Introduction
confidence: 99%