2015
DOI: 10.1109/tsipn.2015.2470125

Ensemble of distributed learners for online classification of dynamic data streams

Abstract: We present an efficient distributed online learning scheme to classify data captured from distributed, heterogeneous, and dynamic data sources. Our scheme consists of multiple distributed local learners that analyze different streams of data correlated to a common event that needs to be classified. Each learner uses a local classifier to make a local prediction. The local predictions are then collected by each learner and combined using a weighted majority rule to output the final prediction…
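The abstract does not spell out the update rule, but the weighted-majority combining it describes is commonly implemented with multiplicative weight updates in the style of Littlestone and Warmuth. The sketch below is a minimal illustration under that assumption; the number of learners, the penalty factor beta, and the toy prediction model are illustrative choices, not details taken from the paper.

```python
import numpy as np

def weighted_majority(predictions, weights):
    """Fuse binary local predictions (+1/-1) with a weighted majority vote."""
    return 1 if np.dot(weights, predictions) >= 0 else -1

def update_weights(weights, predictions, label, beta=0.7):
    """Multiplicatively penalize learners whose local prediction was wrong
    (Littlestone-Warmuth style update); beta in (0, 1) is illustrative."""
    weights = weights * np.where(predictions != label, beta, 1.0)
    return weights / weights.sum()

# Toy usage: three hypothetical local learners, each correct ~70% of the time.
rng = np.random.default_rng(0)
weights = np.ones(3) / 3
for t in range(100):
    label = rng.choice([-1, 1])                      # true event label
    predictions = np.where(rng.random(3) < 0.7, label, -label)
    final = weighted_majority(predictions, weights)  # ensemble prediction
    weights = update_weights(weights, predictions, label)
```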

Cited by 22 publications (14 citation statements), with citing publications spanning 2015–2024. References 43 publications (90 reference statements).
“…The majority rule is the most widely used fusion rule for ensemble learning, dating back to the earliest studies of the subject (Blum 1995; Breiman 1996; Canzian et al. 2013; Fan et al. 1999; Freund and Schapire 1997; Hadavandi et al. 2015; Herbster and Warmuth 1998; Littlestone and Warmuth 1994; Schapire 1990; Stahl et al. 2015; Wang et al. 2003, 2015). Under the majority rule, the combiner's final decision is made by taking a vote among all the experts at each instant.…”
Section: Numerical Results (mentioning)
confidence: 99%
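As a concrete illustration of the plain (unweighted) majority rule described in this excerpt, the following sketch takes a vote among all experts at each instant. The ±1 label encoding and the tie-breaking choice are illustrative assumptions, not details from the cited works.

```python
import numpy as np

def majority_vote(expert_decisions):
    """Unweighted majority rule: output whichever label (+1 or -1)
    receives more votes at this instant; ties break toward +1."""
    return 1 if np.sum(expert_decisions) >= 0 else -1

print(majority_vote(np.array([1, -1, 1, 1, -1])))  # -> 1 (three votes to two)
```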
“…In this experiment, we set c = 0.05. We should point out that the experts use all 9 features all the time; however, in each experiment only one of the features is available as a context for the combiner, and the rest are unknown. We compare against unsupervised versions of the method of tracking the best expert (MTBE) (Herbster and Warmuth 1998) and the adaptive Perceptron weighted majority rule (APMR) (Canzian et al. 2013), and against the supervised optimal fusion rule (SOFR) (Chair and Varshney 1986), in terms of probability of error, p_e. In MTBE, at each instance, the decision of each expert is compared against the actual label in the supervised version (or the pool of the decisions in the unsupervised version).…”
Section: Numerical Results (mentioning)
confidence: 99%
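The MTBE scheme referenced above derives from Herbster and Warmuth's tracking-the-best-expert algorithm. Below is a minimal sketch of a fixed-share style update of that kind, assuming an exponential-weights step followed by uniform sharing; the hyperparameters eta and alpha are illustrative, and the comment shows how the unsupervised variant described in the passage would score experts against the pooled vote instead of the true label.

```python
import numpy as np

def mtbe_update(weights, decisions, label, eta=2.0, alpha=0.01):
    """One round of a fixed-share ("tracking the best expert") update.

    Supervised version: each expert's decision is compared against the
    actual label. For the unsupervised version, replace `label` with the
    pooled decision, e.g. label = 1 if np.sum(decisions) >= 0 else -1.
    """
    losses = (decisions != label).astype(float)   # 0/1 loss per expert
    w = weights * np.exp(-eta * losses)           # exponential-weights step
    w /= w.sum()
    # Fixed-share mixing: redistribute a small mass uniformly so the
    # combiner can switch quickly when the best expert changes over time.
    return (1.0 - alpha) * w + alpha / len(w)
```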
“…Online learning schemes to learn the optimal classifier operating points are proposed and analyzed in [16]-[18]. These works consider a network of processing nodes that observe distributed, heterogeneous, and dynamic data sources.…”
Section: Online Learning of the Classifier Operating Points (mentioning)
confidence: 99%
“…Reference [16] focuses on regression problems and proposes an online learning scheme in which the amount of information exchanged depends on the estimated correlation among processing nodes; this allows trading off system performance against the amount of exchanged information. Instead, [17] focuses on classification problems and proposes a learning scheme with the following features: • It requires a minimal exchange of information.…”
Section: Online Learning of the Classifier Operating Points (mentioning)
confidence: 99%
“…Index Terms: Ensemble learning, Model tree, Personalized predictive models. I. INTRODUCTION. Ensemble methods [1], [2], [3], [4] are general techniques in machine learning that combine several learners; these techniques include bagging [5], [6], [7], boosting [8], [9], and stacking [10]. Ensemble methods frequently improve predictive performance.…”
mentioning
confidence: 99%
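The excerpt above lists bagging, boosting, and stacking as ways of combining learners. As one concrete example, the sketch below implements bagging in its simplest form: bootstrap-resampled training sets, one base learner per resample, and a majority vote at prediction time. The use of scikit-learn's DecisionTreeClassifier as the base learner and the choice of 25 resamples are illustrative assumptions, not details from the cited works.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_models=25, seed=0):
    """Train one base learner per bootstrap resample of (X, y)."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))   # sample with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    """Majority vote over the base learners' predictions (binary 0/1 labels)."""
    votes = np.stack([m.predict(X) for m in models])
    return (votes.mean(axis=0) >= 0.5).astype(int)
```

Averaging many high-variance base learners trained on resampled data is what gives bagging its variance-reduction benefit; boosting and stacking combine learners differently (sequential reweighting and a learned meta-combiner, respectively).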