2008
DOI: 10.3233/ida-2008-12305

Improving the performance of an incremental algorithm driven by error margins



Cited by 10 publications, 2009–2024 (9 citation statements)
References 18 publications

“…In IADEM [58] and IADEMc [59] the tree construction is also based on the Hoeffding bound. The tree growth is stopped using the error rate given as a parameter to the algorithm.…”
Section: Main Algorithms in the Literature (mentioning)
confidence: 99%
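For readers unfamiliar with the bound referenced in this statement: after n independent observations of a variable with range R, the Hoeffding bound guarantees the true mean lies within eps = sqrt(R^2 ln(1/delta) / (2n)) of the sample mean with probability at least 1 - delta. Below is a minimal Python sketch of how a stream learner can combine that bound with a user-supplied error-rate parameter to stop growing the tree; the function names and stop criterion are illustrative assumptions, not IADEM's actual code.

```python
import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """Interval half-width eps: the true mean lies within eps of the
    sample mean with probability at least 1 - delta after n samples."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def should_stop_growing(error_estimate: float, n: int, max_error: float,
                        delta: float = 1e-6) -> bool:
    """Hypothetical stop criterion: stop expanding a leaf once its
    estimated error rate is, with confidence 1 - delta, below the
    user-supplied max_error. Error rates lie in [0, 1], so R = 1."""
    eps = hoeffding_bound(1.0, delta, n)
    return error_estimate + eps <= max_error
```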
“…In the last few years, the need to extract novel information from data streams has led to the design of different incremental learning architectures that can create data models online as new examples are coming to the system (Aggarwal 2007; del Campo-Ávila et al. 2008; Gama and Gaber 2007). This type of learning has received special attention not only because it enables practitioners to extract key information from problems in which data are continuously generated and where concepts may change over time, e.g., stock market and sensor data among others, but also because it enables them to deal with huge data sets by making them available as data streams.…”
Section: Introduction (mentioning)
confidence: 99%
“…To solve the problem, various research works have used interval estimation [5,8,6,9]. These approaches rank the possible splits by using a split evaluation function, and the best split is selected when an adequate confidence level can be guaranteed.…”
Section: Introduction (mentioning)
confidence: 99%
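A hedged sketch of the selection rule these approaches share: rank candidate splits by an evaluation function and commit to the leader only once its lead over the runner-up exceeds the width of a distribution-free confidence interval (here the Hoeffding interval; the gain range and delta are assumed placeholders, not values from any of the cited systems).

```python
import math
from typing import Optional

def confident_best_split(split_gains: dict, n: int, delta: float = 1e-6,
                         gain_range: float = 1.0) -> Optional[str]:
    """Return the name of the best split once the gap between the top
    two evaluation scores exceeds the Hoeffding interval width, i.e.
    once the ranking holds with probability at least 1 - delta.
    Return None to signal 'wait for more examples'."""
    eps = math.sqrt(gain_range ** 2 * math.log(1.0 / delta) / (2.0 * n))
    ranked = sorted(split_gains.items(), key=lambda kv: kv[1], reverse=True)
    if not ranked:
        return None
    if len(ranked) == 1 or ranked[0][1] - ranked[1][1] > eps:
        return ranked[0][0]
    return None
```

In a real stream learner the gains would be recomputed incrementally from each leaf's sufficient statistics as new examples arrive, rather than passed in as a finished dictionary.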
“…A higher confidence level used for splitting can provide, with higher probability, a decision tree model that is more similar to the one induced from the entire data stream. Several approaches have computed confidence intervals making no assumption regarding the form of the probability distribution of the input data [5,8,6,9].…”
Section: Introduction (mentioning)
confidence: 99%