On the power of incremental learning
2002
DOI: 10.1016/s0304-3975(01)00404-2

Cited by 24 publications (12 citation statements)
References 19 publications
“…Generally speaking, a data stream is a sequence of unbounded, real-time data items arriving at a very high rate that can be read only once by an application (Gaber et al 2003). The restriction at the end of this definition is known as the one-pass constraint (Aggarwal 2007) and is also imposed in other literature (Sharma 1998; Lange and Grieser 2002; Muhlbaier et al 2009). Research on learning from data streams has flourished for quite a few years.…”
Section: Introduction
confidence: 85%
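The one-pass constraint described in the statement above can be made concrete with a small sketch: the data source is consumed through a generator, so each item is seen exactly once and all model state must be maintained incrementally. The helper name `stream_mean_and_count` is a hypothetical illustration, not code from any of the cited papers.

```python
# Minimal sketch of the one-pass constraint (assumption: a running mean stands in
# for whatever incremental model is being maintained over the stream).
from typing import Iterable, Tuple


def stream_mean_and_count(stream: Iterable[float]) -> Tuple[float, int]:
    """Running mean over a stream, touching each item exactly once."""
    mean, count = 0.0, 0
    for x in stream:                 # single pass: no buffering, no rewinding
        count += 1
        mean += (x - mean) / count   # incremental (Welford-style) update
    return mean, count


if __name__ == "__main__":
    # A generator models an unbounded, read-once data source.
    items = (float(i % 7) for i in range(1_000_000))
    print(stream_mean_and_count(items))
```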
“…Therefore, several alternative approaches to incremental learning have been developed, including online learning algorithms that learn one instance at a time [16], [17], and partial-memory and boundary-analysis algorithms that memorize a carefully selected subset of extreme examples lying along the decision boundaries [18]-[22]. However, such algorithms have limited applicability for real-world NDE problems due to restrictions on classifier type, the number of classes that can be learned, or the amount of data that can be analyzed.…”
Section: A. Incremental Learning
confidence: 99%
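As a rough illustration of the "one instance at a time" style of online learning mentioned in the statement above, the sketch below applies the classical perceptron update per incoming example; the function `online_perceptron` is a hypothetical example, not the method of the cited references.

```python
# Minimal online-learning sketch: the linear model is updated one labelled
# instance at a time, so no training set has to be stored or revisited.
from typing import Iterable, List, Tuple


def online_perceptron(stream: Iterable[Tuple[List[float], int]],
                      n_features: int,
                      lr: float = 1.0) -> List[float]:
    """Learn a linear separator from (features, label) pairs with labels in {-1, +1}."""
    w = [0.0] * (n_features + 1)           # last weight acts as the bias term
    for x, y in stream:                    # one instance at a time
        xb = x + [1.0]                     # append constant bias input
        score = sum(wi * xi for wi, xi in zip(w, xb))
        if y * score <= 0:                 # misclassified: apply perceptron update
            w = [wi + lr * y * xi for wi, xi in zip(w, xb)]
    return w


if __name__ == "__main__":
    data = [([2.0, 1.0], 1), ([-1.5, -0.5], -1), ([1.0, 2.0], 1), ([-2.0, -1.0], -1)]
    print(online_perceptron(data, n_features=2))
```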
“…While a variety of definitions for incremental learning exist in the literature, we propose a general definition due to Muhlbaier et al [67], outlined by several authors [32,37,55,56]. Namely, a learning algorithm is incremental if, for a sequence of training instances (potentially batches of instances), it satisfies the following criteria:…”
Section: Introduction
confidence: 99%