Proceedings 2001 IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2001.989589

Incremental learning with support vector machines

Abstract: Support Vector Machines (SVMs) have become a popular tool for learning with large amounts of high-dimensional data. However, it may sometimes be preferable to learn incrementally from previous SVM results, because computing an SVM is very costly in terms of time and memory consumption, or because the SVM may be used in an online learning setting. In this paper an approach for incremental learning with Support Vector Machines is presented that improves on existing approaches. Empirical evidence is given to prove that th…
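To make the setting concrete, below is a minimal sketch of the most common baseline for incremental SVM learning: keep only the support vectors of the previous model and retrain on them together with the newly arrived batch. This is a hedged illustration using scikit-learn with toy data, an RBF kernel and C = 1.0 chosen for illustration; it is not necessarily the exact algorithm proposed in the paper.

```python
# Minimal sketch of support-vector-retention incremental learning.
# Data, kernel and C are illustrative assumptions, not taken from the paper.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_batch(n=200):
    """Toy two-class data; stands in for one incoming chunk of examples."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    return X, y

# Step 1: train on the first batch and keep only its support vectors.
X1, y1 = make_batch()
svm = SVC(kernel="rbf", C=1.0).fit(X1, y1)
sv_X, sv_y = X1[svm.support_], y1[svm.support_]

# Step 2: when a new batch arrives, retrain on (old support vectors + new data)
# instead of all previously seen examples, keeping the training set small.
X2, y2 = make_batch()
svm = SVC(kernel="rbf", C=1.0).fit(np.vstack([sv_X, X2]), np.hstack([sv_y, y2]))
print("support vectors after update:", len(svm.support_))
```

Because the number of support vectors is usually much smaller than the number of training examples, each retraining step stays cheap even as data keeps arriving.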

Cited by 230 publications (93 citation statements); references 5 publications (1 reference statement).

Citation statements, ordered by relevance:
“…Therefore, the possibility of incremental learning could be very helpful, especially when the decision tuples are obtained at different intervals. There exist several approaches to incremental learning based on Support Vector Machines [71,68,32,18,78,10].…”
Section: Apply
confidence: 99%
“…However, Rüping [68] argues that the assumption that a new decision relation is appropriate is wrong, i.e., the influence of previous support vectors on the decision function in the next learning step may be very small if the new decision tuples are distributed differently, so the final hyperplane does not differ significantly from the hyperplane obtained from the new decision relation. He claims that a new SVM result largely ignores the old support vectors and almost always corresponds to the decision function that would have been learned on the second decision relation alone.…”
Section: Apply
confidence: 99%
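The observation quoted above can be reproduced with a small, illustrative experiment: when the second batch comes from a shifted distribution, a model retrained on (old support vectors + new batch) tends to agree almost everywhere with a model trained on the new batch alone. The data, kernel and parameter choices below are assumptions made purely for illustration.

```python
# Illustrative check of the argument above: with an RBF kernel, old support
# vectors that lie far from the new data region have little influence on the
# decision function there, so retraining on (old SVs + new batch) can give
# nearly the same predictions as training on the new batch alone.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

X1 = rng.normal(size=(300, 2))
y1 = (X1[:, 0] > 0).astype(int)            # old concept
X2 = rng.normal(size=(300, 2)) + 3.0       # new batch, shifted distribution
y2 = (X2[:, 1] > 3.0).astype(int)          # new concept

old = SVC(kernel="rbf", C=1.0).fit(X1, y1)
sv_X, sv_y = X1[old.support_], y1[old.support_]

incremental = SVC(kernel="rbf", C=1.0).fit(np.vstack([sv_X, X2]),
                                           np.hstack([sv_y, y2]))
new_only = SVC(kernel="rbf", C=1.0).fit(X2, y2)

Xtest = rng.normal(size=(1000, 2)) + 3.0   # test data from the new distribution
agree = np.mean(incremental.predict(Xtest) == new_only.predict(Xtest))
print(f"agreement on new-distribution data: {agree:.2%}")
```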
“…But an efficient online algorithm can ensure that parameters are updated in a timely and effective way. This requires that one complete update pass be finished within each sampling period, which in turn demands relatively low computational complexity [7][8]. To realize this objective, Suykens [2] proposed an online learning algorithm using multi-mode constraints, OLAMMC, which converts the multi-mode constrained learning problem into the solution of a system of linear equations so as to improve computational speed.…”
Section: Introduction
confidence: 99%
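For context, the "solution by a system of linear equations" referred to in this statement is characteristic of the least-squares SVM family introduced by Suykens, where equality constraints turn training into solving a single linear system. The sketch below is a generic LS-SVM classifier, not the specific OLAMMC algorithm; the RBF kernel and the values of γ and σ are assumptions made for illustration.

```python
# Hedged sketch: LS-SVM classification reduces training to one linear solve,
# which is what makes fast (and online/incremental) updates attractive.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM KKT system [[0, y^T], [y, Omega + I/gamma]] [b; a] = [0; 1].

    Labels y must be in {-1, +1}.
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]            # bias b, coefficients alpha

def lssvm_predict(Xtrain, y, b, alpha, Xtest, sigma=1.0):
    """Decision: sign( sum_k alpha_k * y_k * K(x, x_k) + b )."""
    K = rbf_kernel(Xtest, Xtrain, sigma)
    return np.sign(K @ (alpha * y) + b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    b, alpha = lssvm_train(X, y)
    print("train accuracy:", np.mean(lssvm_predict(X, y, b, alpha, X) == y))
```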
“…Some researchers have tried incremental methods for SVM training. [11] proposed an incremental SVM learning scheme in which the support vectors from the previous training set are carried into the new SVM optimization problem with different weights on them. This method can work for balanced data.…”
Section: Introduction
confidence: 99%
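A hedged sketch of the weighting idea described in this statement: the support vectors retained from the previous model enter the next optimization problem with a different per-sample weight than the new examples. The helper name `incremental_fit` and the weight value 2.0 are illustrative assumptions, not taken from [11].

```python
# Old support vectors are carried into the next SVM fit with their own weight,
# implemented here via scikit-learn's per-sample weights.
import numpy as np
from sklearn.svm import SVC

def incremental_fit(old_svm, X_old, y_old, X_new, y_new, sv_weight=2.0, C=1.0):
    """Retrain on (old support vectors + new batch) with per-sample weights."""
    sv_X, sv_y = X_old[old_svm.support_], y_old[old_svm.support_]
    X = np.vstack([sv_X, X_new])
    y = np.hstack([sv_y, y_new])
    w = np.hstack([np.full(len(sv_y), sv_weight),   # up-weight retained SVs
                   np.ones(len(y_new))])            # unit weight for new data
    return SVC(kernel="rbf", C=C).fit(X, y, sample_weight=w)

# Usage (illustrative):
#   svm0 = SVC(kernel="rbf", C=1.0).fit(X_batch1, y_batch1)
#   svm1 = incremental_fit(svm0, X_batch1, y_batch1, X_batch2, y_batch2)
```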