Proceedings of the Eleventh ACM SIGKDD International Conference on Knowledge Discovery in Data Mining 2005
DOI: 10.1145/1081870.1081878
Rule extraction from linear support vector machines

Cited by 111 publications (60 citation statements) · References 13 publications
“…However, recently, a set of methods were developed to simplify model complexity while managing non-linearity [16,17]. One representative scheme is to build multiple local models to approximate the global one [11,18,19,20,21]. Martens et al. [8] provided a comprehensive study of rule extraction from SVMs.…”

Section: Support Vector Machines
confidence: 99%
“…Quite a lot of recent work has addressed the problem of extracting explanations from classifiers. For instance, there are many approaches for extracting explanations from SVM (e.g., [4,9]). Most of these works use the SVM's "support vectors" to produce rules.…”

Section: Interpretable Models
confidence: 99%
“…Malone et al. [3] have used a Kohonen network for data mining and have used Kohonen feature maps to formulate rules. Fung et al. [4] used Support Vector Machines (SVM) to extract rules from datasets by expressing the variable space as hyper-cubes. Ali et al. [5] have shown it is possible to extract useful rules using decision tree induction by suggesting improvements to the existing C4.5 decision tree algorithm.…”

Section: Introduction
confidence: 99%
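The hyper-cube idea mentioned in the excerpt above can be made concrete with a minimal sketch. A rule of the form "l ≤ x ≤ u ⇒ class +1" is sound for a linear SVM w·x + b ≥ 0 exactly when the worst-case corner of the box still lands on the positive side. This is an illustrative check only, not the paper's actual rule-extraction optimization; the weight values below are made up for the example.

```python
def rule_is_valid(w, b, lower, upper):
    """Return True if every point x with lower <= x <= upper satisfies
    w.x + b >= 0, i.e. the hyper-cube rule lies inside the SVM's
    positive half-space."""
    # The minimum of w.x over the box is attained at the corner that
    # takes lower[i] where w[i] > 0 and upper[i] where w[i] < 0.
    worst = sum(wi * (lo if wi > 0 else hi)
                for wi, lo, hi in zip(w, lower, upper))
    return worst + b >= 0

# Hypothetical trained linear-SVM weights (not from the paper).
w, b = [1.0, -2.0], -1.0

print(rule_is_valid(w, b, [3.0, 0.0], [5.0, 1.0]))  # → True: whole box is +1
print(rule_is_valid(w, b, [0.0, 0.0], [5.0, 1.0]))  # → False: a corner is negative
```

A valid box translates directly into a human-readable conjunctive rule, e.g. "if 3 ≤ x1 ≤ 5 and 0 ≤ x2 ≤ 1 then class +1", which is the interpretability payoff the citing papers refer to.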