2021
DOI: 10.1109/tcyb.2020.3033003
Interpretable Rule Discovery Through Bilevel Optimization of Split-Rules of Nonlinear Decision Trees for Classification Problems

Cited by 16 publications (9 citation statements)
References 29 publications
“…Rule extraction for interpretability purposes is very common in tree-based approaches [36][37][38] , however rule extraction from neural networks is rare, with the majority of methods being limited to rule extraction from networks with one hidden layer 18 . Zilke et al 39 extract rules from networks with more than one hidden layer (i.e., DNNs), however, the approach has scalability issues due to prohibitively high memory and time consumption 4,39 .…”
Section: Discussion
confidence: 99%
“…Recently, an evolutionary algorithm based non-linear decision tree classifier was proposed in [8]. The classifier is represented in the form of a non-linear decision tree as shown in Figure 7.…”
Section: E. Nonlinear Decision Tree (NLDT) Approach
confidence: 99%
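The citation above describes the classifier of [8] as a nonlinear decision tree: each internal node carries a nonlinear split-rule rather than a single-feature threshold, so one node can express a curved decision boundary. A minimal sketch of how such a tree routes a sample, assuming a simple hand-written polynomial rule (the class names, the example rule, and the toy data are illustrative only, not taken from [8]):

```python
import numpy as np

class Node:
    """One node of a nonlinear decision tree (illustrative, not the authors' code)."""
    def __init__(self, split_rule=None, left=None, right=None, label=None):
        self.split_rule = split_rule  # callable x -> float; nonlinear in general
        self.left = left
        self.right = right
        self.label = label            # set only on leaf nodes

def predict(node, x):
    """Route x down the tree: branch left when the split-rule value is <= 0."""
    if node.label is not None:        # reached a leaf
        return node.label
    branch = node.left if node.split_rule(x) <= 0 else node.right
    return predict(branch, x)

# Toy two-class problem solved by ONE quadratic split-rule:
# points inside the unit circle are class 0, points outside are class 1.
tree = Node(
    split_rule=lambda x: x[0] ** 2 + x[1] ** 2 - 1.0,
    left=Node(label=0),
    right=Node(label=1),
)

print(predict(tree, np.array([0.2, 0.3])))  # inside the circle  -> 0
print(predict(tree, np.array([1.5, 0.0])))  # outside the circle -> 1
```

A conventional axis-parallel tree (CART-style) would need many threshold splits to approximate this circular boundary; the nonlinear split-rule captures it in a single node, which is the interpretability advantage the cited work targets. In [8] the rule coefficients and structure are found by bilevel optimization rather than written by hand as above.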
“…• Training is slower as compared to CART and SVM. Details regarding the bilevel optimization algorithm and parameter settings can be found from [8].…”
Section: Advantages
confidence: 99%