2016 IEEE 16th International Conference on Data Mining (ICDM) 2016
DOI: 10.1109/icdm.2016.0097
Interpretable Clustering via Discriminative Rectangle Mixture Model

Cited by 20 publications (15 citation statements). References 6 publications.
“…Most similar to our work, Bertsimas et al. formulate the problem of finding an optimal decision tree to perform clustering as a mixed integer optimization problem and construct an approximate solution via coordinate descent (Bertsimas, Orfanoudaki, and Wiberg 2021). In a similar vein, recent work has looked at building rule sets to explain clusters (Chen et al. 2016; Chen 2018; Carrizosa et al. 2021b; Pelleg and Moore 2001). In contrast to the existing state of the art, our approach is the first to look at a more general function class to explain clusters.…”
Section: Related Workmentioning
confidence: 69%
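The tree-based cluster explanation described in the statement above can be sketched minimally: given points already assigned to two clusters, an exhaustive search over axis-aligned thresholds yields a depth-1 decision tree that best separates them. This is a toy illustration only, not the mixed-integer formulation of Bertsimas et al.; the data and function names are invented for the example.

```python
# Toy sketch: explain a 2-cluster assignment with a single axis-aligned
# split (a depth-1 "decision tree"). All names and data are illustrative.

def best_split(points, labels):
    """Return (feature, threshold, accuracy) of the axis-aligned split
    that best separates the two cluster labels."""
    n_features = len(points[0])
    best = (None, None, 0.0)
    for f in range(n_features):
        values = sorted({p[f] for p in points})
        # candidate thresholds: midpoints between consecutive values
        for lo, hi in zip(values, values[1:]):
            t = (lo + hi) / 2.0
            # predict cluster 0 on the left of t, cluster 1 on the right
            pred = [0 if p[f] <= t else 1 for p in points]
            acc = sum(a == b for a, b in zip(pred, labels)) / len(labels)
            acc = max(acc, 1.0 - acc)  # flipping the two sides is also allowed
            if acc > best[2]:
                best = (f, t, acc)
    return best

points = [(0.1, 5.0), (0.2, 4.0), (0.3, 6.0),   # cluster 0
          (2.1, 5.5), (2.4, 4.2), (2.2, 6.1)]   # cluster 1
labels = [0, 0, 0, 1, 1, 1]
f, t, acc = best_split(points, labels)
print(f"split on feature {f} at {t:.2f} (accuracy {acc:.0%})")
```

The resulting rule ("feature 0 ≤ 1.20") is directly readable, which is the appeal of tree- and rule-based cluster explanations over opaque centroids.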
“…It achieves superior performance when compared to K-Means. In the study of [21], two rule-based methods are applied to describe clusters: one defines the clustering model through a set of interpretable rules for each cluster, while the other constructs the clustering model from matrix decision rules over all the features.…”
Section: ) Global Perspectivementioning
confidence: 99%
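A minimal sketch of the rule-per-cluster idea mentioned in [21] (and of rectangle-style cluster descriptions generally): each cluster is summarized by per-feature min/max bounds, i.e., an axis-aligned rectangle rule that a human can read off directly. All names and data here are illustrative, not taken from the cited work.

```python
# Toy sketch: describe each cluster with an interpretable axis-aligned
# rectangle rule (per-feature [min, max] bounds). Names are illustrative.

def rectangle_rules(points, labels):
    """Map each cluster label to a list of [min, max] bounds per feature."""
    rules = {}
    for p, c in zip(points, labels):
        if c not in rules:
            rules[c] = [[v, v] for v in p]
        else:
            for f, v in enumerate(p):
                rules[c][f][0] = min(rules[c][f][0], v)
                rules[c][f][1] = max(rules[c][f][1], v)
    return rules

def matches(rule, point):
    """Check whether a point falls inside a cluster's rectangle."""
    return all(lo <= v <= hi for (lo, hi), v in zip(rule, point))

points = [(1.0, 1.0), (1.5, 2.0), (5.0, 5.0), (6.0, 4.5)]
labels = [0, 0, 1, 1]
rules = rectangle_rules(points, labels)
# cluster 0 reads as: 1.0 <= x0 <= 1.5 and 1.0 <= x1 <= 2.0
print(rules[0])
```

Tightening or pruning such boxes (e.g., dropping bounds that never exclude a point) is what turns a bounding box into a compact, discriminative rule.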
“…This makes the deep learning model itself not a complete black box and the interpretable approaches based on the DL model are unique.

| Method(s) | Scope | Explanation form | √ | Granularity |
|---|---|---|---|---|
| [16] | local | linear model | | feature |
| CPAR [17] | global | decision tree/rule | | feature |
| Trepan [22], [23], [24], REFNE [26] | global | decision tree/rule | √ | feature |
| [19], [20], [21] | global | decision tree/rule | | feature |
| Anchors [30], PALM [31], [32] | local | decision tree/rule | √ | feature |
| MMD-critic [37] | global | data point | √ | data point |
| influence function [38] | global | data point | | data point |
| SHAP [40], [42] | local | Shapley value | √ | feature |
| [48], [49] | global | KG | | feature |
| [50], [51], [52] | global | KG | | semantic relations |
| RKGE [53], KPRN [54] | global | KG | | decision path |
| [56] | local | KG | | semantic relations |
| [57] | local | KG | | semantic relations |
| [64] | local | NN | | feature |
| CAM [71], Grad-CAM [70], DeepLIFT [72], LRP [73], IBD [74] | local | NN | | feature |
| SVCCA [75] | global | NN | | neuronal relations |
| ACD [76] | global/local | NN | | feature |
| [78], [79], [80] | global | NN | | neuronal semantic |
| [81], [82], [83] | local | NN | | feature |
| [85], [86], [87] | local | NN | | data/feature…”
Section: A Comparison Analysismentioning
confidence: 99%
“…There also exist methods that can provide post-hoc explanations, for example using a consistent set of decision rules that lead to a prediction (Lakkaraju, Bach, and Leskovec 2016) or a clustering (Kim, Shah, and Doshi-Velez 2015; Chen et al. 2016). These rules together characterise the decision boundary for a cluster, whereas we are interested in those patterns that characterise the similarities and differences between components.…”
Section: Related Workmentioning
confidence: 99%