2011 IEEE 11th International Conference on Data Mining
DOI: 10.1109/icdm.2011.52

Direct Robust Matrix Factorization for Anomaly Detection

Cited by 90 publications (61 citation statements)
References 17 publications
“…In order to preserve the geometry of labelsets in their mappings, a manifold regularization term constructed from the graph Laplacian of labelsets is imposed on the decomposition. The decomposition is fast due to an application of the bilateral random projection (BRP) based low-rank approximation [22] [17]. Its convergence to a stationary point is guaranteed.…”
Section: Main Contributions (mentioning)
confidence: 99%
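For readers unfamiliar with BRP: it builds a rank-r approximation from two small random sketches of the data matrix in closed form, which is what makes the decomposition above fast. Below is a minimal NumPy sketch of that closed form, L = Y1 (A2^T Y1)^{-1} Y2^T, assuming the basic scheme from the BRP/GoDec papers without the optional power-scheme refinement; the function and variable names are ours.

import numpy as np

def brp_low_rank(X, r, rng=None):
    """Rank-r approximation of X via bilateral random projection (BRP).

    Minimal sketch of the closed form L = Y1 (A2^T Y1)^{-1} Y2^T;
    omits the optional power-scheme refinement from the BRP papers.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, n = X.shape
    A1 = rng.standard_normal((n, r))   # right random projection
    A2 = rng.standard_normal((m, r))   # left random projection
    Y1 = X @ A1                        # m x r sketch of the column space
    Y2 = X.T @ A2                      # n x r sketch of the row space
    # Only an r x r system is inverted; pinv guards against ill-conditioning.
    return Y1 @ np.linalg.pinv(A2.T @ Y1) @ Y2.T

The cost is dominated by the two matrix-sketch products, which is why BRP-based approximation is much cheaper than a full SVD when r is small.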
“…In these works, the initial formulation of optimizing the matrix rank and an ℓ0 norm is relaxed to optimizing a nuclear norm and an ℓ1 norm. Very recent work [19] [20] shows that this relaxation may not be necessary and that direct factorization is possible.…”
Section: Introduction (mentioning)
confidence: 99%
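To make "direct factorization" concrete: rather than relaxing rank to the nuclear norm and the ℓ0 norm to the ℓ1 norm, the rank and cardinality constraints are kept exact and handled by alternating a truncated SVD with a hard-thresholding step. The following NumPy sketch illustrates such an alternating scheme in the spirit of DRMF; the function name, parameters, and fixed iteration count are our simplifications, not the published algorithm verbatim.

import numpy as np

def drmf(X, rank, n_outliers, n_iter=50):
    """Alternating scheme in the spirit of direct robust factorization:
    enforce rank(L) <= rank and ||S||_0 <= n_outliers exactly, instead of
    using nuclear-/l1-norm relaxations. Simplified illustrative sketch.
    """
    S = np.zeros_like(X, dtype=float)
    L = np.zeros_like(X, dtype=float)
    for _ in range(n_iter):
        # L-step: best rank-`rank` fit to X - S via truncated SVD.
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # S-step: keep only the n_outliers largest-magnitude residuals.
        R = X - L
        S = np.zeros_like(X, dtype=float)
        if n_outliers > 0:
            idx = np.argpartition(np.abs(R), -n_outliers, axis=None)[-n_outliers:]
            S.flat[idx] = R.flat[idx]
    return L, S

Each sub-problem has an exact solution (truncated SVD for L, entry-wise hard thresholding for S), which is why the non-convex constraints can be handled directly.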
“…Our derivation initially follows that of RASL ([12]); however, our algorithm differs from RASL in several aspects. Our algorithm does not rely on relaxing the matrix rank to the nuclear norm and the ℓ0 norm to the ℓ1 norm; instead we apply direct factorization as in [19] [20]. Our formulation also considers entry-wise noise, which is not considered in RASL.…”
Section: Introduction (mentioning)
confidence: 99%
“…By contrast, the single-view based methods [157,158,143,144] are only designed to detect attribute-outliers. By representing class-outliers and attribute-outliers in the latent space and the original feature space respectively, our proposed method characterizes both types of outliers simultaneously.…”
Section: Definition of Multi-view Outlier (mentioning)
confidence: 99%
“…• DRMF, the Direct Robust Matrix Factorization method proposed by Xiong et al. [158], is a single-view outlier detection method. It has been shown to outperform several other single-view baselines, e.g.…”
Section: Real-world Data and Settings (mentioning)
confidence: 99%
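As a hypothetical usage note (not taken from the cited comparison): when a DRMF-style factorization is used as a single-view outlier detector, a common choice is to score each sample by how much of it is absorbed into the sparse term S and to flag the top-scoring rows. The snippet below illustrates this on synthetic data, reusing the drmf sketch given earlier; the planted outliers and the top-k rule are assumptions for illustration.

import numpy as np

# Hypothetical scoring rule: rows that put the most mass into the sparse
# term S are reported as anomalies; the threshold/top-k choice is up to the user.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))
X[:5] += 10.0                           # plant a few obvious outlier rows
L, S = drmf(X, rank=5, n_outliers=200)  # drmf: sketch defined above
scores = np.linalg.norm(S, axis=1)      # row-wise outlier scores
print(np.argsort(scores)[-5:])          # indices of the most anomalous rows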