2021
DOI: 10.1109/tpami.2021.3066111
Graph Regularized Autoencoder and its Application in Unsupervised Anomaly Detection

Abstract: Dimensionality reduction is a crucial first step for many unsupervised learning tasks, including anomaly detection and clustering. The autoencoder is a popular mechanism for accomplishing dimensionality reduction. For dimensionality reduction to be effective on high-dimensional data embedded in a nonlinear low-dimensional manifold, it is understood that some sort of geodesic distance metric should be used to discriminate the data samples. Inspired by the success of geodesic distance approximators such as ISOMAP, we…

Cited by 18 publications (13 citation statements). References 44 publications.
“…Such 3D information is of pivotal importance to predicting crystal structure outcomes, such as bond lengths, polyhedral distortions, and long-range bonding motifs. 3D graph neural networks (3D GNNs), which explicitly encode 3D positional information about molecular and materials composition and structure, or GNNs capable of learning a low-dimensional embedding manifold in a high-dimensional design space, 99 will enable reaction precursors to be mapped to intermediates and final products, thereby establishing proxies for the most probable reaction trajectories. The mathematical behavior of traditional material precursor-crystal-structure-property functions is inadequate to model far-from-equilibrium materials.…”
Section: Prospects For Improved Navigation Of the Design Space
confidence: 99%
“…3. Given the training data matrix X ∈ R^{N×m}, according to (23) and (9), we have the system feature matrix T ∈ R^{N×a} and residual matrix X̃ ∈ R^{N×m}. Let x ∈ R^m be a sample of X; its Hotelling's T² statistic and squared prediction error (SPE) statistic can be defined as follows:…”
Section: DAE-PCA Based Nonlinear Fault Detection
confidence: 99%
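The citing work's equations (23) and (9) are not reproduced in this excerpt, so the following is a minimal sketch of the standard textbook definitions of Hotelling's T² and SPE for a PCA-style decomposition; the data, dimensions, and function names are illustrative, not taken from the paper.

```python
import numpy as np

# Illustrative PCA monitoring statistics (standard definitions, not the
# cited paper's exact equations). N samples, m variables, a components.
rng = np.random.default_rng(0)
N, m, a = 200, 5, 2
X = rng.standard_normal((N, m))
X = X - X.mean(axis=0)            # mean-center the training data

# PCA via SVD: columns of P span the a-dimensional principal subspace.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:a].T                      # loading matrix, m x a
T = X @ P                         # score (feature) matrix, N x a
Lambda = np.diag(T.var(axis=0, ddof=1))  # diagonal score covariance

def t2_statistic(x):
    """Hotelling's T^2: Mahalanobis distance of the scores of x."""
    t = P.T @ x
    return float(t @ np.linalg.inv(Lambda) @ t)

def spe_statistic(x):
    """Squared prediction error: squared norm of the residual of x."""
    x_hat = P @ (P.T @ x)         # projection onto the principal subspace
    return float((x - x_hat) @ (x - x_hat))
```

A fault is typically flagged when either statistic exceeds a control limit estimated from the nominal training data; T² monitors variation inside the principal subspace, while SPE monitors the residual subspace.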
“…As defined in (23), v is a latent variable for constructing the process variables (u, y) in nominal operation. The fact (24) reveals that v preserves exactly the amount of information needed for constructing the process variables (u, y) in nominal operation. As the latent variable, v is obtained by a maximally compressed mapping of (u, y) and, through its reconstruction, preserves as much of the information in (u, y) as possible.…”
Section: An Introduction Of the Basic Idea
confidence: 99%
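The compress-then-reconstruct idea behind the latent variable v can be sketched with a linear "autoencoder" over stacked process variables; all names, dimensions, and the synthetic data below are illustrative assumptions, not the cited paper's construction. The optimal linear encoder/decoder pair comes from the top principal directions of the data.

```python
import numpy as np

# Hypothetical sketch: map process variables (u, y) to a latent v of
# lower dimension, then reconstruct them. Because y is (nearly) a
# function of u here, a 3-dimensional v suffices for 5 raw variables.
rng = np.random.default_rng(1)
N, du, dy, dv = 500, 3, 2, 3      # samples, dims of u, y, and latent v
U_data = rng.standard_normal((N, du))
Y_data = U_data[:, :dy] + 0.01 * rng.standard_normal((N, dy))
Z = np.hstack([U_data, Y_data])   # stacked process variables (u, y)
Z = Z - Z.mean(axis=0)

# Linear encoder W (dv x d) and decoder W.T from the SVD of the data.
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
W = Vt[:dv]                       # encoder: v = W z
V_lat = Z @ W.T                   # latent variables, N x dv
Z_hat = V_lat @ W                 # reconstruction of (u, y) from v

recon_err = np.mean((Z - Z_hat) ** 2)
print("mean reconstruction error:", recon_err)
```

The small reconstruction error illustrates the quoted point: the latent variable is a maximally compressed mapping that still preserves the information needed to rebuild (u, y) under nominal operation.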