2022
DOI: 10.1109/tnnls.2021.3112897
Entropic Out-of-Distribution Detection: Seamless Detection of Unknown Examples


Cited by 11 publications (6 citation statements) · References 32 publications
“…A widely used baseline expresses the OOD score for an image directly from discriminative predictions [40], based on dense logits computed from the input image. Entropy-based detectors can deliver similar performance [41, 42]. Another line of work improves upon these baselines by pre-processing the input with anti-adversarial perturbations [32], which causes significant computational overhead.…”
Section: Related Work
confidence: 99%
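The baseline scores described in the statement above can be sketched as follows. This is a minimal NumPy illustration of the maximum-softmax and entropy-based detectors, not the cited implementations, and all function names are mine:

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def max_softmax_score(logits):
    """Baseline OOD score: maximum softmax probability
    (higher suggests the input is in-distribution)."""
    return softmax(logits).max(axis=-1)

def entropy_score(logits):
    """Entropy-based OOD score: predictive entropy
    (higher suggests the input is out-of-distribution)."""
    p = softmax(logits)
    return -(p * np.log(p + 1e-12)).sum(axis=-1)

# A peaked (confident) prediction vs. a nearly flat one.
confident = np.array([[8.0, 0.5, 0.2]])
uncertain = np.array([[1.0, 0.9, 1.1]])
```

On these toy logits the confident prediction gets a higher maximum-softmax score and a lower entropy than the uncertain one, which is the behavior a threshold-based detector exploits.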
“…The first system was adapted from the learning confidence out-of-distribution detection model [24], which estimates prediction confidence for neural networks and produces intuitively interpretable outputs. The other approach incorporates the entropic out-of-distribution detection model [40], which replaces the SoftMax loss with a novel loss function addressing two weaknesses of the SoftMax loss: its anisotropy and its tendency to produce low-entropy probability distributions, which violates the principle of maximum entropy. In OOD detection problems, only ID data is used for training and validation, while a combination of ID and OOD data is used for testing.…”
Section: Out-of-Distribution Detection
confidence: 99%
“…al. in [40] solves the drawbacks of the SoftMax loss by replacing it with the Isotropy Maximization (IsoMax) loss. The proposed IsoMax loss is isotropic and follows the maximum entropy principle.…”
Section: Confidence-Based Approach
confidence: 99%
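The IsoMax idea quoted above, isotropic distance-based logits combined with an entropic detection score, can be sketched roughly as follows. This is a NumPy illustration under my own assumptions (function names and the `entropic_scale` value are illustrative, not the paper's settings):

```python
import numpy as np

def isomax_logits(features, prototypes, entropic_scale=10.0):
    """IsoMax-style logits: negative scaled Euclidean distances from
    each feature vector to learnable class prototypes. The logit depends
    only on the distance to a prototype, not the direction, so the
    resulting score is isotropic by construction."""
    dists = np.linalg.norm(features[:, None, :] - prototypes[None, :, :], axis=-1)
    return -entropic_scale * dists

def entropic_ood_score(logits):
    """Entropic detection score: predictive entropy of the softmax output.
    Higher entropy suggests an out-of-distribution input."""
    z = logits - logits.max(axis=-1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return -(p * np.log(p + 1e-12)).sum(axis=-1)

# Two class prototypes; one input near prototype 0, one equidistant.
prototypes = np.array([[0.0, 0.0], [10.0, 10.0]])
near = np.array([[0.2, -0.1]])   # close to class 0
ambiguous = np.array([[5.0, 5.0]])  # equidistant from both prototypes
```

The input near a prototype is classified with low entropy, while the equidistant input yields a near-uniform softmax and thus a high entropic score, which is what the detector thresholds on.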
“…This approach achieved only 63% detection accuracy on the image data, which is not desirable for security-critical applications (5). Investigating another method, which thresholds the distance from a Gaussian distribution fitted to the target-class representations, the author of (6) studies a technique for identifying adversarial samples and states that the Mahalanobis distance detection technique is the most vulnerable to attack. Fabio Carrara and fellow authors have proposed ENAD, an ensemble approach for adversarial detection that integrates layer-specific scores from three independent detectors (LID, Mahalanobis, and OCSVM), achieving significantly improved performance on benchmark datasets, methods, and attacks, but requiring training (7).…”
Section: Introduction
confidence: 99%
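The Gaussian-distance detector described in the statement above can be sketched as follows. This is a minimal NumPy illustration with hypothetical names, not the cited implementation:

```python
import numpy as np

def fit_gaussian(features, ridge=1e-6):
    """Fit a Gaussian to in-distribution feature vectors: the sample mean
    and the inverse of a ridge-regularized sample covariance."""
    mean = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    cov_inv = np.linalg.inv(cov + ridge * np.eye(cov.shape[0]))
    return mean, cov_inv

def mahalanobis_score(x, mean, cov_inv):
    """Squared Mahalanobis distance of x from the fitted Gaussian.
    Thresholding this distance flags out-of-distribution or adversarial
    samples that lie far from the class representation."""
    d = x - mean
    return float(d @ cov_inv @ d)

# Fit on synthetic 2-D "in-distribution" features.
rng = np.random.default_rng(0)
feats = rng.normal(size=(500, 2))
mean, cov_inv = fit_gaussian(feats)
```

A sample at the fitted mean scores exactly zero, while a sample far from the training distribution scores much higher, so a single threshold on the score separates the two.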