Proceedings of the ACM/IEEE 12th International Conference on Cyber-Physical Systems 2021
DOI: 10.1145/3450267.3450535
Real-time detectors for digital and physical adversarial inputs to perception systems

Abstract: Deep neural network (DNN) models have proven to be vulnerable to adversarial attacks. In this paper, we propose VisionGuard, a novel attack- and dataset-agnostic, computationally light defense mechanism for adversarial inputs to DNN-based perception systems. In particular, VisionGuard relies on the observation that adversarial images are sensitive to lossy compression transformations. Specifically, to determine if an image is adversarial, VisionGuard checks if the output of the target classifier on a given i…
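The detection idea sketched in the abstract can be illustrated in a few lines: run the classifier on an image and on a lossily compressed copy, and flag the image when the two output distributions diverge sharply. The coarse quantization used here as a stand-in for JPEG, the KL-divergence score, and the threshold value are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def lossy_compress(img: np.ndarray, levels: int = 16) -> np.ndarray:
    """Stand-in for a lossy codec such as JPEG: coarsely quantize pixel values."""
    step = 256 // levels
    return (img // step) * step

def kl_divergence(p: np.ndarray, q: np.ndarray, eps: float = 1e-12) -> float:
    """KL divergence between two softmax output vectors."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def is_adversarial(classifier, img: np.ndarray, threshold: float = 0.1) -> bool:
    """Flag an image whose classifier output shifts sharply under compression.

    `classifier` maps an image array to a probability vector; a benign image
    should give nearly identical outputs before and after lossy compression,
    while an adversarial perturbation tends to be destroyed by it.
    """
    p_orig = classifier(img)
    p_comp = classifier(lossy_compress(img))
    return kl_divergence(p_orig, p_comp) > threshold
```

In this sketch the defense needs only black-box access to the classifier's output distribution, which is consistent with the abstract's claim of being attack- and dataset-agnostic and computationally light.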

Cited by 9 publications (1 citation statement)
References 31 publications
“…Recently, the authors in [70] proposed an adversarial example detection technique named VisionGuard. The motivation behind this approach is the observation that adversarial images are sensitive to lossy compression based transformations.…”
Section: Unsupervised Approaches (mentioning, confidence: 99%)