2019
DOI: 10.48550/arxiv.1909.04326
Preprint

Universal Physical Camouflage Attacks on Object Detectors

Abstract: In this paper, we study physical adversarial attacks on object detectors in the wild. Prior arts on this matter mostly craft instance-dependent perturbations only for rigid and planar objects. To this end, we propose to learn an adversarial pattern to effectively attack all instances belonging to the same object category (e.g., person, car), referred to as Universal Physical Camouflage Attack (UPC). Concretely, UPC crafts camouflage by jointly fooling the region proposal network, as well as misleading the clas…

Cited by 1 publication (3 citation statements)
References 43 publications
“…We evaluate the generated adversarial examples from the following two aspects: the success rate of adversarial examples and the naturalness of adversarial examples. First, similar to the existing works [20] and [34], we calculate the success rate R s of adversarial examples as follows [20,34]:…”
Section: Methods
confidence: 99%
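The quoted statement refers to a success rate R_s for adversarial examples, but the formula itself is truncated in the excerpt. As an illustration only, a minimal sketch of the common definition (the fraction of adversarial examples on which the attack succeeds, i.e. the detector no longer finds the target); the function name and inputs are hypothetical, not taken from the cited works:

```python
# Hypothetical sketch of an attack success rate, assuming the common
# definition: R_s = (# adversarial examples that evade detection) / (# total).
# The exact formula used in the cited works [20, 34] is truncated in the
# excerpt above, so this is an assumption for illustration.

def success_rate(detected_flags):
    """detected_flags: list of booleans, one per adversarial example;
    True means the detector still finds the target (attack failed)."""
    if not detected_flags:
        return 0.0
    successes = sum(1 for detected in detected_flags if not detected)
    return successes / len(detected_flags)

# e.g. 3 of 4 adversarial examples evade the detector
print(success_rate([False, False, True, False]))  # → 0.75
```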
“…Recently, Huang et al [34] develop a Universal Physical Camouflage (UPC) attack for object detectors, which can attack all instances of the same target class (e.g., all cars in an input image) with the generated universal pattern. The above attack methods [22][23][24][25][34] add perturbations to the target objects to generate the adversarial examples, while some other works [35][36][37] can launch physical adversarial attacks without manipulating the target objects. Huang et al [35] craft an adversarial example that looks like an advertising signboard.…”
Section: Physical Adversarial Attacks
confidence: 99%