2020
DOI: 10.48550/arxiv.2010.11782
Preprint

Adversarial Attacks on Binary Image Recognition Systems

Abstract: We initiate the study of adversarial attacks on models for binary (i.e. black and white) image classification. Although there has been a great deal of work on attacking models for colored and grayscale images, little is known about attacks on models for binary images. Models trained to classify binary images are used in text recognition applications such as check processing, license plate recognition, invoice processing, and many others. In contrast to colored and grayscale images, the search space of attacks …

Cited by 1 publication (1 citation statement)
References 24 publications
“…Despite the growing number of Spiking Neural Network deployed on digital (Davies et al, 2018 ) and analog (Moradi et al, 2018 ) neuromorphic hardware, robustness to adversarial perturbations has received comparatively little attention by the research community. Some methods proposed for attacking binary inputs have focused on brute-force searches with heuristics to reduce the search space (Bagheri et al, 2018 ; Balkanski et al, 2020 ). Algorithms of this family do not scale well to large input sizes, as the number of queries made to the network grows exponentially.…”
Section: Related Work (mentioning)
confidence: 99%
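
To illustrate the family of query-based, search-style attacks the citing paper refers to, below is a minimal sketch (not the algorithm of Balkanski et al., 2020, nor of Bagheri et al., 2018) of a greedy black-box pixel-flip attack on a binary image classifier. The `predict_proba` query interface, the flip budget, and the greedy scoring rule are illustrative assumptions; the point is only to show why the number of model queries grows with the input size.

```python
# Sketch of a greedy black-box pixel-flip attack on a binary (0/1) image
# classifier. This is an illustrative example, not the method of the cited
# papers. `predict_proba` is an assumed query-only interface returning a
# vector of class scores. Each round queries the model once per pixel, which
# is why search-based attacks of this kind become costly on large inputs.
import numpy as np

def greedy_flip_attack(image, predict_proba, true_label, max_flips=20):
    """image: 2-D numpy array with values in {0, 1}."""
    adv = image.copy()
    for _ in range(max_flips):
        base = predict_proba(adv)[true_label]
        best_drop, best_pos = 0.0, None
        # Score every single-pixel flip (one model query each).
        for pos in np.ndindex(adv.shape):
            candidate = adv.copy()
            candidate[pos] = 1 - candidate[pos]
            drop = base - predict_proba(candidate)[true_label]
            if drop > best_drop:
                best_drop, best_pos = drop, pos
        if best_pos is None:
            break  # no single flip lowers the true-class score; give up
        adv[best_pos] = 1 - adv[best_pos]
        if np.argmax(predict_proba(adv)) != true_label:
            break  # misclassification achieved
    return adv
```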