2022
DOI: 10.1109/access.2022.3214312
RoHNAS: A Neural Architecture Search Framework With Conjoint Optimization for Adversarial Robustness and Hardware Efficiency of Convolutional and Capsule Networks

Abstract: Neural Architecture Search (NAS) algorithms aim at finding efficient Deep Neural Network (DNN) architectures for a given application under given system constraints. DNNs are computationally complex as well as vulnerable to adversarial attacks. In order to address multiple design objectives, we propose RoHNAS, a novel NAS framework that jointly optimizes for adversarial robustness and hardware efficiency of DNNs executed on specialized hardware accelerators. Besides the traditional convolutional DNNs, RoHNAS addi…
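
The abstract describes a conjoint (multi-objective) optimization of adversarial robustness and hardware efficiency. The sketch below is not the paper's implementation; the Candidate fields, the chosen objectives, and Pareto dominance as the selection criterion are illustrative assumptions showing how such a joint objective can be expressed for candidate architectures.

```python
# Hypothetical sketch of a conjoint objective for robustness-aware, hardware-aware NAS.
# All names and metrics are assumptions for illustration, not the RoHNAS code.
from dataclasses import dataclass

@dataclass
class Candidate:
    architecture: dict        # encoded layer/capsule configuration
    robust_accuracy: float    # accuracy under an adversarial perturbation budget
    latency_ms: float         # estimated latency on the target accelerator
    energy_mj: float          # estimated energy per inference

def conjoint_objectives(c: Candidate) -> tuple:
    """Objective vector for multi-objective selection:
    maximize robustness, minimize latency and energy."""
    return (c.robust_accuracy, -c.latency_ms, -c.energy_mj)

def dominates(a: Candidate, b: Candidate) -> bool:
    """Pareto dominance: a is at least as good as b in every objective
    and strictly better in at least one."""
    oa, ob = conjoint_objectives(a), conjoint_objectives(b)
    return all(x >= y for x, y in zip(oa, ob)) and any(x > y for x, y in zip(oa, ob))
```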

Cited by 11 publications (15 citation statements). References: 44 publications.
“…In addition, the macro architecture of the same network type also varies. For example, NASCaps [26] changed its macro architecture to allow cells to be defined. Ref.…”
Section: Architecture Search Space
confidence: 99%
“…Finally, the fitness of the offspring is assessed and the new generation is updated by adding the optimal mutations to the population. Some researchers have used evolutionary algorithms to integrate hardware constraints into NAS algorithms [26,33,42,43].…”
Section: Evolutionary Algorithm
confidence: 99%
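
The citation above summarizes the evolutionary loop: offspring are generated by mutation, their fitness is assessed, and the best individuals update the population. The following is a minimal sketch of such a loop with a hardware constraint folded into the fitness; evaluate_fitness, mutate, and the latency budget are hypothetical placeholders, not code from any of the cited frameworks.

```python
# Minimal evolutionary NAS loop with a hardware (latency) constraint.
# evaluate_fitness(arch) -> (accuracy, latency_ms) and mutate(arch) -> arch
# are assumed, user-supplied helpers.
import random

def evolutionary_nas(population, evaluate_fitness, mutate,
                     latency_budget_ms=10.0, generations=20, offspring_per_gen=8):
    for _ in range(generations):
        # Produce offspring by mutating randomly chosen parents.
        offspring = [mutate(random.choice(population)) for _ in range(offspring_per_gen)]

        # Assess fitness; architectures exceeding the hardware budget are penalized.
        scored = []
        for arch in population + offspring:
            accuracy, latency_ms = evaluate_fitness(arch)
            penalty = max(0.0, latency_ms - latency_budget_ms)
            scored.append((accuracy - penalty, arch))

        # Keep the fittest individuals as the new generation.
        scored.sort(key=lambda pair: pair[0], reverse=True)
        population = [arch for _, arch in scored[:len(population)]]
    return population
```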
“…Thus, researchers have put considerable effort into developing effective neural architecture search methods to help automate the design of DNNs. Early work on neural architecture search adopted frameworks like reinforcement learning [10,12,17,18], evolutionary algorithms [9,11], and Bayesian optimization [2]. Because these methods operate on a discrete search space and need to perform many trials while searching for an optimal architecture in an exponentially-increasing hyperparameter space, they require thousands of GPU-hours to find optimal DNN architectures, which greatly limits their applicability.…”
Section: Related Work
confidence: 99%
“…The early versions of neural architecture search frameworks have resorted to reinforcement learning [10,12,17,18], evolutionary algorithms [9,11], and Bayesian optimization [2] to search for optimal DNN architectures. Unfortunately, these methods have time and space complexities that increase combinatorially with the number of options that are defined in the search space, requiring excessive amounts of computational resources.…”
Section: Introduction
confidence: 99%