2018
DOI: 10.48550/arxiv.1806.10332
Preprint
MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning

Cited by 59 publications (62 citation statements). References 7 publications.
“…However, these methods are usually very slow and require huge resources for training. Other studies [8,12,26] also attempt to optimize multiple objectives like model size and accuracy. Nevertheless, their search process optimizes only on small tasks like CIFAR.…”
Section: Related Work (mentioning)
confidence: 99%
“…Perceptible metrics, such as latency and energy consumption, are subject to specification requirements and often more expressive and constrained than accuracy. Therefore, multi-objective optimization became more prominent and has been incorporated into the evaluation strategy [12,22]. However, while acknowledging the importance of perceptible metrics, these NAS designs use proxies like operations after having verified a linear relationship between them [5,32].…”
Section: Related Work (mentioning)
confidence: 99%
“…Hardware-aware NAS: Earlier NAS methods focused on maximizing accuracy under FLOPs constraints [22,25], but low FLOP count does not necessarily translate to hardware efficiency [10,18]. More recent methods incorporate hardware terms (e.g., runtime, power) into cell-based NAS formulations [10,12], but cell-based implementations are not hardware friendly [21]. Breaking away from cell-based assumptions in the search space encoding, recent work employs NAS over a generalized MobileNetV2-based design space introduced in [20].…”
Section: Related Work (mentioning)
confidence: 99%