2019
DOI: 10.48550/arxiv.1906.04423
Preprint

NAS-FCOS: Fast Neural Architecture Search for Object Detection

Abstract: The success of deep neural networks relies on significant architecture engineering. Recently, neural architecture search (NAS) has emerged as a promising way to greatly reduce manual effort in network design by automatically searching for optimal architectures, although such algorithms typically need an excessive amount of computational resources, e.g., a few thousand GPU-days. To date, on challenging vision tasks such as object detection, NAS, especially fast versions of NAS, is less studied. Here we propose to searc…
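To make the idea in the abstract concrete, below is a minimal, illustrative sketch of architecture search by random sampling over a decoder-style search space. This is not the paper's method: NAS-FCOS uses a trained controller and a purpose-built FCOS decoder search space, and the search space, operation names, and proxy reward below are all hypothetical stand-ins.

```python
import random

# Hypothetical decoder search space; the operation names, widths, and
# depths here are illustrative, not those used by NAS-FCOS.
SEARCH_SPACE = {
    "op": ["conv3x3", "sep_conv3x3", "sep_conv5x5", "skip"],
    "width": [64, 128, 256],
    "depth": [1, 2, 3, 4],
}

def sample_architecture(rng):
    """Draw one candidate decoder configuration at random."""
    return {key: rng.choice(values) for key, values in SEARCH_SPACE.items()}

def proxy_reward(arch):
    """Stand-in for the expensive step (training the candidate and
    measuring detection AP on a proxy task). This toy score merely
    rewards deeper/wider candidates; it is not from the paper."""
    score = arch["depth"] * 0.1 + arch["width"] / 1000
    if arch["op"] == "skip":
        score *= 0.5  # toy penalty on trivial operations
    return score

def random_search(num_trials=100, seed=0):
    """Evaluate num_trials random candidates; keep the best."""
    rng = random.Random(seed)
    best_arch, best_reward = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        reward = proxy_reward(arch)
        if reward > best_reward:
            best_arch, best_reward = arch, reward
    return best_arch, best_reward

if __name__ == "__main__":
    arch, reward = random_search()
    print(arch, reward)
```

The "few thousand GPU-days" cost the abstract mentions comes from evaluating each sampled candidate by actually training it; fast NAS methods like NAS-FCOS attack exactly that evaluation step.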

Cited by 15 publications (22 citation statements)
References 25 publications
“…Even the automatic learning based method, a.k.a. NAS-FCOS [36], performs 1.6 lower than the baseline. This phenomenon highlights the domain gap between general object and face detection.…”
Section: Introduction (mentioning)
confidence: 93%
“…4.1.1 AutoFA. In order to address the limitation of previous NAS-based FPN [10, 36, 37] when applied to the face domain, the above analysis motivates us to design a module that aggregates a feature from similar-scale features instead of directly using those with large differences in scale.…”
Section: Search Space of AutoFA and AutoFE (mentioning)
confidence: 99%