MobileNetV2: Inverted Residuals and Linear Bottlenecks
2018 · Preprint
DOI: 10.48550/arxiv.1801.04381

Cited by 444 publications (306 citation statements)
References 20 publications
“…For classification, three pre-trained CNN models were used: ResNet18 [35], MobileNet_V2 [36], and EfficientNet_B1 [37]; for semantic segmentation of the hand region and background removal, three CNN models were used: DenseNet201 Feature Pyramid Networks (FPN) [38], U-Net [39], and M-UNet [40].…”
Section: Classification and Segmentation Models (mentioning)
Confidence: 99%
“…[35] provided evidence of vanishing gradients and decreasing accuracy after saturation. MobileNet_V2 [36] was designed to replace expensive convolution networks with a cheaper network. Ref.…”
Section: Classification and Segmentation Models (mentioning)
Confidence: 99%
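The excerpt above notes that MobileNet_V2 replaces expensive convolution networks with a cheaper one. The core of that saving is the depthwise-separable factorization used throughout the MobileNet family; the following is an illustrative sketch of the parameter-count difference, with hypothetical layer shapes not taken from the paper:

```python
# Compare parameters of a standard k x k convolution with a
# depthwise-separable factorization (depthwise k x k + pointwise 1x1),
# the cost-reduction idea behind the MobileNet family.
# Shapes below are illustrative example values, not from the paper.

def standard_conv_params(c_in, c_out, k):
    # A standard convolution mixes space and channels in a single step.
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    # Depthwise: one k x k filter per input channel (c_in * k * k),
    # then a 1x1 pointwise convolution to mix channels (c_in * c_out).
    return c_in * k * k + c_in * c_out

c_in, c_out, k = 64, 128, 3
std = standard_conv_params(c_in, c_out, k)        # 73728 parameters
sep = depthwise_separable_params(c_in, c_out, k)  # 8768 parameters
print(f"standard: {std}, separable: {sep}, ratio: {std / sep:.1f}x")
```

For this example layer the separable form needs roughly 8x fewer parameters; the gap widens as the channel counts grow, since the `c_in * c_out * k * k` term dominates.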
“…Yolo v3 [47] (TensorFlow v2); TinyYolo v3 [47] (TensorFlow v2); MobileNet SSD v2 [48] (Coral TPU); Face recognition: VGGFace (Senet50) [14] (TensorFlow v1)…”
Section: Image Classification (unclassified)
“…State-of-the-art performance is achieved by designing wider and deeper CNNs [41,13,18]. However, the over-parameterization problem of CNNs prevents them from being applied to resource-limited devices, such as mobile phones and robots [39,32]. Many approaches have been proposed to reduce the computation and storage cost of CNNs, such as quantization [10], matrix decomposition [48], network pruning [11,24,46,44,15], and knowledge distillation [16].…”
Section: Introduction (mentioning)
Confidence: 99%
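Among the compression approaches the excerpt lists, network pruning is the simplest to sketch. The following is a minimal, illustrative magnitude-based pruning routine (shapes, seed, and sparsity level are arbitrary example values, not tied to any of the cited works):

```python
import numpy as np

def prune_by_magnitude(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights.

    Illustrative sketch of unstructured magnitude pruning; real pruning
    pipelines typically interleave this with fine-tuning to recover accuracy.
    """
    flat = np.abs(weights).ravel()
    k = int(flat.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))          # a hypothetical 8x8 weight matrix
pruned = prune_by_magnitude(w, 0.75)  # keep only the largest 25% of weights
print(f"nonzero fraction: {np.count_nonzero(pruned) / pruned.size:.2f}")
```

With continuously distributed weights (no ties at the threshold), exactly 75% of the entries are zeroed, so the remaining nonzero fraction is 0.25.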