2019 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2019.8803093
MLSNet: Resource-Efficient Adaptive Inference with Multi-Level Segmentation Networks

Cited by 3 publications (5 citation statements). References 9 publications.
“…[35] proposed a DNN model-based manufacturing inspection system for smart industries, in which edge devices handle data collection and serve as the first exit point for model inference, while the cloud data center serves as the second exit point. Bolukbasi et al. [36] introduced an adaptive trade-off between DNN model accuracy and inference latency, reducing latency and computational cost by exiting at early layers of the model. Leroux et al. [37] presented a new cascaded-network architecture that reduces computational cost with an early-exit mechanism in the network's recycling phase.…”
Section: Dutta et al. [25]
confidence: 99%
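The two-exit edge/cloud scheme described above can be sketched as a simple confidence-gated cascade. This is a minimal illustrative sketch, not the cited system: the models, threshold value, and function names (`edge_model`, `cloud_model`, `infer`) are all hypothetical stand-ins.

```python
# Hypothetical sketch of a two-exit edge/cloud inference cascade:
# a cheap edge model answers when it is confident enough; otherwise
# the input is forwarded to a larger cloud-side model.

CONFIDENCE_THRESHOLD = 0.9  # assumed tunable exit threshold


def edge_model(x):
    """Cheap first-exit classifier: returns (label, confidence).

    Stand-in logic: pretend small inputs are 'easy' and classified
    with high confidence, while large inputs leave the edge unsure.
    """
    return ("easy" if x < 0.5 else "hard", 0.95 if x < 0.5 else 0.6)


def cloud_model(x):
    """Expensive second-exit classifier: always produces an answer."""
    return ("easy" if x < 0.5 else "hard", 0.99)


def infer(x):
    """Run the cascade; report which exit point answered."""
    label, conf = edge_model(x)       # first exit point (edge device)
    if conf >= CONFIDENCE_THRESHOLD:
        return label, "edge"
    label, conf = cloud_model(x)      # second exit point (cloud)
    return label, "cloud"
```

Under this sketch, only inputs the edge model is unsure about pay the latency and bandwidth cost of the cloud round trip, which is the trade-off the cited works exploit.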
“…Adaptive Early-Exit Inference. Prior studies of the adaptive early-exit inference technique have mainly focused on the architecture design (Panda, Sengupta, and Roy 2016; Teerapittayanon, McDanel, and Kung 2016; Stamoulis et al. 2018; Teerapittayanon, McDanel, and Kung 2017; Wu et al. 2020; Huang, Lai, and Chen 2017; Goetschalckx et al. 2018; Cheng et al. 2019; Yokoo, Iizuka, and Fukui 2019; Zhang et al. 2019; Aketi, Panda, and Roy 2020; Leroux et al. 2015; Yuan et al. 2019) and training (Phuong and Lampert 2019; Li et al. 2019b; Hu et al. 2020) of early-exit models. In particular, for architecture design, most pioneering works (Leroux et al. 2015; Panda, Sengupta, and Roy 2016; Teerapittayanon, McDanel, and Kung 2016) design their architectures by moderately modifying backbone architectures that have a single exit point.…”
Section: Related Work
confidence: 99%
“…For instance, BranchyNet (Teerapittayanon, McDanel, and Kung 2016) introduces additional branch classifiers at certain intermediate layers of a backbone architecture, thereby obtaining an architecture with multiple exit points. More recently, more advanced designs have been proposed (Zhang et al. 2019; Yokoo, Iizuka, and Fukui 2019), in which computation reuse is maximized and interference between exit points during training is reduced. For example, Huang et al. designed a two-dimensional multi-scale network architecture that improves the accuracy of early-exit classifiers by maintaining coarse-level features throughout the network; Zhang et al. (Zhang et al. 2019) proposed a scalable neural-network framework that uses attention modules and knowledge distillation to learn good early-exit classifiers.…”
Section: Related Work
confidence: 99%
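The BranchyNet-style multi-exit loop described above can be sketched with an entropy threshold: inference stops at the first branch classifier whose output distribution is confident enough. This is a hedged sketch only; the branch outputs below are hard-coded stand-ins rather than a real network, and the threshold value is an assumption.

```python
import math

def entropy(probs):
    """Shannon entropy (in nats) of a probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)


def early_exit_predict(branch_outputs, threshold=0.5):
    """Return (argmax label, exit index) at the first confident branch.

    Mirrors the BranchyNet decision rule: exit as soon as a branch
    classifier's prediction entropy falls below the threshold;
    otherwise fall through to the final exit point.
    """
    for i, probs in enumerate(branch_outputs):
        if entropy(probs) < threshold:
            return max(range(len(probs)), key=probs.__getitem__), i
    probs = branch_outputs[-1]  # no branch was confident: final exit
    return max(range(len(probs)), key=probs.__getitem__), len(branch_outputs) - 1


# Illustrative branch outputs: the second branch is already confident,
# so the deeper (more expensive) exits are never evaluated in spirit.
branches = [
    [0.40, 0.35, 0.25],  # uncertain -> high entropy, keep going
    [0.90, 0.05, 0.05],  # confident -> low entropy, exit here
    [0.97, 0.02, 0.01],  # final exit, not reached for this input
]
```

Easy inputs thus exit at shallow branches while hard inputs traverse the full backbone, which is the accuracy/computation trade-off the cited architectures are designed around.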