2020 | Preprint
DOI: 10.48550/arxiv.2008.10309

LC-NAS: Latency Constrained Neural Architecture Search for Point Cloud Networks

Guohao Li,
Mengmeng Xu,
Silvio Giancola
et al.

Abstract: Point cloud architecture design has become a crucial problem for 3D deep learning. Several efforts exist to manually design architectures with high accuracy on point cloud tasks such as classification, segmentation, and detection. Recent progress in automatic Neural Architecture Search (NAS) minimizes human effort in network design and finds high-performing architectures. However, these efforts fail to consider important factors such as latency during inference. Latency is of high importance in time cr…
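To make the title's "latency constrained" search concrete, here is a minimal sketch of a soft-constraint formulation consistent with the regularization-term description quoted in the citation statements below; the predicted-latency function \widehat{\mathrm{Lat}}, budget T, and weight \lambda are notation introduced here for illustration, not taken from the paper:

```latex
\min_{\alpha}\; \mathcal{L}_{\mathrm{val}}\!\left(w^{*}(\alpha), \alpha\right)
  \;+\; \lambda \,\max\!\left(0,\; \widehat{\mathrm{Lat}}(\alpha) - T\right)
\qquad \text{s.t.} \qquad
w^{*}(\alpha) \;=\; \arg\min_{w}\; \mathcal{L}_{\mathrm{train}}(w, \alpha)
```

The max term is zero while the predicted latency stays within the budget T, so the bilevel search behaves like standard differentiable NAS until an architecture drifts over budget.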

Cited by 4 publications (6 citation statements) | References 45 publications (91 reference statements)

Citation statements, ordered by relevance:
“…There are also works on 3D shape classification [27,19], but their overall frameworks do not exceed that set by [25]. [47,20] is closer to our work, in the sense that it uses NAS to optimize for segmentation and detection on 3D scenes (KITTI [13]). But generalizing the terminology used in [23], we believe there is also a two-level hierarchy in 3D neural architecture designs, with the outer macro-level controlling the views of the data / features, and the inner micro-level being the specifics of the neural layers.…”
Section: Neural Architecture Search
confidence: 86%
“…For point-based methods, SGAS [169] explores a DARTS [187]-like network topology with graph convolution layers as candidate operations in the search space. Built upon SGAS, LC-NAS [170] further incorporates the hardware feedback into the pipeline by training a differentiable latency regressor to predict the network latency on target platforms. The predicted latency acts as a regularization term and soft constraint during training.…”
Section: Efficient Point Cloud Processing
confidence: 99%
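The latency-regularized objective this statement describes can be sketched in PyTorch as follows. This is a minimal sketch, not the authors' implementation: the MLP regressor shape, the flattened architecture-parameter encoding, and the weight lambda_lat are assumptions made here for illustration.

```python
import torch
import torch.nn as nn

class LatencyRegressor(nn.Module):
    """Small MLP mapping an architecture encoding to predicted latency (ms).

    Assumed to be trained offline on (architecture, measured-latency) pairs
    collected from the target platform, then frozen during the search.
    """
    def __init__(self, enc_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(enc_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, arch_encoding: torch.Tensor) -> torch.Tensor:
        return self.net(arch_encoding).squeeze(-1)

def search_loss(task_loss, alpha, regressor, target_ms, lambda_lat=0.1):
    """DARTS-style task loss plus latency as a soft constraint.

    alpha: differentiable architecture parameters (e.g. softmax weights over
    candidate ops). The penalty is active only above the latency budget.
    """
    predicted_ms = regressor(alpha.flatten().unsqueeze(0)).squeeze(0)
    overshoot = torch.relu(predicted_ms - target_ms)  # zero within budget
    return task_loss + lambda_lat * overshoot
```

Because the regressor is differentiable, the latency penalty backpropagates into the architecture parameters, steering the search toward architectures that fit the budget rather than filtering them after the fact.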
“…Here, #P denotes the number of parameters, #M denotes the number of MACs, and L denotes the measured latency (on a single NVIDIA GTX 1080 Ti GPU). * : numbers are from Li et al. [86], which are measured on a single NVIDIA RTX 2080 GPU.…”
Section: ShapeNet
confidence: 99%
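Since this table reports measured GPU latency, a short sketch of how such a number is typically obtained may help; the input shape and iteration counts below are illustrative assumptions, not the cited papers' exact protocol.

```python
import torch

@torch.no_grad()
def measure_latency_ms(model, input_shape=(1, 3, 2048), warmup=20, iters=100):
    """Average single-batch inference latency on the current CUDA device.

    CUDA kernels launch asynchronously, so timing uses CUDA events plus an
    explicit synchronize; warm-up iterations amortize allocator and
    autotuning overhead before measurement starts.
    """
    device = torch.device("cuda")
    model = model.to(device).eval()
    x = torch.randn(*input_shape, device=device)  # e.g. (batch, channels, points)
    for _ in range(warmup):
        model(x)
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    torch.cuda.synchronize()
    start.record()
    for _ in range(iters):
        model(x)
    end.record()
    torch.cuda.synchronize()
    return start.elapsed_time(end) / iters  # milliseconds per forward pass
```

Differences in this protocol (batch size, point count, GPU model) are exactly why the table flags numbers measured on a different GPU with an asterisk.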
“…We compare PVCNN with state-of-the-art pointbased methods including PointNet [3], PointNet++ [4], Deep LPN [83], SpiderCNN [18], PointCNN [6] and ResGCN [84]. We also compare PVNAS with automatically-designed point cloud segmentation networks including SGAS [85] and LC-NAS [86]. To ensure fair comparisons, we follow the original experiment setting in Mo et al [78] to train separate models for different object classes.…”
Section: PartNet
confidence: 99%