2022
DOI: 10.1016/j.knosys.2022.109723
A novel compact design of convolutional layers with spatial transformation towards lower-rank representation for image classification

Cited by 4 publications (4 citation statements)
References 21 publications
“…To validate the effectiveness of our approach, we compare it with several recent low-rank tensor decomposition approaches, vector quantization approaches, scalar quantization approaches and network pruning methods. The low-rank tensor decomposition approaches include SVD [42], LCT [34], TDNR [43], HALOC [44], Maestro [45] and ELRT [46]. The vector quantization approaches include PQF [16] and BGD [47].…”
Section: Methods
confidence: 99%
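The SVD baseline mentioned in the quote above replaces a weight matrix with the product of two thin factors obtained from a truncated singular value decomposition. A minimal sketch of that idea (illustrative only; the function name and shapes are assumptions, not taken from the cited papers):

```python
import numpy as np

def low_rank_approx(W: np.ndarray, rank: int):
    """Factor W (m x n) into A (m x rank) @ B (rank x n) via truncated SVD."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]  # absorb singular values into the left factor
    B = Vt[:rank, :]
    return A, B

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 128))
A, B = low_rank_approx(W, rank=16)
# Parameter count drops from 64*128 = 8192 to 16*(64+128) = 3072.
print(W.size, A.size + B.size)
```

At `rank = min(m, n)` the product `A @ B` reconstructs `W` exactly; smaller ranks trade reconstruction error for fewer parameters, which is the compression knob these methods tune per layer.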
“…The authors of [33] showed that, with a suitable formulation, determining the optimal rank of each layer was amenable to a mixed discrete-continuous optimization jointly over the ranks and matrix elements. The authors of [34] proposed a novel compact design of convolutional layers with a spatial transformation towards a lower-rank representation, and they applied trainable spatial transformations to low-rank convolutional kernels in a predefined Tucker product form to enhance the versatility of convolutional kernels. These approaches can improve the identification accuracy of a compressed network to some extent.…”
Section: Related Work
confidence: 99%
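The "predefined Tucker product form" described in the quote above can be illustrated with a Tucker-2 factorization of a 4-D convolution kernel: a small core tensor contracted with input- and output-channel factor matrices. This is a generic sketch of that factorization (shapes, rank choices, and the helper name are assumptions, not the exact construction of [34], and the trainable spatial transformation is omitted):

```python
import numpy as np

def tucker2_kernel(core, U_out, U_in):
    """Reconstruct a (Cout, Cin, k, k) conv kernel from a Tucker-2 form:
    core (r1, r2, k, k), U_out (Cout, r1), U_in (Cin, r2)."""
    return np.einsum('oa,ib,abkl->oikl', U_out, U_in, core)

Cout, Cin, k, r1, r2 = 64, 64, 3, 8, 8
rng = np.random.default_rng(0)
core = rng.standard_normal((r1, r2, k, k))
U_out = rng.standard_normal((Cout, r1))
U_in = rng.standard_normal((Cin, r2))

W = tucker2_kernel(core, U_out, U_in)
full = Cout * Cin * k * k                      # 36864 parameters
compact = core.size + U_out.size + U_in.size   # 1600 parameters
print(W.shape, full, compact)
```

Only the factored components are stored and trained; the full kernel is materialized (or the convolution is applied in stages) at inference time, which is where the parameter and FLOP savings come from.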
“…To validate the advancement and effectiveness of the TR-BO method, TR-BO results are compared with seven state-of-the-art compression methods (i.e., TRN [12], TR-RL [8], PSTRN [5], TRP [13], LC [14], LCT [15], LCCUR [16]) and show good performance on two datasets with two networks, as shown in Tables 2 and 3 (bold represents best performance). Compared to the PSTRN and TR-RL methods, the proposed TR-BO method achieves optimal performance in terms of accuracy, number of parameters, and training time at any compression ratio.…”
Section: Image Classification
confidence: 99%