2018
DOI: 10.1007/s12559-018-9598-1

Multi-View CNN Feature Aggregation with ELM Auto-Encoder for 3D Shape Recognition

Cited by 46 publications (18 citation statements)
References 22 publications
“…Finally, the data were trained and classified with an ELM. The authors in [96] presented a computationally efficient method for image recognition. A new structure combining multi-view CNNs with an ELM-AE was developed, which unites the advantages of the deep VGG-19 architecture with the robust representation of ELM-AE features and the fast ELM classifier.…”
Section: Pre-trained CNN in Other Application Domains for Feature Extraction and ELM for Fast Learning
confidence: 99%
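The pipeline this statement describes lends itself to a short illustration. Below is a minimal NumPy sketch, not the authors' implementation: stand-ins for per-view VGG-19 features are max-pooled across views, re-encoded by an ELM auto-encoder, and classified by a fast ELM trained in closed form. All shapes, hidden sizes, and helper names (elm_autoencoder, elm_classifier_fit) are assumptions for illustration.

import numpy as np

# Hypothetical sketch (not the cited authors' code): per-view CNN features
# are max-pooled across views, re-encoded by an ELM auto-encoder, and
# classified by a fast ELM solved in closed form.
rng = np.random.default_rng(0)

def elm_autoencoder(X, n_hidden, reg=1e-3):
    # Random hidden mapping; ridge-regularized least squares gives output
    # weights beta that reconstruct X, and X @ beta.T is the encoding.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return X @ beta.T

def elm_classifier_fit(X, Y, n_hidden, reg=1e-3):
    # One closed-form solve replaces iterative back-propagation.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

feats = rng.standard_normal((100, 12, 4096))   # stand-in: 100 shapes x 12 views x VGG-19 fc features
agg = feats.max(axis=1)                        # element-wise max view pooling
enc = elm_autoencoder(agg, n_hidden=512)       # compact ELM-AE representation
labels = np.eye(10)[rng.integers(0, 10, 100)]  # one-hot class targets
W, b, beta = elm_classifier_fit(enc, labels, n_hidden=1024)
scores = np.tanh(enc @ W + b) @ beta           # class scores for prediction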
“…ResNet's output at each block is the sum of the signal produced by the two preceding convolutional layers and the signal carried directly, via an identity shortcut, from the point before those layers. We found several works that used ResNet for transfer learning, with weights pre-trained on the ILSVRC dataset, in conjunction with ELM: [31], [115], [100], [66], [96].…”
Section: Pre-trained CNN in Other Application Domains for Feature Extraction and ELM for Fast Learning
confidence: 99%
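The skip connection described in this statement is easy to show in code. Below is a minimal PyTorch sketch of a residual block, assuming 3x3 convolutions and equal input/output channel counts; ResidualBlock is an illustrative name, not taken from any of the cited works.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    # Output = two stacked convolutions applied to x, plus x itself,
    # carried unchanged through the identity shortcut.
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))  # first convolutional layer
        out = self.bn2(self.conv2(out))        # second convolutional layer
        return F.relu(out + x)                 # sum with the shortcut signal

x = torch.randn(1, 64, 32, 32)
y = ResidualBlock(64)(x)                       # same shape as the input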
“…Nowadays, ELM-family methods are applied in various scenarios, including regression and classification, accelerating computation significantly while maintaining convincing performance [15]-[17], [29], [54]. The major driving force is that ELM omits back-propagation entirely, in contrast to traditional deep learning methods.…”
Section: B. Extreme Learning Machine (ELM)
confidence: 99%
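The speed argument in this statement comes down to a single closed-form solve. Below is a minimal sketch, assuming one random hidden layer with tanh activation and one-hot targets; the sizes are arbitrary.

import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 30))             # training inputs
Y = np.eye(4)[rng.integers(0, 4, 200)]         # one-hot targets

W = rng.standard_normal((30, 100))             # random input weights, never updated
b = rng.standard_normal(100)
H = np.tanh(X @ W + b)                         # hidden-layer activations
beta = np.linalg.pinv(H) @ Y                   # single Moore-Penrose solve; no back-propagation
scores = H @ beta                              # training predictions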
“…Self-supervised learning: First of all, the bias between mixed classification performance on "old" classes and "novel" classes is largely alleviated by the use of self-supervision: the discrepancy, i.e., the classification-accuracy gap between "old" and "novel" classes, is 8.90% for the proposed method and 12.54% for variant (7), both lower than the 18.96% of variant (1). Secondly, it is observed that self-supervised learning consistently boosts performance on both clustering and mixed classification for "novel" classes, as seen by comparing variant (7) and the proposed method with variant (1).…”
Section: Ablation Study
confidence: 99%
“…5.5 Ablation study: Self-supervised learning. For variant (7), the pretraining and fine-tuning schemes in [81] are followed.…”
confidence: 99%