2021
DOI: 10.1007/s00521-020-05612-0

A limited-size ensemble of homogeneous CNN/LSTMs for high-performance word classification

Abstract: The strength of long short-term memory neural networks (LSTMs) lies more in handling sequences of variable length than in handling the geometric variability of image patterns. In this paper, an end-to-end convolutional LSTM neural network is used to handle both geometric variation and sequence variability. The best results for LSTMs are often based on large-scale training of an ensemble of network instances. We show that high performances can be reached on a common benchmark set by…
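The abstract describes combining an ensemble of independently trained CNN-LSTM instances. A minimal sketch of one common way to combine ensemble members for word classification, averaging their per-class probabilities, is shown below; the function name, array shapes, and values are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np

def ensemble_predict(member_probs: np.ndarray) -> np.ndarray:
    """Combine softmax outputs from several ensemble members.

    member_probs: array of shape (n_members, n_samples, n_classes),
    one softmax distribution per member per sample.
    Returns the predicted class index for each sample after averaging
    the members' probability estimates.
    """
    mean_probs = member_probs.mean(axis=0)   # (n_samples, n_classes)
    return mean_probs.argmax(axis=1)         # (n_samples,)

# Example: 5 ensemble members, 2 samples, 3 word classes.
probs = np.array([
    [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]],
    [[0.5, 0.4, 0.1], [0.1, 0.6, 0.3]],
    [[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]],
    [[0.4, 0.5, 0.1], [0.2, 0.5, 0.3]],
    [[0.6, 0.3, 0.1], [0.1, 0.7, 0.2]],
])
print(ensemble_predict(probs))  # → [0 1]
```

Probability averaging is only one aggregation rule; majority voting over each member's argmax is another common choice and behaves similarly when members agree.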

Cited by 9 publications (31 citation statements). References 75 publications (131 reference statements).
“…The network is summarized in Table II. This setup showed good performance in the earlier work [18]. Moreover, the ensemble does not need handcrafted features or extensive network-architecture engineering efforts.…”
Section: B Recognizer
confidence: 77%
“…In this section, first, we explain four label-coding schemes. Then, we use the ensemble of five CNN-LSTMs suggested in [18] to evaluate their effect. Such networks can handle sequence variability and geometric variation.…”
Section: Methods
confidence: 99%