Eighth International Conference on Document Analysis and Recognition (ICDAR'05) 2005
DOI: 10.1109/icdar.2005.177

Online and offline character recognition using alignment to prototypes


Cited by 11 publications (12 citation statements)
References 16 publications
“…DTW had been proved to be an efficient method to calculate distance in online handwriting recognition [3][4][5]. Here we select the simple k-Nearest Neighbor (NN) classifier to compare the recognition performance of two distance metrics: DTW and AI-DTW.…”
Section: Figure 2 Examples Of Rotated Testing Samples
confidence: 99%
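The excerpt above uses DTW as the distance measure for nearest-neighbor online handwriting recognition. A minimal sketch of DTW between two pen strokes, assuming each stroke is a list of (x, y) points (the function name and stroke representation are illustrative, not taken from the cited papers):

```python
# Hedged sketch: dynamic time warping (DTW) distance between two pen
# strokes, each given as a sequence of (x, y) points. Standard O(n*m)
# dynamic program with Euclidean point cost; no warping-window constraint.
def dtw(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    # D[i][j] = cost of best alignment of a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = ((a[i-1][0] - b[j-1][0]) ** 2
                    + (a[i-1][1] - b[j-1][1]) ** 2) ** 0.5
            D[i][j] = cost + min(D[i-1][j],    # skip a point of a
                                 D[i][j-1],    # skip a point of b
                                 D[i-1][j-1])  # match the two points
    return D[n][m]
```

A k-NN classifier as in the excerpt would then label a query stroke by the majority class among its k smallest `dtw` distances to the training strokes.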
“…The main drawback of using NN classifier is its low speed. However, one can reduce the computation time greatly by using learned prototypes [4]. Due to page limitation, we have to omit the detailed discussion on this.…”
Section: Figure 2 Examples Of Rotated Testing Samples
confidence: 99%
“…This process is often vital in order to construct reliable and discriminative models of classes that appear in distinctly different forms. For such reasons this type of clustering is a natural part of unsupervised training of model based online handwriting recognition classifiers independently of the recognition approach used [1,2,6,7,8]. The most common approaches here are k-Means based algorithms [3,7] and algorithms based on Hierarchical Agglomerative Clustering [1,2].…”
Section: Clustering
confidence: 99%
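The excerpt above names k-means as a common way to build class prototypes for model-based recognizers. A minimal sketch, assuming samples have already been converted to fixed-length feature tuples (e.g. resampled stroke coordinates); the function and its defaults are illustrative, not from the cited papers:

```python
# Hedged sketch: Lloyd-style k-means over fixed-length feature vectors,
# yielding k prototype centers. Empty clusters keep their previous center.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    dim = len(points[0])
    centers = list(rng.sample(points, k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign p to its nearest center (squared Euclidean distance)
            j = min(range(k),
                    key=lambda c: sum((p[d] - centers[c][d]) ** 2
                                      for d in range(dim)))
            clusters[j].append(p)
        # move each center to the mean of its assigned points
        centers = [tuple(sum(q[d] for q in clusters[c]) / len(clusters[c])
                         for d in range(dim))
                   if clusters[c] else centers[c]
                   for c in range(k)]
    return centers
```

Hierarchical agglomerative clustering, the other approach the excerpt mentions, would instead start with one cluster per sample and repeatedly merge the closest pair, which avoids fixing k in advance.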
“…Each weak classifier is a classifier F̃_i, where F_i is a 1D embedding. In [56], [57], [58], we have described alternative families of weak classifiers that can be used within the context of this algorithm.…”
Section: Choosing the Next Weak Classifier And Weight
confidence: 99%
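The excerpt above builds weak classifiers from 1D embeddings, as in BoostMap-style embedding methods. A minimal sketch under one common construction, assuming F_r(x) = d(x, r) for some reference object r (the helper names and the reference-object choice are assumptions, not taken from the cited papers): given a triple (q, a, b), the weak classifier predicts whether a or b is closer to q by comparing embedded distances.

```python
# Hedged sketch: a 1D "reference object" embedding and the weak classifier
# it induces on triples (query q, candidates a and b).
def make_embedding(r, d):
    # F_r maps an object to its distance from the reference object r
    return lambda x: d(x, r)

def weak_classify(F, q, a, b):
    # +1 if a looks closer to q than b does in the embedded 1D space, else -1
    return 1 if abs(F(q) - F(a)) < abs(F(q) - F(b)) else -1
```

A boosting loop would then pick, at each round, the embedding whose weak classifier has the lowest weighted triple-classification error, and combine the chosen F_i into a weighted multidimensional embedding.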