2007
DOI: 10.1109/icdar.2007.4376994
Arabic Handwriting Recognition Using Variable Duration HMM

Cited by 19 publications (9 citation statements)
References 5 publications
“…The approach combines density-based and contour-based features extracted from sliding windows. In [12], each character is modeled as a single state of a Variable Duration Hidden Markov Model, whose variable duration allows one character model to span multiple segments.…”
Section: Introduction (mentioning; confidence: 99%)
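The variable-duration idea quoted above can be sketched as an explicit-duration HMM decoder: each character state carries a distribution over how many sliding-window frames it may occupy. The following is a minimal illustrative sketch, not the cited paper's implementation; all probabilities and the function name are assumptions.

```python
import numpy as np

# Minimal sketch of an explicit-duration ("variable duration") HMM decoder.
# Each character is one state; its duration distribution models how many
# sliding-window frames (segments) the character may span.
# All quantities are illustrative, not taken from the cited paper.

def viterbi_duration(log_emit, log_trans, log_dur, max_dur):
    """log_emit: (T, S) per-frame log emission scores for each state,
    log_trans: (S, S) state-transition log probabilities,
    log_dur:   (S, max_dur) log prob that a state lasts d+1 frames.
    Returns the best log score over all duration-segmented state paths."""
    T, S = log_emit.shape
    # cum[t, s] = summed emission log score of state s over frames 0..t-1
    cum = np.vstack([np.zeros(S), np.cumsum(log_emit, axis=0)])
    delta = np.full((T + 1, S), -np.inf)
    delta[0] = 0.0  # any state may start the word (start probs omitted)
    for t in range(1, T + 1):
        for s in range(S):
            for d in range(1, min(max_dur, t) + 1):
                seg = cum[t, s] - cum[t - d, s]  # emit d frames in state s
                if t - d == 0:
                    prev = 0.0
                else:
                    prev = np.max(delta[t - d] + log_trans[:, s])
                score = prev + log_dur[s, d - 1] + seg
                if score > delta[t, s]:
                    delta[t, s] = score
    return np.max(delta[T])
```

With a single state, two frames, and a uniform duration distribution over one or two frames, the decoder prefers the single two-frame segment (one duration penalty instead of two).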
“…Certain approaches have been proposed to alleviate the strict dependency of segmentation-based HWR on correctly subdividing words into the characters they consist of. As one example, explicit over-segmentation is performed (cf., e.g., [70]) and, based on some alignment technique (e.g., dynamic programming), the optimal "segmentation path" through the word to be recognized is extracted [44,58,91,110]. Alternatively, multiple segmentation solutions are generated by variants of the segmentation technique and the "best" solution w.r.t.…”
Section: Segmentation-Free Versus Segmentation-Based Recognition (mentioning; confidence: 99%)
“…It is, however, quite cumbersome to define a coding of the inherently numeric feature representations of offline handwriting data into a symbol set. Therefore, today only very few approaches still make use of discrete HMMs operating on either discretely modeled distributions in feature space [70] or a hand-crafted symbolic coding of the data [3]. Very rarely, discrete symbols are combined with continuous attributes for output modeling [127].…”
Section: Modeling Output Behavior (mentioning; confidence: 99%)
“…Classification is applied, after the pre-processing step, to the resulting feature values using an Error Back Propagation Artificial Neural Network (EBPANN); this process yields the recognition results, from which the success rate of the method is determined [12]. The tests of our neural network consist of an image-processing phase, a feature-vector-extraction phase, and three network phases: building the network structure, running the error-back-propagation algorithm, and running the trained network. Each character is represented as an 18x18 array of binary pixels, with 0 representing "off" and 1 representing "on".…”
Section: Classification (mentioning; confidence: 99%)
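The pipeline quoted above, an 18x18 binary character image flattened into a feature vector and fed through a back-propagation-trained network, can be sketched as a forward pass. This is an illustrative sketch only: the cited EBPANN's layer sizes and weights are not given, so the 32-unit hidden layer and 10 output classes below are assumptions.

```python
import numpy as np

# Illustrative sketch (not the cited EBPANN implementation): an 18x18
# binary character image is flattened to a 324-dimensional feature vector
# and passed through a small feedforward network. Training would adjust
# the weights by error back-propagation; here they are random placeholders.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(image, w1, b1, w2, b2):
    """image: (18, 18) array of 0/1 pixels -> per-class score vector."""
    x = image.reshape(-1).astype(float)   # 324-d feature vector
    h = sigmoid(w1 @ x + b1)              # hidden layer
    return sigmoid(w2 @ h + b2)           # output layer, one unit per class

# Assumed architecture: 324 inputs -> 32 hidden units -> 10 classes
w1, b1 = rng.normal(0, 0.1, (32, 324)), np.zeros(32)
w2, b2 = rng.normal(0, 0.1, (10, 32)), np.zeros(10)
img = (rng.random((18, 18)) > 0.5).astype(int)  # toy binary character image
scores = forward(img, w1, b1, w2, b2)           # sigmoid scores in (0, 1)
```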