2018
DOI: 10.1007/s10032-017-0293-7

Efficient document image binarization using heterogeneous computing and parameter tuning

Abstract: In the context of historical document analysis, image binarization is a first important step, which separates foreground from background, despite common image degradations, such as faded ink, stains, or bleed-through. Fast binarization has great significance when analyzing vast archives of document images, since even small inefficiencies can quickly accumulate to years of wasted execution time. Therefore, efficient binarization is especially relevant to companies and government institutions, who want to analyz…

Cited by 12 publications (9 citation statements) | References 49 publications
“…MoG mixture of Gaussians, CRF conditional random field, ML machine learning, SVM support vector machine, FCN fully convolutional network, LSTM long short-term memory.
Chattopadhyay [23] | Supervised tuning | SVM classifier chooses best binarization algorithm
Xiong [176] | Supervised tuning | SVM chooses global threshold for image blocks
Westphal [170] | Supervised tuning | Predicts parameters for Howe [50]
Messaoud [87] | Supervised tuning | Detects 4 types of image noise to choose parameters
Ntirogiannis [111] | Unsuper. tuning | Estimates local window sizes based on stroke width
Boiangiu [15] | Unsuper.…”
Section: Otsu
Mentioning confidence: 99%
“…A similar approach was used in [176], but instead a global threshold is predicted for each block. Westphal et al. [170] replace the unsupervised parameter tuning of Howe [50] with a supervised approach and report improved performance.…”
Section: Supervised
Mentioning confidence: 99%
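The block-wise supervised thresholding described in the quote above can be sketched roughly as follows. The per-block features, the regression target, and the synthetic training data are illustrative assumptions for the sketch, not the actual setup of the cited work:

```python
# Hedged sketch: split a grey-level image into blocks, extract simple
# per-block statistics, and let a support vector regressor predict a
# threshold per block. Features and training targets are placeholders.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

def block_features(block):
    """Simple per-block statistics used as regression features (assumed)."""
    return [block.mean(), block.std(), block.min(), block.max()]

# Synthetic training data: 32x32 grey-level blocks with a placeholder
# ground-truth threshold per block (here simply the mean intensity;
# in practice this would come from annotated or cleanly binarized data).
train_blocks = [rng.integers(0, 256, (32, 32)) for _ in range(100)]
X_train = np.array([block_features(b) for b in train_blocks])
y_train = X_train[:, 0]

svr = SVR(kernel="rbf").fit(X_train, y_train)

# Binarize a new block using its predicted global threshold.
block = rng.integers(0, 256, (32, 32))
t = svr.predict([block_features(block)])[0]
binary = (block > t).astype(np.uint8)
print(binary.shape)
```

The key design point is that each block gets its own threshold, so the method adapts to local degradations while keeping the per-block decision as cheap as one regression evaluation.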
“…Since the extraction of TP-R and G-R requires binarized input images, we binarize all word images using Howe's binarization algorithm [12]. We predict the required binarization parameters using the approach by Westphal et al [28], which has been trained on the dataset of the 2013 document image binarization contest [17]. After visual inspection of one image, we adjust one of the predicted parameters for the Alvermann Konzilsprotokolle dataset and the George Washington dataset to increase the recall.…”
Section: Experiments Setup
Mentioning confidence: 99%
“…In order to estimate the parameters of Howe's method in a timely fashion, Westphal et al. [29] proposed to predict them using multivariate regression based on four image features. Random forests were then used to learn a mapping from these features to suitable parameters for a given image.…”
Section: Related Work
Mentioning confidence: 99%
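A minimal sketch of the feature-to-parameter regression described above, assuming hypothetical image features and parameter targets; the four actual features and the specific binarization parameters used in the cited work may differ:

```python
# Hedged sketch: multi-output random-forest regression mapping four
# global image features to binarization parameters, in the spirit of
# the supervised parameter prediction described in the quote. All
# feature names and targets here are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Four global features per training image (e.g. mean intensity,
# intensity std, a stroke-width estimate, a background-noise estimate).
X_train = rng.random((200, 4))
# Targets: two suitable parameters for the downstream binarizer per image.
y_train = rng.random((200, 2))

# A single multi-output random forest learns features -> parameters.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# At inference time, extract the same four features from a new image
# and predict its parameters before running the binarizer once.
x_new = rng.random((1, 4))
params = model.predict(x_new)
print(params.shape)  # one parameter vector per input image
```

Compared with unsupervised tuning, which typically re-runs the binarizer many times per image to search the parameter space, this reduces parameter selection to one feature extraction plus one forest evaluation.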