2019 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2019.8803348
Improving Document Binarization Via Adversarial Noise-Texture Augmentation

Cited by 26 publications (24 citation statements) · References 18 publications
“…Also, after the introduction of generative models, image binarization systems with generative models have been suggested [23]. For enhancing the performance of image binarization, Bhunia et al [24] and Zhao et al [25] develop binarization systems using a conditional GAN (cGAN).…”
Section: A. Image Binarization and Background Blurring Methods
confidence: 99%
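For context on the quoted statement, the conditional-GAN (cGAN) objective that such binarization systems build on can be sketched as below. This is a minimal illustration, not the cited papers' implementation: it only computes the standard cGAN loss terms from a discriminator's sigmoid outputs on real (input, ground-truth) pairs and on (input, generated binarization) pairs.

```python
import numpy as np

def cgan_losses(d_real, d_fake, eps=1e-8):
    """Standard cGAN loss terms from discriminator sigmoid outputs.

    d_real: D(x, y)    -- scores for (document image, ground-truth binary map)
    d_fake: D(x, G(x)) -- scores for (document image, generated binary map)
    Returns (discriminator_loss, generator_loss).
    """
    d_real = np.asarray(d_real, dtype=float)
    d_fake = np.asarray(d_fake, dtype=float)
    # Discriminator maximizes log D(x, y) + log(1 - D(x, G(x))),
    # i.e. minimizes the negative of that expectation.
    d_loss = -np.mean(np.log(d_real + eps) + np.log(1.0 - d_fake + eps))
    # Generator uses the non-saturating form: maximize log D(x, G(x)).
    g_loss = -np.mean(np.log(d_fake + eps))
    return d_loss, g_loss
```

A discriminator that scores real pairs near 1 and fake pairs near 0 yields a low discriminator loss and a high generator loss, which is what drives the generator to produce binarizations the discriminator cannot distinguish from ground truth.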
“…We can solve classification tasks, for example, identifying diseases in plants [10], and even group images based on its pixel contents for effective image retrieval from large databases, just as implemented in [11]. Image generation tasks like document binarization [12] are more advanced, and adversarial networks can be implemented. Super resolution falls into the category of image generation, as the output is also an image, but with a higher resolution.…”
Section: Related Work
confidence: 99%
“…Similarly, a GAN discriminator is used to improve a prediction model in [178]. Bhunia et al [14] proposed a unique approach to train a binarization network using unpaired training data (i.e., the grayscale and binary images do not correspond) and achieved an impressive 97.8% FM on DIBCO13. To reduce the amount of needed training data, Krantz and Westphal [68] proposed a clustering method to ensure only diverse data are labeled.…”
Section: Deep Neural Network
confidence: 99%
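The 97.8% FM figure quoted above refers to the F-measure commonly used to score document binarization on the DIBCO benchmarks. A minimal sketch of that metric (assuming binary maps where text pixels are 1) is:

```python
import numpy as np

def f_measure(pred, gt):
    """F-measure on binary maps: harmonic mean of precision and recall,
    where foreground (text) pixels are True/1."""
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    tp = np.logical_and(pred, gt).sum()          # correctly detected text pixels
    precision = tp / max(pred.sum(), 1)          # fraction of predicted text that is real
    recall = tp / max(gt.sum(), 1)               # fraction of real text that was found
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

Note this is the plain pixel-level FM; DIBCO evaluations also report variants such as pseudo-F-measure, which weight pixels differently.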
“…There is also limited labeled training data for deep models. Though some initial explorations around this issue have been made [14,55,68,165], there is still a large gap to address.…”
Section: Technical Challenges
confidence: 99%