2017
DOI: 10.1145/2996859
Extracting Maya Glyphs from Degraded Ancient Documents via Image Segmentation

Abstract: We present a system for automatically extracting hieroglyph strokes from images of degraded ancient Maya codices. Our system adopts a region-based image segmentation framework. Multi-resolution super-pixels are first extracted to represent each image. An SVM classifier labels each super-pixel region with the probability of belonging to foreground glyph strokes. Pixel-wise probability maps from multiple super-pixel resolution scales are then aggregated to cope with varying stroke widths and background noise…
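The pipeline the abstract describes (multi-resolution super-pixels, SVM labeling of regions, aggregation of pixel-wise probability maps across scales) can be sketched roughly as follows. This is a toy stand-in, not the authors' implementation: square grid cells replace real super-pixels (e.g. SLIC), and the two region features (mean and standard deviation of intensity) are invented for illustration.

```python
import numpy as np
from sklearn.svm import SVC

def grid_regions(shape, cell):
    """Label map that partitions an image into square `cell`-sized regions."""
    h, w = shape
    nx = int(np.ceil(w / cell))
    ys, xs = np.indices((h, w))
    return (ys // cell) * nx + (xs // cell)

def region_features(img, labels):
    """Per-region feature vectors: [mean intensity, intensity std]."""
    return np.array([[img[labels == r].mean(), img[labels == r].std()]
                     for r in np.unique(labels)])

def glyph_probability_map(img, clf, scales=(4, 8)):
    """Average per-pixel foreground probabilities over several region scales."""
    acc = np.zeros(img.shape)
    for cell in scales:
        labels = grid_regions(img.shape, cell)
        probs = clf.predict_proba(region_features(img, labels))[:, 1]
        acc += probs[labels]  # grid labels are contiguous 0..n-1
    return acc / len(scales)

# Synthetic training data: dark regions = glyph strokes, light = background.
rng = np.random.default_rng(0)
fg = np.column_stack([rng.normal(0.1, 0.03, 10), rng.normal(0.03, 0.01, 10)])
bg = np.column_stack([rng.normal(0.9, 0.03, 10), rng.normal(0.03, 0.01, 10)])
clf = SVC(probability=True, random_state=0).fit(
    np.vstack([fg, bg]), np.array([1] * 10 + [0] * 10))

# Synthetic test image: a dark "stroke" block on a light page.
img = np.full((16, 16), 0.9)
img[4:12, 4:12] = 0.1
pmap = glyph_probability_map(img, clf)  # aggregated foreground probabilities
mask = pmap > 0.5                       # a simple threshold yields a stroke mask
```

Averaging probability maps over several region scales is what lets the approach handle both thin and thick strokes: a fine grid resolves narrow strokes, while a coarse grid smooths over background noise.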

Cited by 8 publications (4 citation statements); References 37 publications.
“…Extensive efforts have been made in this area, which can be broadly categorized into three main methods: handcrafted feature-based approaches, machine learning techniques, and deep learning methods. Image processing techniques have diverse applications in (i) Cultural Heritage (Hurtut et al 16 , Makridis and Daras 17 , Can et al 18 , Hu et al 19 ); and (ii) architectural heritage (Shalunts et al 20 , Mathias et al 21 , Chu and Tsai 22 , Goel et al 23 , Oses and Dornaika 24 , Zhang et al 25 , Xu et al 26 , Amato et al 27 ). These works highlight the versatility and significance of image processing in preserving and analyzing both cultural artifacts and architectural structures.…”
Section: Related Work
confidence: 99%
“…The SIGNIFICANCE project goes beyond this by integrating AI for proactive detection and prevention. Color and texture analysis 17 : basic techniques, but not directly applicable to illegal-traffic monitoring; SIGNIFICANCE uses advanced AI techniques for comprehensive analysis. Shape representations 18 , 19 : insightful for shape representation, but less focused on illicit traffic; SIGNIFICANCE integrates these concepts with deep learning for improved artefact identification. Architectural style and local features 20 – 26 : focus on style classification, not illicit-traffic monitoring.…”
Section: Related Work
confidence: 99%
“…Rui Hu et al. [7] propose a method to extract Maya glyphs from images of degraded ancient codices using region-based segmentation. Multi-resolution super-pixels are extracted and classified as foreground or background with an SVM classifier.…”
Section: Segmentation Using Traditional Algorithms
confidence: 99%
“…Can et al studied visual analysis of Maya glyphs using both handcrafted and data-driven shape representations in a bag-of-words-based pipeline [15]. Similarly, Hu et al proposed a system for automatic extraction of hieroglyph strokes from images of degraded ancient Maya codices via a region-based image segmentation framework [40]. According to their experimental results, automatically extracted glyph strokes achieved comparable retrieval results to those obtained using glyphs manually segmented by epigraphers.…”
Section: Automated Processing of Images from Heritage Sites
confidence: 99%