2020
DOI: 10.3389/fpls.2020.00499

A Fast and Automatic Method for Leaf Vein Network Extraction and Vein Density Measurement Based on Object-Oriented Classification

Abstract: Rapidly determining leaf vein network patterns and vein densities is biologically important and technically challenging. Current methods, however, are limited to vein contour extraction. Further image processing is difficult, and some leaf vein traits of interest therefore cannot be quantified. In this study, we proposed a novel method for the fast and accurate determination of leaf vein network patterns and vein density. Nine tree species with different leaf characteristics and vein types were applied to veri…

Cited by 15 publications (8 citation statements)
References 49 publications
“…Because much of the discussion on leaf venation analysis has centered around computational methods of network extraction, most quantitative studies on leaf vein parameters have focused on leaves that are more straightforward in terms of vein visibility and leaf size, such as those of Diego and De Bacco (2021), Price et al (2011), and Zheng and Wang (2010). Often samples used are either whole leaves or one‐size‐fits‐all sections for all leaves, often with random sampling from various parts of the leaves (Larese et al, 2014; Zhu et al, 2020). The latter is problematic for comparative studies because venation features change across the surface of a leaf (Nardini et al, 2008; Zwieniecki et al, 2004).…”
Section: Discussion
confidence: 99%
“…Since the manual measurement of leaf veins is a time-consuming and labor-intensive task, it has been of great interest to automate the process. In recent years, automatic leaf vein segmentation has been performed with classic computer vision approaches [8], [9], [10], [11], traditional machine-learning approaches with hand-crafted feature extraction [12], [13], [14] as well as few deep learning approaches [15].…”
Section: Related Work (A. Leaf Vein Extraction in Laboratory Settings)
confidence: 99%
“…In plant phenotyping, segmentation of individual leaves and their venation has seen sparse attention. In general, existing approaches use (i) experimental methods to chemically clear the leaf lamina and stain the veins to highlight the venation against the background [23,24], (ii) image preprocessing by greyscaling, aggregating specific color channels, or spatial rescaling [23][24][25][26][27], (iii) global filters and morphological operations (e.g., Odd Gabor filters, Hessian matrices, vesselness filters, and region merging) to obtain binary segmentations [23,[25][26][27][28], (iv) ensembles of scales and models to make aggregate predictions [24,28], and (v) require hundreds of manually-annotated training samples to produce accurate segmentation models [24]. However, these commonly encountered steps can bottleneck the scalability and accuracy of image-based plant phenotyping at population scale.…”
Section: Introduction
confidence: 99%
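The Hessian-based "vesselness" filtering mentioned in the citation above (step iii) can be illustrated with a minimal single-scale, Frangi-style filter. This is a sketch of the general technique, not the pipeline of the cited paper or of any specific library; the function name `frangi_vesselness` and the parameter values (`sigma`, `beta`, `c`) are illustrative choices.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def frangi_vesselness(img, sigma=2.0, beta=0.5, c=0.05):
    """Single-scale Frangi-style ridge filter for bright, vein-like
    structures on a dark background (illustrative parameter defaults)."""
    img = img.astype(float)
    # Hessian components via Gaussian derivatives, scale-normalised by sigma^2
    Hrr = sigma**2 * gaussian_filter(img, sigma, order=(2, 0))
    Hcc = sigma**2 * gaussian_filter(img, sigma, order=(0, 2))
    Hrc = sigma**2 * gaussian_filter(img, sigma, order=(1, 1))
    # Eigenvalues of the 2x2 symmetric Hessian at every pixel
    tmp = np.sqrt((Hrr - Hcc) ** 2 + 4.0 * Hrc ** 2)
    mu1 = 0.5 * (Hrr + Hcc + tmp)
    mu2 = 0.5 * (Hrr + Hcc - tmp)
    # Order eigenvalues so that |l1| <= |l2|
    swap = np.abs(mu1) > np.abs(mu2)
    l1 = np.where(swap, mu2, mu1)
    l2 = np.where(swap, mu1, mu2)
    eps = 1e-10
    Rb = l1 / (l2 + eps)      # blob-vs-line measure (near 0 on a ridge)
    S = np.hypot(l1, l2)      # overall second-order structure strength
    v = np.exp(-Rb**2 / (2 * beta**2)) * (1.0 - np.exp(-S**2 / (2 * c**2)))
    v[l2 > 0] = 0.0           # bright ridges require l2 < 0
    return v
```

A synthetic bright line responds strongly while flat background does not; in practice such a filter is run at several `sigma` scales (matching the range of vein widths) and the per-pixel maximum is thresholded to obtain the binary segmentation the citation describes.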