2023
DOI: 10.3389/fpls.2023.1108355
Self-supervised maize kernel classification and segmentation for embryo identification

Abstract: Introduction: Computer vision and deep learning (DL) techniques have succeeded in a wide range of diverse fields. Recently, these techniques have been successfully deployed in plant science applications to address food security, productivity, and environmental sustainability problems for a growing global population. However, training these DL models often necessitates large-scale manual annotation of data, which frequently becomes a tedious and time- and resource-intensive process. Recent advances in self-sup…

Cited by 2 publications (1 citation statement)
References 51 publications (53 reference statements)
“…trained the CNN model that resulted in 94.39% accuracy and 97.07% sensitivity. It seems that the CNN model can be further trained to achieve optimum accuracy by adding network layers and enriching the input data (images) with various R1-nj expressions worldwide. Dong et al. (2023) provided the advantages of self-supervised learning (SSL) methods in training deep learning models, which often require extensive manual annotation of data.…”
mentioning
confidence: 99%