2019
DOI: 10.1167/tvst.8.4.25
Automatic Classification of Anterior Chamber Angle Using Ultrasound Biomicroscopy and Deep Learning

Abstract: Purpose: To develop a software package for automated classification of the anterior chamber angle of the eye using ultrasound biomicroscopy. Methods: Ultrasound biomicroscopy images were collected, and the trabecular-iris angle was manually measured and classified into three categories: open angle, narrow angle, and angle closure. Inception v3 was used as the classifying convolutional neural network, and the algorithm was trained. Results: With a …
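As a rough illustration of the training setup described in the abstract, the sketch below fine-tunes Inception v3 for three-class angle grading. It assumes a TensorFlow/Keras environment, ImageNet initialization, and 299x299 inputs; the authors' exact preprocessing, hyperparameters, and data pipeline are not reproduced here.

import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # open angle, narrow angle, angle closure

# Pretrained Inception v3 backbone without its original classification head
base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, input_shape=(299, 299, 3))
base.trainable = False  # first train only the new classification head

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds / val_ds would be tf.data.Dataset objects of (image, label) pairs
# model.fit(train_ds, validation_data=val_ds, epochs=20)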

Cited by 19 publications (15 citation statements)
References 20 publications (25 reference statements)
“…Class activation maps (CAMs) 20 are a common method in which a heat map is generated by projecting the class-specific weights of the output classification layer back onto the feature maps of the last convolutional layer, thereby highlighting important regions for predicting a particular class. This method has been used in ophthalmic applications previously to confirm that a CNN's decision was based on the anterior chamber angle when categorizing angle closure, 21 on areas of OCT B-scans associated with various diagnoses 22 , 23 and with segmentation error, 24 and on areas of OCT en face images associated with the diagnosis of glaucoma. 25 There exist several variants of this method that build on the original CAM paper, 20 including Grad-CAM, 26 Guided Grad-CAM, 26 Guided Grad-CAM++, 27 and GAIN.…”
Section: Introduction (mentioning)
confidence: 99%
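The excerpt above describes the original CAM formulation: the class-specific weights of the final classification layer are projected back onto the last convolutional feature maps. Below is a minimal NumPy sketch of that projection; the array names and shapes are illustrative assumptions rather than the cited implementations.

import numpy as np

def class_activation_map(feature_maps, class_weights):
    # feature_maps: (H, W, C) activations from the last convolutional layer
    # class_weights: (C,) output-layer weights for the class of interest
    cam = np.tensordot(feature_maps, class_weights, axes=([2], [0]))  # weighted sum over channels -> (H, W)
    cam = np.maximum(cam, 0)       # keep regions that contribute positively to the class score
    if cam.max() > 0:
        cam = cam / cam.max()      # scale to [0, 1] so it can be rendered as a heat map
    return cam

In practice the resulting map is upsampled to the input resolution and overlaid on the image (here, the UBM scan) as a heat map.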
“…In summary, this algorithm offers a translational step toward generating meaningful UBM image analysis tools that may be clinically useful in pediatric anterior segment disease. The algorithm evaluates a novel depth of UBM focus (the lens-iris diaphragm, rather than the angle, which has previously been studied using deep learning) 13 and a novel patient population (pediatric subjects). Future tasks may build upon this first step as a foundation.…”
Section: Discussion (mentioning)
confidence: 99%
“…Previous studies have demonstrated the ability of convolutional neural networks to identify anterior segment structures in smaller numbers of UBM images with transfer learning. 13 However, these studies have primarily aimed to assess acute angle-closure glaucoma and glaucomatous changes in the adult anterior eye, and have not studied performance in the pediatric population, where techniques that improve performance on small datasets would have a significant benefit.…”
Section: Introduction (mentioning)
confidence: 99%
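The statement above points to transfer learning and techniques that improve performance on small datasets; one common complement to a frozen pretrained backbone is on-the-fly augmentation of the training images. The sketch below is a hypothetical Keras example, not taken from the cited studies, and the specific transforms and magnitudes are assumptions.

import tensorflow as tf
from tensorflow.keras import layers

augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),   # mirrored angle images remain anatomically plausible
    layers.RandomRotation(0.05),       # small rotations only
    layers.RandomZoom(0.1),
    layers.RandomContrast(0.1),        # ultrasound gain varies between acquisitions
])

# Applied on the fly during training, e.g.:
# train_ds = train_ds.map(lambda x, y: (augment(x, training=True), y))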
“…Automation can be further advanced with higher-level computer programming in languages such as Python [96]. AI applications of UBM image analysis have been published among adult subjects [97], leaving pediatric UBM image analysis using AI an understudied area of great potential.…”
Section: Future Directions (mentioning)
confidence: 99%