Retinal optical coherence tomography (OCT) images are widely used in the diagnosis of ocular conditions. However, random shifts and orientation changes of the retinal layers in OCT B-scans lead to appearance variations across scans, which reduce the accuracy of algorithms applied in the analysis of OCT images. In this study, we propose a preprocessing step to compensate for these variations and align B-scans. First, by incorporating a total variation (TV) loss into the well-known Unet model, we propose a TV-Unet model that accurately detects the retinal pigment epithelium (RPE) layer in each B-scan. We then use the detected RPE layer in the alignment method to form a curvature curve and a reference line. A novel window-transferring-based alignment approach forces the curve points onto a straight line while preserving the shape and size of the pathological lesions. Since detection of the RPE layer is a crucial step in the proposed alignment method, we utilized various datasets to train and test the TV-Unet, yielding a multimodal, device-independent OCT image alignment method. The TV-Unet localizes the RPE layer in OCT images with low boundary error (maximum of 1.94 pixels) and high Dice coefficient (minimum of 0.98). Quantitative and qualitative results indicate that the proposed method can efficiently detect the RPE layer and align OCT images while preserving the structure and size of the retinal lesions (biomarkers) in the OCT scans.
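To make the two key ingredients concrete, the following is a minimal NumPy sketch: an anisotropic TV penalty of the kind that could regularize a segmentation probability map, and a simple column-wise shift that moves a detected RPE curve onto a flat reference line. The function names (`tv_loss`, `flatten_to_reference`), the loss weighting, and the use of plain column shifts are illustrative assumptions; the abstract's actual window-transferring alignment and training loss are more elaborate than this.

```python
import numpy as np

def tv_loss(prob_map):
    """Anisotropic total variation of a 2-D probability map:
    sum of absolute differences between vertically and horizontally
    adjacent pixels. Smooth, layer-like masks give small values."""
    dy = np.abs(np.diff(prob_map, axis=0)).sum()
    dx = np.abs(np.diff(prob_map, axis=1)).sum()
    return dy + dx

def combined_loss(seg_loss, prob_map, tv_weight=0.1):
    """Hypothetical total objective: a base segmentation loss
    (e.g. Dice or cross-entropy, computed elsewhere) plus a
    weighted TV regularizer on the predicted RPE map."""
    return seg_loss + tv_weight * tv_loss(prob_map)

def flatten_to_reference(bscan, rpe_rows, ref_row):
    """Shift each column of a B-scan so the detected RPE row
    lands on a straight reference line (simplified stand-in for
    the window-transferring alignment described in the abstract)."""
    out = np.empty_like(bscan)
    for c in range(bscan.shape[1]):
        out[:, c] = np.roll(bscan[:, c], ref_row - rpe_rows[c])
    return out
```

In this simplified view, each A-scan (column) is translated rigidly, so the relative shape and size of structures within a column are preserved; the paper's window-based transfer additionally handles tilted or curved regions without the wrap-around artifacts that `np.roll` would introduce at the image borders.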