2020
DOI: 10.1167/tvst.9.2.4
Factors in Color Fundus Photographs That Can Be Used by Humans to Determine Sex of Individuals

Abstract: Purpose: Artificial intelligence (AI) can identify the sex of an individual from color fundus photographs (CFPs). However, the mechanism(s) involved in this identification has not been determined. This study was conducted to determine the information in CFPs that can be used to determine the sex of an individual. Methods: Prospective observational cross-sectional study of 112 eyes of 112 healthy volunteers. The following characteristics of CFPs were analyzed: the color of the peripapillary area expressed by the mea…

Cited by 30 publications (32 citation statements)
References 45 publications
“…We recently constructed a model to discriminate sex from the clinical parameters of CFP known by humans, such as the angle or trajectory of the retinal vessels [18–23], the location and shape of the optic disc [24,25], and the color intensity of the peripapillary area [26–32], albeit using binomial regression with L2 regularization (Ridge regression) [33,34], in adults [35]. As a result, this allowed us to at least partially reproduce sex discrimination by deep learning, with an AUC value of 77.9%.…”
Section: Introduction
mentioning
confidence: 99%
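The modeling approach described in the statement above, L2-regularized (ridge-style) binomial regression on human-interpretable fundus features evaluated by AUC, can be sketched as follows. This is a minimal illustration only: the feature names, file name, and hyperparameters are assumptions for the sketch, not the authors' actual pipeline or data.

```python
# Minimal sketch: sex discrimination from human-interpretable CFP features
# using L2-regularized (ridge-style) logistic regression, summarized by AUC.
# Feature names, file name, and hyperparameters are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

# Hypothetical table: one row per eye, clinical CFP parameters plus a sex label.
df = pd.read_csv("cfp_features.csv")
feature_cols = [
    "optic_disc_ovality_ratio",
    "papillomacular_angle",
    "retinal_artery_trajectory",
    "retinal_vessel_angle",
    "peripapillary_mean_red",
    "peripapillary_mean_green",
    "peripapillary_mean_blue",
]
X = df[feature_cols].values
y = df["sex"].values  # e.g., 0 = female, 1 = male

# Binomial (logistic) regression with an L2 penalty is the ridge-style model
# referred to in the citation statement above.
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l2", C=1.0, max_iter=1000),
)

# Cross-validated class probabilities, summarized as an AUC.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
probs = cross_val_predict(model, X, y, cv=cv, method="predict_proba")[:, 1]
print(f"Cross-validated AUC: {roc_auc_score(y, probs):.3f}")
```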
“…We were intrigued by the findings of Yamashita et al in their recent paper "Factors in Color Fundus Photographs That Can Be Used by Humans to Determine Sex of Individuals." 1 As mentioned by the authors, the publication by Poplin et al in 2018 captivated the interest of many ophthalmologists when the authors proposed a deep neural network for sex differentiation on color fundus photography (CFP) with an area under the curve (AUC) of 97%. 2 However, to date, the results have yet to be replicated, the neural network model was not shared for independent validation nor explained in detail, and the published saliency maps did not highlight any specific features except for the fovea and the optic disc.…”
mentioning
confidence: 99%
“…The features identified by the authors were the optic disc ovality ratio, papillomacular angle, retinal artery trajectory, retinal vessel angles, and the mean red, green, and blue colors of the peripapillary area expressed by a tessellation index. 1 In order to evaluate AI-based approaches for sex determination on CFP, we have performed meticulous analyses, including the reproducibility of the results from Poplin et al, the evaluation of a deep learning (DL) approach for assessing the informative relevance of previously known, human-interpretable features compared to unknown/unstructured image information, and, finally, assessment of human performance against that of AI.…”
mentioning
confidence: 99%
“…We thank Dr Dieck and co-authors 1 for the comments on our paper. 2 They performed an important experiment: ophthalmologists were told in advance about some of the features that differ between the fundus images of men and women and then judged the sex of each eye from its fundus photograph.…”
mentioning
confidence: 99%