2022
DOI: 10.1016/s2589-7500(22)00063-2
AI recognition of patient race in medical imaging: a modelling study


Cited by 173 publications (120 citation statements)
References 35 publications
“…These discussion points should be placed in the context of algorithmic bias and ethical concerns, which can place undue risk on both the clinician stakeholder and underrepresented patient subgroups. For instance, several previous medical AI studies have identified patient demographic features such as self-reported race as potential model confounders (74). In other cases, models may underdiagnose historically underserved racial groups at a higher rate (a popular example is skin-lesion image classifiers that screen for melanoma) (75–79).…”
Section: Discussion (mentioning, confidence: 99%)
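The underdiagnosis pattern described in the statement above is usually quantified by comparing error rates across demographic groups. The sketch below is a minimal, hedged illustration of that check; the labels, predictions, and group assignments are synthetic placeholders, not data from any of the cited studies.

```python
# Hedged sketch: per-group false-negative rate ("missed diagnosis" rate).
# All labels, predictions, and group assignments are synthetic placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 10_000

df = pd.DataFrame({
    "y_true": rng.integers(0, 2, n),                       # 1 = disease present
    "y_pred": rng.integers(0, 2, n),                       # model's binary decision
    "group": rng.choice(["A", "B"], size=n, p=[0.8, 0.2])  # demographic subgroup
})

def false_negative_rate(g: pd.DataFrame) -> float:
    """Share of truly positive cases the model misses within one group."""
    positives = g[g["y_true"] == 1]
    return float((positives["y_pred"] == 0).mean())

print(df.groupby("group").apply(false_negative_rate).round(3))
# A consistently higher rate for one group is the underdiagnosis disparity
# the citation refers to; with this random data both rates sit near 0.5.
```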
“…When the demographics of such databases do not match those of the target population, the trained model may be biased, presenting lower performance in the underrepresented groups (11). Indeed, in chest X-ray pathology classification, only a few of the major available datasets in that domain include information about race/ethnicity, and where this information is included, databases tend to be skewed in terms of those attributes (26).…”
Section: Three Reasons Behind Biased Systems: Data, Models and People (mentioning, confidence: 99%)
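A common way to surface the representation-driven performance gap described above is to stratify a standard metric by the protected attribute. The snippet below is a minimal sketch with synthetic scores and a hypothetical, imbalanced group column; it is not code from the cited work.

```python
# Hedged sketch: AUC stratified by demographic group on an evaluation set.
# Labels, scores, and group proportions are synthetic placeholders.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5_000

df = pd.DataFrame({
    "y_true": rng.integers(0, 2, n),
    "y_score": rng.random(n),
    "group": rng.choice(["A", "B", "C"], size=n, p=[0.7, 0.2, 0.1]),
})

overall_auc = roc_auc_score(df["y_true"], df["y_score"])
per_group_auc = df.groupby("group").apply(
    lambda g: roc_auc_score(g["y_true"], g["y_score"])
)

print(f"overall AUC: {overall_auc:.3f}")
print(per_group_auc.round(3))
# The aggregate number can look acceptable while the smallest group lags;
# reporting only the overall AUC hides exactly that gap.
```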
“…This fact, combined with experimental studies suggesting that race/ethnicity imbalance in MI databases may be one of the reasons behind unequal performance (11), calls for action towards building truly international databases which include patients from low-income countries. This issue becomes even more relevant in light of recent findings confirming that AI can trivially predict protected attributes from medical images even in settings where clinical experts cannot, such as race/ethnicity in chest X-rays (26) and ancestry in histologic images (43). While this fact by itself does not immediately mean that systems will be biased, in combination with a greedy optimization scheme in a setting with strong data imbalance it may provide a direct vector for the reproduction of pre-existing racial disparities.…”
Section: Comment, Nature Communications (mentioning, confidence: 99%)
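The claim that protected attributes are "trivially" recoverable from images is typically tested with a probe: train a simple classifier to predict the attribute from an imaging model's embeddings and check whether it beats chance. The sketch below uses synthetic 512-dimensional embeddings with the attribute deliberately leaked into a few dimensions; the encoder, dimensionality, and leak are illustrative assumptions, not the pipeline of the cited paper.

```python
# Hedged sketch: linear probe for a protected attribute on image embeddings.
# Embeddings are synthetic; the "leak" is injected so the probe can succeed.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, d = 4_000, 512                        # hypothetical encoder output size

attr = rng.integers(0, 2, n)             # binary protected attribute
X = rng.normal(size=(n, d))
X[:, :5] += 0.5 * attr[:, None]          # attribute weakly encoded in 5 dims

X_tr, X_te, a_tr, a_te = train_test_split(X, attr, test_size=0.3, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_tr, a_tr)
auc = roc_auc_score(a_te, probe.predict_proba(X_te)[:, 1])

print(f"probe AUC for the protected attribute: {auc:.3f}")
# An AUC well above 0.5 means the representation encodes the attribute even
# if no human reader could see it in the image, echoing the citation above.
```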
“…Additionally, substantial data bias may lead to unforeseen disparities in patient care, as AI may stratify patients into unintended subgroups. Gichoya et al [33] observed that chest x-ray AI models can predict a patient's race from image features that physicians were unaware of. The implication is that bias is unavoidable even when looking at data that appears agnostic, such as chest x-rays.…”
Section: Limitations of Ethical Challenges (mentioning, confidence: 99%)