2022
DOI: 10.48550/arxiv.2204.04219
Preprint
Towards Reliable and Explainable AI Model for Solid Pulmonary Nodule Diagnosis

Cited by 2 publications (2 citation statements)
References 0 publications
“…Moreover, XAI has been rigorously evaluated in diagnostic scenarios involving breast cancer [25], lung nodules [26], and brain tumors [27], among others. Across these evaluations, a diversity of XAI methodologies, including saliency maps, decision trees, and attribution maps, have been leveraged to illuminate the decision-making mechanisms of AI models.…”
Section: Related Work
confidence: 99%
“…While global methods aim at explaining the general reasoning of an AI model, local methods have the goal of explaining why an AI model gave a certain output for a particular instance, i.e., the data of a particular patient or the diagnostic images belonging to a specific individual. The application of local XAI methods addresses the demand for precise, case-by-case explanations, which are paramount for clinical decision making, enhancing patient care, and fostering trust in AI systems among healthcare providers and patients [6][7][8][9][10][11][12][13].…”
Section: Introduction
confidence: 99%
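The local-versus-global distinction above can be made concrete with a minimal sketch of a gradient-based saliency map, one of the local XAI methods named in the citation statements. The model, weights, and features below are purely illustrative toy values, not anything from the cited paper; for logistic regression the input gradient has the closed form p·(1 − p)·w.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x, w, b):
    # Toy logistic "diagnostic" model: probability of a positive finding.
    return sigmoid(np.dot(w, x) + b)

def saliency(x, w, b):
    # Local explanation for one instance: gradient of the output with
    # respect to the input features. For logistic regression this is
    # p * (1 - p) * w, so each feature's score shows how strongly a small
    # change in that feature would move this patient's prediction.
    p = predict(x, w, b)
    return p * (1.0 - p) * w

w = np.array([1.5, -0.8, 0.2])   # hypothetical feature weights
b = -0.1                          # hypothetical bias
x = np.array([0.9, 0.4, 0.3])    # one patient's (toy) feature values

p = predict(x, w, b)              # the model's output for this instance
s = saliency(x, w, b)             # per-feature attribution for this instance
print(p, s)
```

Because the saliency is computed at a specific `x`, it explains only that patient's prediction; a global method would instead summarize the model's behavior over the whole input space.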