2022
DOI: 10.1007/978-3-031-18292-1_3
Explainable Artificial Intelligence (XAI): Conception, Visualization and Assessment Approaches Towards Amenable XAI

Cited by 4 publications (3 citation statements)
References 17 publications
“…Two explainable AI-based cardiac disease prediction experiments are compared. This comparison can help AI beginners choose the best techniques [26]. In another study, deep learning models in electronic health records (EHRs) are examined, along with interpretability in medical AI systems [27].…”
Section: Stroke Recognition
confidence: 99%
“…There are some approaches to making deep learning and other AI techniques more explainable. Feature visualization 7 : The techniques are used to understand specific image patterns or features the AI model focuses on to arrive at its predictions. Heatmaps or saliency maps highlight the regions of interest within an image contributing to the AI model's decision. Grad‐CAM : Gradient‐weighted class activation mapping 8 (Grad‐CAM) is a technique that can generate visual explanations by analyzing the gradients of the target class concerning the convolutional feature maps of a deep learning model.…”
Section: Introduction
confidence: 99%
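The Grad-CAM procedure quoted above — weighting a layer's convolutional feature maps by the gradients of the target class score — can be sketched framework-agnostically. The snippet below is a minimal NumPy illustration with synthetic activations and gradients (in practice both would come from a trained CNN's backward pass); all array shapes here are illustrative assumptions:

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap.

    feature_maps: (C, H, W) activations of a convolutional layer.
    gradients:    (C, H, W) gradients of the target class score
                  with respect to those activations.
    """
    # Channel weights: global-average-pool the gradients per channel.
    weights = gradients.mean(axis=(1, 2))                      # shape (C,)
    # Weighted sum of feature maps across channels, then ReLU,
    # keeping only regions that positively influence the class.
    cam = np.maximum((weights[:, None, None] * feature_maps).sum(axis=0), 0.0)
    # Normalize to [0, 1] so the map can be rendered as a heatmap.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Synthetic example: 4 channels of 8x8 activations and gradients.
rng = np.random.default_rng(0)
fmaps = rng.random((4, 8, 8))
grads = rng.random((4, 8, 8))
heatmap = grad_cam(fmaps, grads)
print(heatmap.shape)  # (8, 8)
```

The resulting (H, W) map would be upsampled to the input image size and overlaid as the saliency heatmap described in the excerpt.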
“…1. Feature visualization 7 : The techniques are used to understand specific image patterns or features the AI model focuses on to arrive at its predictions. Heatmaps or saliency maps highlight the regions of interest within an image contributing to the AI model's decision.…”
Section: Introduction
confidence: 99%