2024
DOI: 10.3389/fmed.2024.1349373

Explainable AI-driven model for gastrointestinal cancer classification

Faisal Binzagr

Abstract: Although AI-assisted cancer cell detection has been shown to be highly effective, several obstacles remain to its adoption in clinical settings. These issues stem mostly from the inability to identify the underlying decision processes. Because AI-assisted diagnosis does not offer a transparent decision-making process, clinicians remain skeptical of it. Here, the advent of Explainable Artificial Intelligence (XAI), which provides explanations for prediction models, addresses the AI black…
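The visible abstract does not state which explanation technique the paper uses, so the following is only a minimal, hypothetical sketch of one common XAI approach (Grad-CAM) applied to a placeholder CNN classifier for gastrointestinal images; the backbone, class count, and layer choice are all assumptions, not the author's method.

```python
# Minimal Grad-CAM sketch (illustrative only; not the paper's confirmed method).
# Assumes a torchvision ResNet-18 backbone and a hypothetical 3-class GI head.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=None)                 # placeholder backbone
model.fc = torch.nn.Linear(model.fc.in_features, 3)   # hypothetical 3 GI classes
model.eval()

activations, gradients = {}, {}

def fwd_hook(module, inp, out):
    activations["value"] = out                         # feature maps of last conv block

def bwd_hook(module, grad_in, grad_out):
    gradients["value"] = grad_out[0]                   # gradients w.r.t. those maps

# Hook the last convolutional stage, whose spatial maps Grad-CAM weights.
model.layer4.register_forward_hook(fwd_hook)
model.layer4.register_full_backward_hook(bwd_hook)

def grad_cam(image, target_class=None):
    """Return an H x W heatmap in [0, 1] for one 3x224x224 image tensor."""
    logits = model(image.unsqueeze(0))
    if target_class is None:
        target_class = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, target_class].backward()

    acts = activations["value"]                        # (1, C, h, w)
    grads = gradients["value"]                         # (1, C, h, w)
    weights = grads.mean(dim=(2, 3), keepdim=True)     # global-average-pooled gradients
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[1:], mode="bilinear", align_corners=False)
    cam = cam.squeeze()
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)

# Usage: heatmap = grad_cam(torch.rand(3, 224, 224)); overlay it on the input image
# to show which regions drove the predicted class.
```

Overlaying such a heatmap on the input image is one way an XAI layer can expose the model's decision-making process to clinicians, which is the concern the abstract raises.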

Cited by 0 publications
References 62 publications