2023
DOI: 10.1371/journal.pone.0279088

Expectations and attitudes towards medical artificial intelligence: A qualitative study in the field of stroke

Abstract: Introduction: Artificial intelligence (AI) has the potential to transform clinical decision-making as we know it. Powered by sophisticated machine learning algorithms, clinical decision support systems (CDSS) can generate unprecedented amounts of predictive information about individuals’ health. Yet, despite the potential of these systems to promote proactive decision-making and improve health outcomes, their utility and impact remain poorly understood due to their still rare application in clinical practice. T…

Cited by 25 publications (27 citation statements)
References 68 publications
“…Despite the promising results of previous studies in accurately predicting diagnosis, the black-box nature of these models poses a challenge for their adoption in clinical settings [18], as it can be challenging to comprehend the reasoning behind the model's predictions. This transparency is essential as it involves both the acknowledgment of AI usage and understanding how AI arrives at its conclusions or classifications [19].…”
Section: Models Explainability (citation type: mentioning)
Confidence: 99%
“…Ironically, a misconception about “replacing doctors with AI” exists among contemporary health care practitioners, despite familiarity with its scope, application, capability, and limitation for years. The trend of disruptive integration of emerging information technologies into contemporary health care settings worldwide has led to fears of dehumanization 27,28 or depersonalization. 29,30 We have evidenced doctors using search engines to look up facts 31,32 and now acknowledging the use of AI chatbots, 33,34 such as Chat Generative Pretrained Transformer (ChatGPT).…”
Section: Health Care Providers (citation type: mentioning)
Confidence: 99%
“…However, due to differing priorities and worldviews, most stakeholders appear unfamiliar with technological innovations and applications in a broader health care context. Therefore, disruptive technical integration into contemporary health care settings could create a fear of dehumanization 27,28 or depersonalization, 29,30 despite knowing the strengths and limits of such technologies.…”
Section: Health Care Providers (citation type: mentioning)
Confidence: 99%
“…However, the introduction of AI technologies, capable of making rapid and precise decisions based on data analysis, has led to apprehensions about the possible displacement of human healthcare professionals (Amann et al., 2023; Mousavi Baigi et al., 2023). The potential for AI to replace or significantly alter the roles traditionally held by medical practitioners raises numerous concerns within the healthcare community (Aquino et al., 2023; O’Connor et al., 2023).…”
Section: Introduction (citation type: mentioning)
Confidence: 99%