Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3544548.3581075
Ignore, Trust, or Negotiate: Understanding Clinician Acceptance of AI-Based Treatment Recommendations in Health Care

Cited by 35 publications (5 citation statements)
References 65 publications
“…The most important estimand during evaluation studies is the performance of the augmented clinical workflow as a whole, rather than the isolated system’s technical performance ( 49 ). Augmented workflows can cause unanticipated harm ( 50 ) and will only be effective if system recommendations are trusted and actioned by the end user; therefore, a thorough evaluation of safety ( 51 ), interpretability ( 52 ), and acceptability ( 53 ) is essential. The unforeseen poor performance of current sepsis decision support systems ( 54 ) emphasizes the importance of post-deployment surveillance ( 55 ), including surveillance of how system predictions are operationalized, which may change after initial trials.…”
Section: Current Approaches: Promise and Pitfalls
confidence: 99%
“…Additionally, it is important to ensure that AI-based tools are designed to support rather than replace human clinicians, and that AI systems do not perpetuate existing biases or increase health disparities through technology [67]. Ghosh et al. [68] examined the difficulties of using AI in healthcare and stated that AI tools need to be developed to help medical practitioners make informed decisions.…”
Section: Challenges
confidence: 99%
“…This might diminish the possibilities for patients and healthcare professionals to open up broader discussions and options for breast cancer‐related health care. On the other hand, it has been shown that predictive tools are not always immediately trusted and that the decisions are negotiated in a broader health care setting (Sivaraman et al., 2023). In any case, the biopolitical power here gets distributed across the human and algorithmic nexus.…”
Section: Vignette: NHS Predict Breast Cancer Prediction Tool
confidence: 99%