Machine Learning, and Artificial Intelligence (AI) more broadly, have great immediate and future potential for transforming almost all aspects of medicine. However, in many applications, even outside medicine, a lack of transparency in AI systems has become increasingly problematic. This is particularly pronounced where users need to interpret the output of AI systems. Explainable AI (XAI) provides a rationale that allows users to understand why a system has produced a given output, so that the output can be interpreted within a given context. One area in great need of XAI is that of Clinical Decision Support Systems (CDSSs). These systems support medical practitioners in their clinical decision-making, and in the absence of explainability may lead to under- or over-reliance. Providing explanations for how recommendations are arrived at will allow practitioners to make more nuanced, and in some cases, life-saving decisions. The need for XAI in CDSSs, and in the medical field in general, is amplified by the need for ethical and fair decision-making, and by the fact that AI trained on historical data can reinforce historical actions and biases that should be uncovered. We performed a systematic literature review of work to date on the application of XAI in CDSSs. XAI-enabled systems processing tabular data are the most common in the literature, while XAI-enabled CDSSs for text analysis are the least common. Developers showed greater interest in providing local explanations, while post-hoc and ante-hoc explanations, and model-specific and model-agnostic techniques, were almost evenly represented. Studies reported benefits of XAI such as enhancing clinicians' decision confidence and generating hypotheses about causality, which ultimately increase the trustworthiness and acceptability of a system and the potential for its incorporation into the clinical workflow.
However, we found a distinct overall lack of application of XAI in the context of CDSSs and, in particular, a lack of user studies exploring the needs of clinicians. We propose guidelines for the implementation of XAI in CDSSs and explore opportunities, challenges, and future research needs.
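The distinctions surveyed above (local vs. global explanations, ante-hoc vs. post-hoc techniques) can be illustrated with a minimal, purely hypothetical sketch: for an inherently interpretable (ante-hoc) linear risk model, a local explanation for one patient is simply each feature's weight multiplied by its value. The feature names and weights below are invented for illustration and do not come from any reviewed system.

```python
import math

# Hypothetical ante-hoc interpretable model: logistic regression with
# hand-set weights for three illustrative (standardised) clinical features.
WEIGHTS = {"bmi": 0.8, "age": 0.3, "glucose": 1.2}
BIAS = -2.0

def predict_risk(patient):
    """Probability of the positive class: sigmoid of the linear score."""
    score = BIAS + sum(WEIGHTS[f] * patient[f] for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

def local_explanation(patient):
    """Per-feature contributions to the linear score for ONE patient.

    Because the model is linear, these contributions are exact; this is
    what makes the model ante-hoc explainable. A black-box model would
    instead need a post-hoc approximation (e.g. LIME or SHAP).
    """
    return {f: WEIGHTS[f] * patient[f] for f in WEIGHTS}

patient = {"bmi": 1.5, "age": 0.2, "glucose": 2.0}
print(predict_risk(patient))
print(local_explanation(patient))
```

A model-agnostic post-hoc method would recover comparable per-feature attributions without access to `WEIGHTS`, which is why the reviewed literature treats the two families as interchangeable in output but different in fidelity.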
Nitrogen (N) deposition significantly affects the soil carbon (C) cycle of forests; however, the influence of different forms of N remains unclear. In this work, ammonium nitrate was selected as the inorganic N (IN) source, while urea and glycine were chosen as organic N (ON) sources. Mixtures with different IN-to-ON ratios (1:4, 2:3, 3:2, 4:1, and 5:0) but equal total N were used to fertilize temperate forest soils for 2 years. Results showed that IN deposition inhibited soil C cycle processes, such as soil respiration, soil organic C decomposition, and enzymatic activities, and induced the accumulation of recalcitrant organic C. By contrast, ON deposition promoted these processes. Addition of ON also accelerated the transformation of recalcitrant compounds into labile compounds and increased CO2 efflux. Greater ON deposition may therefore shift forest soils from a C sink to a C source. These results indicate the importance of the IN-to-ON ratio in controlling the soil C cycle, which can consequently change the ecological effect of N deposition.
In contrast to the traditional approach of enhancing the dielectric properties of polymers by compositing them with rigid electronic conductors, an alternative strategy is reported here: introducing ionically conductive liquid electrolytes as functional fillers. The dielectric constant is significantly improved (by up to 600%) by liquid electrolyte inclusions in an elastomer matrix. Moreover, by taking advantage of the inherent transparency of the liquid electrolyte fillers, high transparency, good stretchability, and a high dielectric constant are achieved simultaneously. Using the composite elastomer, a highly sensitive strain sensor with 5–6 times higher sensitivity than the pristine elastomer, and a flexible electroluminescent device with a greatly lowered driving voltage, are demonstrated. The strategy opens new opportunities for electroactive polymer applications, including flexible touchscreen panels and displays, biomimetic soft machines, and smart optics.
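The role of the dielectric constant in sensor sensitivity can be put in context with a back-of-the-envelope parallel-plate model: for an incompressible elastomer dielectric under uniaxial strain, capacitance scales roughly as C = C0(1 + strain), so a higher dielectric constant raises the absolute capacitance change the readout electronics must resolve. The geometry and permittivity values below are illustrative assumptions, not figures from the paper.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(eps_r, area_m2, thickness_m):
    """Parallel-plate capacitance: C = eps_r * eps0 * A / d."""
    return eps_r * EPS0 * area_m2 / thickness_m

def stretched_capacitance(c0, strain):
    """Incompressible dielectric under uniaxial strain: C = C0 * (1 + strain).

    Length grows by (1 + strain); width and thickness each shrink by
    1/sqrt(1 + strain), so the ratio A/d grows by (1 + strain) overall.
    """
    return c0 * (1.0 + strain)

# Illustrative comparison: a pristine elastomer (eps_r ~ 3, assumed) vs. a
# composite with a 6x higher dielectric constant (the "up to 600%" gain).
c_pristine = capacitance(3.0, 1e-4, 1e-4)    # 1 cm^2 electrode, 100 um thick
c_composite = capacitance(18.0, 1e-4, 1e-4)

# Absolute capacitance change at 50% strain -- the signal the readout sees.
dc_pristine = stretched_capacitance(c_pristine, 0.5) - c_pristine
dc_composite = stretched_capacitance(c_composite, 0.5) - c_composite
print(dc_composite / dc_pristine)
```

In this simplified model the composite's signal is larger in direct proportion to its dielectric constant, consistent with the reported sensitivity gain being of the same order as the permittivity improvement.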
Gestational Diabetes Mellitus (GDM), a common pregnancy complication associated with many maternal and neonatal consequences, occurs at increased rates in mothers with overweight and obesity. Interventions initiated early in pregnancy can reduce the rate of GDM in these women; however, untargeted interventions can be costly and time-consuming. We have developed an explainable machine learning-based clinical decision support system (CDSS) to identify at-risk women in need of targeted pregnancy intervention. Maternal characteristics and blood biomarkers at baseline from the PEARS study were used. After appropriate data preparation, synthetic minority oversampling technique and feature selection, five machine learning algorithms were applied with five-fold cross-validated grid search optimising the balanced accuracy. Our models were explained with Shapley additive explanations to increase the trustworthiness and acceptability of the system. We developed multiple models for different use cases: theoretical (AUC-PR 0.485, AUC-ROC 0.792), GDM screening during a normal antenatal visit (AUC-PR 0.208, AUC-ROC 0.659), and remote GDM risk assessment (AUC-PR 0.199, AUC-ROC 0.656). Our models have been implemented as a web server that is publicly available for academic use. Our explainable CDSS demonstrates the potential to assist clinicians in screening at-risk patients who may benefit from early-pregnancy GDM prevention strategies.
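The modelling pipeline described above (minority oversampling, five-fold cross-validated grid search optimising balanced accuracy, then evaluation by AUC-ROC and AUC-PR) can be sketched generically. This is a reconstruction on synthetic data, not the PEARS code: scikit-learn and a single logistic regression stand in for the five-algorithm stack, and simple interpolation between minority samples stands in for SMOTE.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, roc_auc_score
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)

# Synthetic, imbalanced stand-in for maternal characteristics + biomarkers.
X, y = make_classification(n_samples=600, n_features=10, weights=[0.8, 0.2],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Minority oversampling by interpolating between minority samples --
# a simplified stand-in for SMOTE (applied to the training split only).
minority = X_tr[y_tr == 1]
n_extra = int((y_tr == 0).sum() - (y_tr == 1).sum())
i = rng.integers(0, len(minority), n_extra)
j = rng.integers(0, len(minority), n_extra)
t = rng.random((n_extra, 1))
synthetic = minority[i] + t * (minority[j] - minority[i])
X_bal = np.vstack([X_tr, synthetic])
y_bal = np.concatenate([y_tr, np.ones(n_extra, dtype=int)])

# Five-fold cross-validated grid search optimising balanced accuracy.
search = GridSearchCV(LogisticRegression(max_iter=1000),
                      {"C": [0.01, 0.1, 1.0, 10.0]},
                      scoring="balanced_accuracy", cv=5)
search.fit(X_bal, y_bal)

# Held-out evaluation with the two metrics reported in the abstract.
proba = search.predict_proba(X_te)[:, 1]
auc_roc = roc_auc_score(y_te, proba)
auc_pr = average_precision_score(y_te, proba)
print(f"AUC-ROC={auc_roc:.3f}  AUC-PR={auc_pr:.3f}")
```

In the published system, SMOTE would replace the interpolation step, and Shapley additive explanations would then be computed on the fitted estimator to produce the per-patient feature attributions shown to clinicians.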