2022
DOI: 10.3390/math10121994
Explainable Machine Learning for Longitudinal Multi-Omic Microbiome

Abstract: Over the years, research studies have shown there is a key connection between the microbial community in the gut, genes, and the immune system. Understanding this association may help discover the cause of complex chronic idiopathic disorders such as inflammatory bowel disease. Even though important efforts have been put into the field, the functions, dynamics, and causes of the dysbiosis states produced by the microbial community remain unclear. Machine learning models can help elucidate important connections and…


Cited by 4 publications (9 citation statements)
References 74 publications
“…Integrating explainable artificial intelligence with digital health data is gaining momentum in precision medicine, addressing the need for transparent and understandable models essential for clinical applicability [ 19 , 51 , 52 ]. As machine-learning models become more complex, interpretability is crucial in clinical contexts such as microbiome research [ 51 , 53 ].…”
Section: Results
confidence: 99%
“…Integrating explainable artificial intelligence with digital health data is gaining momentum in precision medicine, addressing the need for transparent and understandable models essential for clinical applicability [ 19 , 51 , 52 ]. As machine-learning models become more complex, interpretability is crucial in clinical contexts such as microbiome research [ 51 , 53 ]. Explainable artificial-intelligence applications can help predict an Alzheimer’s disease diagnosis in a pool of patients with mild impairments, showcasing how interpretable machine-learning algorithms can help explain complex patterns that inform individual patient predictions [ 54 , 55 ].…”
Section: Results
confidence: 99%
“…proposed a sparse neural encoder–decoder network which not only predicts metabolite abundances from microbiome data but also allows microbe–metabolite links to be interpreted from the hidden layer of the network [ 51 ]. Given the dynamic nature of microbiome composition in the human gut, several packages have been developed specifically to analyze time-series metabolomics data, such as MDITRE [ 52 ] and CGBayesNets [ 53 ]. Both of these tools combine Bayesian approaches with deep learning to predict human-interpretable rules for host status given taxonomic information.…”
Section: Data-driven Approaches: Machine and Deep Learning
confidence: 99%
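The sparse encoder–decoder idea quoted above can be sketched roughly as follows. This is a minimal illustrative sketch, not the cited architecture [ 51 ]: the dimensions, mask density, random untrained weights, and the link-tracing heuristic are all assumptions chosen for illustration. The key point it demonstrates is that when the encoder is sparse, nonzero weight paths from a taxon through a hidden factor to a metabolite can be read off as candidate microbe–metabolite links.

```python
import numpy as np

# Hypothetical dimensions (assumption): 6 microbial taxa -> 3 hidden factors -> 4 metabolites.
rng = np.random.default_rng(0)
n_taxa, n_hidden, n_metab = 6, 3, 4

# Sparsity mask on the encoder: each hidden factor connects to only a few taxa,
# so the surviving nonzero entries are interpretable as taxon-factor links.
mask = (rng.random((n_taxa, n_hidden)) < 0.4).astype(float)
W_enc = rng.normal(size=(n_taxa, n_hidden)) * mask   # masked (sparse) encoder weights
W_dec = rng.normal(size=(n_hidden, n_metab))         # dense decoder weights

def forward(x):
    """Predict metabolite abundances from microbiome abundances (untrained sketch)."""
    h = np.maximum(W_enc.T @ x, 0.0)  # ReLU hidden factors
    return W_dec.T @ h

x = rng.random(n_taxa)   # toy microbiome profile
y_hat = forward(x)       # predicted metabolite abundances, shape (4,)

# Interpretation heuristic (assumption): taxon i is linked to metabolite j if some
# hidden factor k has nonzero W_enc[i, k] and nonzero W_dec[k, j]; links[i, j]
# counts how many hidden factors connect the pair.
links = (np.abs(W_enc) > 0).astype(int) @ (np.abs(W_dec) > 0).astype(int)
```

In a real model the weights would be learned with a reconstruction loss plus a sparsity penalty (e.g. L1 on the encoder) rather than fixed by a random mask; the mask here only stands in for the sparsity that makes the hidden layer readable.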