2020
DOI: 10.3390/diagnostics10110972
Statistical Physics for Medical Diagnostics: Learning, Inference, and Optimization Algorithms

Abstract: It is widely believed that cooperation between clinicians and machines may address many of the decisional fragilities intrinsic to current medical practice. However, the realization of this potential will require more precise definitions of disease states as well as their dynamics and interactions. A careful probabilistic examination of symptoms and signs, including the molecular profiles of the relevant biochemical networks, will often be required for building an unbiased and efficient diagnostic approach. An…

Cited by 4 publications (3 citation statements); references 117 publications (129 reference statements).
“…When it comes to data pre-processing, the difficulty lies less in collecting the data than in how well formed they are: whether they accurately represent the clinical scenario and whether the database as a whole conforms to standardized formats. Across the many studies pitting AI against human and algorithmic performance, a persistent risk is that the models have little or no clinical context, such as the patient's medical history or prior laboratory findings, which can be critical for selecting the proper treatment plan [18]. When it comes to processing the data, the systems that automatically exchange information with one another are often distributed across multiple heterogeneous and semantically incompatible systems, which leads to interoperability problems and inconsistencies [17].…”
Section: Data Pre-processing (mentioning)
Confidence: 99%
“…Postalcioglu and Kesli used the Naive Bayes method for pneumonia diagnosis [13]. Ramezanpour et al. mentioned a collaboration between clinicians and machines at the decision stage [14]. Khan et al. studied a deep-learning classification method for brain tumor types [15].…”
Section: Introduction (mentioning)
Confidence: 99%
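To make the probabilistic classifier mentioned in that citation statement concrete, the following is a minimal naive Bayes sketch over binary symptoms. The classes, symptoms, priors, and likelihoods are hypothetical toy values, not taken from Postalcioglu and Kesli [13] or from the reviewed paper.

import math

# Hypothetical toy example: naive Bayes diagnosis from binary symptom indicators.
# Priors and likelihoods are made up for illustration only.
priors = {"pneumonia": 0.05, "healthy": 0.95}
# P(symptom present | class)
likelihoods = {
    "pneumonia": {"fever": 0.80, "cough": 0.90, "chest_pain": 0.60},
    "healthy":   {"fever": 0.05, "cough": 0.10, "chest_pain": 0.02},
}

def posterior(symptoms_present):
    """Return P(class | observed symptoms) under the naive
    conditional-independence assumption."""
    log_scores = {}
    for cls, prior in priors.items():
        log_p = math.log(prior)
        for symptom, p_present in likelihoods[cls].items():
            p = p_present if symptom in symptoms_present else 1.0 - p_present
            log_p += math.log(p)
        log_scores[cls] = log_p
    total = sum(math.exp(v) for v in log_scores.values())
    return {cls: math.exp(v) / total for cls, v in log_scores.items()}

print(posterior({"fever", "cough"}))  # pneumonia becomes the more probable class

With these toy numbers, observing fever and cough makes pneumonia the more probable class despite its low prior, which is the basic mechanism such diagnostic classifiers rely on.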
“…Moreover, analytical and computational techniques of physics, in particular those derived from statistical physics of disordered systems, can be extended to large-scale problems, including machine learning, e.g., to analyze the weight space of deep neural networks [6,7].…”
Section: Introduction (mentioning)
Confidence: 99%
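As a rough illustration of the disordered-systems viewpoint that this quote alludes to, the sketch below treats the binary weights of a tiny perceptron as spin variables and samples them from a Gibbs measure over the training error using a Metropolis update. The data, inverse temperature, and energy function are toy assumptions, not the analysis carried out in refs. [6,7].

import random, math

# Toy sketch: binary perceptron weights as spins, sampled from P(w) ∝ exp(-beta * E(w)),
# where E(w) counts misclassified training examples.
random.seed(0)
data = [([+1, +1, -1], +1), ([-1, +1, +1], -1), ([+1, -1, +1], +1), ([-1, -1, -1], -1)]

def energy(w):
    """Number of training examples misclassified by sign(w · x)."""
    errors = 0
    for x, y in data:
        s = sum(wi * xi for wi, xi in zip(w, x))
        if (1 if s >= 0 else -1) != y:
            errors += 1
    return errors

def metropolis(beta=2.0, steps=5000):
    w = [random.choice([-1, +1]) for _ in range(3)]
    e = energy(w)
    for _ in range(steps):
        i = random.randrange(3)
        w[i] *= -1                      # propose a single spin flip
        e_new = energy(w)
        if e_new <= e or random.random() < math.exp(-beta * (e_new - e)):
            e = e_new                   # accept the move
        else:
            w[i] *= -1                  # reject: undo the flip
    return w, e

print(metropolis())  # a low-energy (few-error) weight configuration at large beta

At large beta the chain concentrates on low-error weight configurations, which is the sense in which learning can be recast as a statistical-mechanics problem over weight space.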