2021
DOI: 10.3390/e23050552
Inference and Learning in a Latent Variable Model for Beta Distributed Interval Data

Abstract: Latent Variable Models (LVMs) are well-established tools for accomplishing a range of data processing tasks. Applications exploit the ability of LVMs to identify latent data structure in order to improve data (e.g., through denoising) or to estimate the relation between latent causes and measurements in medical data. In the latter case, LVMs in the form of noisy-OR Bayes nets represent the standard approach to relate binary latents (which represent diseases) to binary observables (which represent symptom…
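The noisy-OR Bayes net mentioned in the abstract models each binary symptom as being independently triggered by every active disease, with a small "leak" probability for symptoms arising without any disease. A minimal sketch of the noisy-OR likelihood, assuming illustrative parameter values and variable names (not taken from the paper):

```python
import numpy as np

def noisy_or(s, pi, leak=0.01):
    """P(symptom j = 1 | diseases s) under a noisy-OR Bayes net.

    s    : (D,) binary vector of latent disease states
    pi   : (J, D) matrix; pi[j, i] = P(disease i alone triggers symptom j)
    leak : baseline probability of the symptom with no active disease
    """
    # Each active disease independently *fails* to trigger symptom j
    # with probability (1 - pi[j, i]); noisy-OR multiplies the failures.
    fail = (1.0 - leak) * np.prod((1.0 - pi) ** s, axis=1)
    return 1.0 - fail

# Toy example: one symptom, two diseases, only disease 0 active.
pi = np.array([[0.8, 0.3]])
p = noisy_or(np.array([1, 0]), pi)  # 1 - 0.99 * 0.2 = 0.802
```

With all latents inactive, the expression reduces to the leak probability, which is why the leak term is factored into the failure product rather than added afterwards.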

Cited by 5 publications (3 citation statements). References 56 publications.
“…It is expected that the approach generalizes to other data sets, and if the number of patients increases in the future, it could be investigated whether more sophisticated machine learning methods improve classification performance. For example, Mousavi et al [42] developed a classification approach that can deal with CAFPAs as a continuous input variable, as well as with multiple findings being true for a patient. On the basis of the current CDSS framework, the integration of additional databases can be evaluated.…”
Section: Discussion
confidence: 99%
“…Sparse Coding (SC [76]), more generally, assumes that data can be accurately recovered using a set of few out of many possible compositional features. SC has been studied intensively for image processing and other tasks (e.g., [40,[77][78][79][80]).…”
Section: Algorithms
confidence: 99%
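The sparse-coding assumption quoted above — that data can be accurately recovered from a few out of many possible compositional features — is commonly formalized as minimizing ||y − Da||² + λ||a||₁ over the code a. A minimal ISTA (iterative soft-thresholding) sketch; the toy dictionary and parameter values are illustrative assumptions, not the method of any cited paper:

```python
import numpy as np

def ista(y, D, lam=0.1, n_iter=200):
    """Recover a sparse code a with y ≈ D @ a via iterative soft-thresholding."""
    # Step size 1/L, where L is the largest eigenvalue of D^T D
    # (squared spectral norm of D), guarantees convergence.
    step = 1.0 / np.linalg.norm(D, 2) ** 2
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - y)          # gradient of the quadratic term
        z = a - step * grad
        # Soft-thresholding: the proximal operator of the l1 penalty,
        # which zeroes out small coefficients and yields a sparse code.
        a = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return a

# Toy example: identity dictionary, one active feature.
D = np.eye(5)
y = np.zeros(5); y[0] = 2.0
a = ista(y, D, lam=0.1)  # a[0] ≈ 1.9, all other entries exactly 0
```

With an identity dictionary the fixed point is simply the soft-thresholded input, which makes the sparsifying effect of the λ penalty easy to see: the recovered coefficient is shrunk by λ·step, and coefficients below the threshold vanish entirely.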
“…Termed alternatively as lifelong or incremental learning, continual learning stands at the core of crafting AI systems that are adept in navigating the complexities of dynamic and evolving operational landscapes [3]. These landscapes are marked by variable data distributions, the advent of novel tasks, and the gradual obsolescence of older tasks, compelling a model's necessity to learn and adapt continuously without necessitating a reinitialization at every juncture of change [4][5][6]. The exposition delineates the criticality of continual learning in surmounting the constraints posed by static learning frameworks, particularly highlighting the phenomenon of catastrophic forgetting, where the assimilation of new information could inadvertently obliterate previously acquired knowledge [7].…”
Section: Introduction
confidence: 99%