2023
DOI: 10.3390/math11132894
Inference Based on the Stochastic Expectation Maximization Algorithm in a Kumaraswamy Model with an Application to COVID-19 Cases in Chile

Abstract: Extensive research has been conducted on models that utilize the Kumaraswamy distribution to describe continuous variables with bounded support. In this study, we examine the trapezoidal Kumaraswamy model. Our objective is to propose a parameter estimation method for this model using the stochastic expectation maximization algorithm, which effectively tackles the challenges commonly encountered in the traditional expectation maximization algorithm. We then apply our results to the modeling of daily COVID-19 ca…

Cited by 3 publications (3 citation statements) · References 57 publications
“…For future research, we suggest conducting simulations with more inputs and outputs, as well as fewer decision-making units. It could also be worth considering a quantile regression model based on the Kumaraswamy distribution [38] instead of the beta regression model. This model could be a compelling alternative, especially when outliers are present in the response variable under consideration.…”
Section: Discussion
confidence: 99%
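The closed-form quantile function is what makes the Kumaraswamy distribution attractive for quantile regression, as the statement above suggests. A minimal sketch (the function names here are illustrative, not taken from the cited paper):

```python
import math

def kumaraswamy_quantile(p, a, b):
    """Closed-form quantile (inverse CDF) of Kumaraswamy(a, b) on (0, 1).

    The CDF is F(x) = 1 - (1 - x**a)**b, which inverts analytically to
    Q(p) = (1 - (1 - p)**(1/b))**(1/a) -- no numerical root-finding needed.
    """
    if not 0.0 < p < 1.0:
        raise ValueError("p must lie in the open interval (0, 1)")
    return (1.0 - (1.0 - p) ** (1.0 / b)) ** (1.0 / a)

# Example: the median of a Kumaraswamy(2, 3) variable
median = kumaraswamy_quantile(0.5, 2.0, 3.0)
```

Because the quantile is available in closed form, a quantile regression can parameterize a chosen quantile directly, which is the property that makes it a candidate alternative to beta regression when outliers distort the conditional mean.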
“…In recent years, other approaches have also been developed, including those using artificial intelligence methods such as cluster analysis methods, e.g., clustering with the EM algorithm or the k-means method, as well as neural networks [9][10][11][12][13][14][15].…”
Section: Methods Of Grouping Objects Into Sets Of Similar Objects
confidence: 99%
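Of the clustering methods named in that statement, k-means is the simplest to sketch: alternate assigning each point to its nearest centroid and recomputing centroids as cluster means. A self-contained sketch (not from any of the cited papers):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on points given as tuples of floats.

    Alternates two steps until iters runs out: assign every point to the
    nearest centroid (squared Euclidean distance), then move each centroid
    to the mean of its assigned points.
    """
    rng = random.Random(seed)
    centroids = [list(p) for p in rng.sample(points, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        for i, c in enumerate(clusters):
            if c:  # keep the old centroid if its cluster emptied out
                centroids[i] = [sum(v) / len(c) for v in zip(*c)]
    return centroids
```

Unlike EM on a mixture model, each point here is assigned "hard" to exactly one cluster; EM replaces that hard assignment with probabilistic responsibilities.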
“…The fundamental concept underlying this algorithm is the assumption that the dataset originates from an unobservable discrete random variable U, which signifies the mixture component responsible for generating each observation y_i. The algorithm iteratively fits these probabilities, updating them in each iteration until a convergence criterion is met [12].…”
Section: EM Algorithm
confidence: 99%