2021
DOI: 10.3390/app11135798

Nonparametric Bayesian Learning of Infinite Multivariate Generalized Normal Mixture Models and Its Applications

Abstract: This paper addresses the problem of modeling, classifying and recognizing data vectors using infinite mixture models, which have been shown to be an effective alternative to finite mixtures for selecting the optimal number of clusters. In this work, we propose a novel approach for modeling localized features using an infinite mixture model based on multivariate generalized Normal distributions (inMGNM). The statistical mixture is learned via a nonparametric MCMC-based Bayesian approach in order to …
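The nonparametric MCMC-based learning mentioned in the abstract can be illustrated with a collapsed Gibbs sampler for a Dirichlet-process mixture under the Chinese-restaurant-process representation, which lets the number of clusters grow with the data. This is a minimal sketch only: it uses a 1-D Gaussian base with known variance and toy data rather than the paper's multivariate generalized Normal components, and all hyperparameters (`alpha`, `sigma2`, `mu0`, `tau2`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated 1-D clusters (a stand-in for feature vectors).
data = np.concatenate([rng.normal(-4.0, 1.0, 60), rng.normal(4.0, 1.0, 60)])

alpha = 1.0             # DP concentration (assumed)
sigma2 = 1.0            # known observation variance (simplifying assumption)
mu0, tau2 = 0.0, 25.0   # Normal prior on cluster means (assumed)

def log_pred(x, members):
    """Log posterior-predictive density of x under a cluster containing
    `members` (Normal likelihood with known variance, Normal prior on mean)."""
    n = len(members)
    tau_n2 = 1.0 / (1.0 / tau2 + n / sigma2)
    mu_n = tau_n2 * (mu0 / tau2 + np.sum(members) / sigma2)
    var = tau_n2 + sigma2
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (x - mu_n) ** 2 / var

# Collapsed Gibbs sampling: each point is reassigned to an existing cluster
# (probability proportional to its size) or to a brand-new one (prop. alpha).
z = np.zeros(len(data), dtype=int)   # start with everyone in one cluster
for sweep in range(50):
    for i, x in enumerate(data):
        z[i] = -1                    # remove point i from its cluster
        labels = [k for k in set(z.tolist()) if k >= 0]
        logp = [np.log(len(data[z == k])) + log_pred(x, data[z == k])
                for k in labels]
        logp.append(np.log(alpha) + log_pred(x, np.array([])))  # new cluster
        logp = np.asarray(logp)
        p = np.exp(logp - logp.max())
        p /= p.sum()
        choice = rng.choice(len(p), p=p)
        z[i] = labels[choice] if choice < len(labels) else max(labels, default=-1) + 1

print("clusters found:", len(set(z.tolist())))
```

The key property this sketch shares with the paper's approach is that no cluster count is fixed in advance: the "new cluster" term keeps the model open-ended, and the sampler prunes and creates components as it sweeps.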

Cited by 4 publications (2 citation statements). References 68 publications.
“…This flexibility is particularly advantageous in scenarios where the true number of clusters is unknown or varies. This makes Bayesian nonparametric methods well-suited for diverse real-world applications [34,35]. Furthermore, these techniques inherently capture uncertainty in the clustering process, providing a probabilistic assignment of nodes to clusters.…”
Section: Literature | Citation type: mentioning (confidence: 99%)
“…One study proposed the hypothesis that feature selection techniques can be used to further improve the accuracy of DNNs (6, 7). Feature selection is the process of retaining relevant information and discarding irrelevant information (8). Feature selection methods can be either supervised or unsupervised, and supervised methods can be divided into wrapper, filter, or intrinsic methods.…”
Section: Introduction | Citation type: mentioning (confidence: 99%)
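The quoted passage distinguishes wrapper, filter, and intrinsic approaches to supervised feature selection. A minimal sketch of a filter-style selector follows: features are scored independently of any downstream model, here by absolute Pearson correlation with the target, and the top-k are kept. The dataset, the correlation criterion, and the cutoff `k = 2` are all illustrative assumptions, not details from the cited works (which may use different scoring criteria).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy supervised dataset: 5 features, only the first two carry signal.
n = 200
X = rng.normal(size=(n, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n) > 0).astype(float)

# Filter method: score each feature by |Pearson correlation| with the target,
# without training any model, then keep the k highest-scoring features.
scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
top_k = np.argsort(scores)[::-1][:2]
print("selected features:", sorted(top_k.tolist()))
```

In contrast, a wrapper method would repeatedly train the downstream model on candidate feature subsets, and an intrinsic method (e.g. L1-regularized models or tree-based importances) performs selection as a side effect of fitting.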