2018
DOI: 10.1016/j.asoc.2018.04.020

Mixture of latent multinomial naive Bayes classifier

Cited by 50 publications (26 citation statements)
References 22 publications
“…We adopt these two criteria so as to validate the effectiveness of the proposed method. (iv) Testing results of RBFN with different kernel-clustering ways: From the results, we find that for most cases, results under the new RBF are better than those without RBF. Comparisons between our method and other methods, for example, Naïve Bayes [14, 15], AdaBoost [16] (the standard ensemble algorithm), C4.5 [17, 18] (the most popular decision tree method), the K-means method [19] (for determining the hidden structure directly), the k-nearest neighbour algorithm (KNN) [19], MLP [20] (a method that uses back propagation to estimate RBFNN), and probabilistic NN (PNN) [21] are also given.…”
Section: Methods (mentioning)
confidence: 99%
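
For readers unfamiliar with the baselines this excerpt lists, a minimal, hypothetical comparison sketch follows. It assumes scikit-learn and the Iris dataset, neither of which is specified by the citing paper, and it omits RBFN and PNN, which have no stock scikit-learn implementation.

# Hypothetical sketch (Python / scikit-learn): cross-validating the
# baseline classifiers named in the excerpt above on a stand-in dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

baselines = {
    "Naive Bayes": GaussianNB(),
    "AdaBoost": AdaBoostClassifier(),  # the standard ensemble algorithm
    # scikit-learn ships CART, not C4.5; entropy splits are the closest stand-in.
    "C4.5-style tree": DecisionTreeClassifier(criterion="entropy"),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "MLP": MLPClassifier(max_iter=2000),  # trained by back propagation
}

for name, clf in baselines.items():
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation accuracy
    print(f"{name:16s} mean accuracy = {scores.mean():.3f}")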
“…The basic assumption required to use it is that each feature (Tweet) is independent of the others and equally weighted (no effect on each other and the same weight). This kind of algorithm has been used in many studies (Harzevili and Alizadeh 2018; Jiang et al. 2016), and it has proven that accuracy, speed, and low computational cost are not its only advantages. Experiments demonstrate that the Bayes classifier is useful in many complex real-world situations and can outperform many similar-purpose classifiers (Baesens et al. 2003).…”
Section: Naïve Bayes Classifier (mentioning)
confidence: 99%
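
The conditional-independence assumption this excerpt describes has a standard formulation (this is the textbook naive Bayes decision rule, not a formula quoted from the cited paper): the predicted class maximizes the class prior times the product of per-feature likelihoods,

\hat{y} = \arg\max_{y} \; P(y) \prod_{i=1}^{n} P(x_i \mid y)

where each feature x_i is assumed to contribute independently given the class y, with no feature weighted above another.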
“…The Naive Bayes (NB) classifier is a simple but surprisingly powerful machine learning technique [11, 12]. Despite its naive design and apparently oversimplified assumptions, NB has worked quite well in many complex real-world situations such as real-time prediction, spam filtering, weather forecasting, and medical diagnosis [13, 14, 15].…”
Section: Introduction (mentioning)
confidence: 99%
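
Spam filtering, one of the applications this excerpt names, is a convenient way to see why the oversimplified assumptions still work in practice. The sketch below is hypothetical, using scikit-learn's MultinomialNB on invented toy messages; nothing in it comes from the cited papers.

# Hypothetical sketch (Python / scikit-learn): Naive Bayes as a toy spam
# filter; the messages and labels are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    "win a free prize now",       # spam
    "claim your free reward",     # spam
    "meeting moved to friday",    # ham
    "see you at lunch tomorrow",  # ham
]
labels = ["spam", "spam", "ham", "ham"]

# Bag-of-words counts feed the multinomial likelihoods P(x_i | y);
# each word is treated as independent given the class, per the NB assumption.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)

print(model.predict(["free prize tomorrow"]))  # likely ['spam'] on this toy data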