2021
DOI: 10.32604/cmc.2021.016957
An Approach Using Fuzzy Sets and Boosting Techniques to Predict Liver Disease

Abstract: The aim of this research is to develop a mechanism to help medical practitioners predict and diagnose liver disease. Several systems have been proposed to help medical experts by diminishing error and increasing accuracy in diagnosing and predicting diseases. Among many existing methods, only a few have considered the class imbalance issues of liver disorder datasets. Not all samples in liver disorder datasets are useful; some do not contribute to the learning of the classifier. A few samples might be redundant,…

Cited by 16 publications (5 citation statements)
References 41 publications
“…Furthermore, the BiLSTM captures all semantic features and can generate decent writing of the remarks. This research paper compared its proposed BiLSTM architecture with existing Naive Bayesian, CNN [31], RNN, and LSTM [32]. Lastly, the opinion characteristic of a message is determined using a co-evolutionary network with SoftMax maps.…”
Section: Related Research
Mentioning, confidence: 99%
“…CC: calibrated classifier [26][27][28], VC: voting classifier, SVC: support vector classifier, DT: decision tree, ANNs: artificial neural networks [29], RNN: recurrent neural network, LSTM: long short-term memory, GNB: Gaussian naive Bayes, K-NN: K-nearest neighbor [30][31][32], ETC: extra trees classifier, NB: naïve Bayes, GBM: gradient boosting machine, RF: random forest, LR: logistic regression, and SGD: stochastic gradient descent.…”
Mentioning, confidence: 99%
“…For evaluating the different models, a confusion matrix is generally prepared. Table 2 gives a simple representation of the confusion matrix [34, 35], which relates predicted values to actual values. From the confusion matrix, we can derive different performance metrics, e.g., accuracy, precision, recall, sensitivity, and F-score.…”
Section: Evaluation Criteria for Effective Measure of Model
Mentioning, confidence: 99%
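The statement above describes deriving metrics from a confusion matrix. As a minimal, hedged sketch (not code from the cited paper; the count values are placeholders chosen for illustration), the named metrics follow from the four binary confusion-matrix counts like this:

```python
# Minimal sketch: deriving common metrics from binary confusion-matrix counts.
# The counts below are placeholder values, not results from the cited paper.
tp, fp, fn, tn = 40, 5, 10, 45  # true/false positives and negatives (assumed)

accuracy  = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)          # recall is the same quantity as sensitivity
f_score   = 2 * precision * recall / (precision + recall)

print(f"accuracy={accuracy:.3f}, precision={precision:.3f}, "
      f"recall={recall:.3f}, f_score={f_score:.3f}")
```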
“…After this modification, the resulting score will have the same relative proportions and can still be used effectively in machine learning (Kumar & Thakur, 2019, 2021a, 2021b; D. Rajput, Thakur, Thakur, & Sahu, 2012). Algorithm for finding the most similar examples:…”
Section: Euclidean Distance
Mentioning, confidence: 99%
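The quoted passage ends with a reference to an algorithm for finding the most similar examples, whose body is elided above. Assuming a plain Euclidean-distance nearest-example search, a sketch could look like the following (function names and sample vectors are illustrative, not from the cited work):

```python
import math

def euclidean_distance(a, b):
    """Euclidean distance between two equal-length numeric feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def most_similar(query, examples, k=3):
    """Return the k examples closest to the query by Euclidean distance.

    Illustrative sketch only; not the algorithm elided in the citation.
    """
    ranked = sorted(examples, key=lambda ex: euclidean_distance(query, ex))
    return ranked[:k]

# Usage with made-up feature vectors (placeholder values).
examples = [[0.2, 1.1, 3.4], [0.3, 0.9, 3.1], [2.0, 5.0, 0.5]]
print(most_similar([0.25, 1.0, 3.3], examples, k=2))
```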