2010
DOI: 10.1007/s10916-010-9621-x
A Study on Hepatitis Disease Diagnosis Using Probabilistic Neural Network

Abstract: Hepatitis is a major public health problem all around the world. Hepatitis disease diagnosis via proper interpretation of hepatitis data is an important classification problem. In this study, a comparative hepatitis disease diagnosis study was carried out using a probabilistic neural network structure. The results were compared with those of previously reported studies on hepatitis disease diagnosis that used the same UCI machine learning database.

Cited by 44 publications (21 citation statements)
References 15 publications
“…Furthermore, the classification outcomes of the 3SVM approach are compared with the results of other published approaches. Table IV compares 30 methods proposed in the literature, as listed in [6], [7], [23] and [4]. The comparison shows that 3SVM gives better results than the other methods: it enhances classification performance, with accuracy gains of 2.5%, 1.98% and 7.5% over the recently published methods [32], [7] and [4].…”
Section: A. Results and Discussion (mentioning)
Confidence: 96%
“…The obtained accuracy is 96.25%, which is the best accuracy rate when compared with other methods. In a recent study [4], the authors summarized most of the work in the area of hepatitis disease diagnosis and proposed a new method employing a probabilistic neural network structure, PNN (10xFC); the accuracy obtained was 91.25%.…”
Section: Introduction (mentioning)
Confidence: 99%
“…As seen in this table, classification with EMELM and without feature reduction shows better results in comparison with the other methods, as does ELM [12]: AIS, 82.00% (Ster and Dobnikar [16]); LDA, 86.40% (Ster and Dobnikar [16]); GAM, 89.20% (De Bock et al. [5]); PNN, 91.25% (Bascil and Oztekin [1]); MLP, 91.87% (Bascil and Temurtas [2]); ELM, 93.75% (Kaya and Uyar [11]); EMELM classifier, 93.75% [8]. The learning method of EMELM is less complex than that of ELM: in ELM, when the architecture of the network changes, the previous output weights are discarded and new output weights are calculated from an entirely new hidden-layer output matrix, whereas in EMELM, as the network grows at each stage, only the hidden-layer output matrix corresponding to the added nodes is calculated and the output weights are incrementally updated.…”
Section: Analysis and Experimental Results (mentioning)
Confidence: 99%
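The incremental output-weight update quoted above can be sketched with the standard block pseudo-inverse recursion used in error-minimized ELM; the data sizes, node counts, and sigmoid activation below are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden(X, W, b):
    """Random-feature hidden layer with sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Toy regression data (illustrative)
X = rng.normal(size=(50, 4))
T = np.sin(X).sum(axis=1, keepdims=True)

# Initial ELM: 5 hidden nodes; output weights come from the pseudo-inverse
W1, b1 = rng.normal(size=(4, 5)), rng.normal(size=5)
H1 = hidden(X, W1, b1)
H1p = np.linalg.pinv(H1)

# Grow the network by 3 nodes: instead of recomputing pinv([H1 dH])
# from scratch, update it incrementally from H1p and the new columns dH
W2, b2 = rng.normal(size=(4, 3)), rng.normal(size=3)
dH = hidden(X, W2, b2)
D = np.linalg.pinv(dH - H1 @ (H1p @ dH))   # pinv of (I - H1 H1+) dH
U = H1p - H1p @ dH @ D
H2p_inc = np.vstack([U, D])                # incrementally updated pseudo-inverse
beta = H2p_inc @ T                         # updated output weights
```

The block formula is exact whenever the new hidden columns add rank, which holds generically for random features, so the incremental result matches a full recomputation while only factoring the 3 new columns.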
“…The feature reduction using the PCA method is described in the following. For a given p-dimensional data set X, the m principal axes T1, T2, …, Tm, where 1 ≤ m ≤ p, are orthonormal axes with maximum variance in the projected space; they are given by the m leading eigenvectors of the sample covariance matrix, Equation (1).…”
Section: Principal Component Analysis Algorithm (mentioning)
Confidence: 99%
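The principal-axis construction described in that passage can be sketched directly: center the data, form the sample covariance matrix, and take its m leading eigenvectors. This is a minimal illustration, not the cited paper's implementation.

```python
import numpy as np

def pca_axes(X, m):
    """Return the m leading principal axes of X (n samples x p features).

    The axes are the m leading eigenvectors of the sample covariance
    matrix: orthonormal directions of maximum variance in the
    projected space.
    """
    Xc = X - X.mean(axis=0)               # center the data
    S = np.cov(Xc, rowvar=False)          # p x p sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(S)  # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]     # reorder: largest variance first
    return eigvecs[:, order[:m]]          # p x m matrix of axes T1..Tm
```

Projecting with `Xc @ pca_axes(X, m)` then yields the m-dimensional reduced features used for classification.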
“…Each hidden node models a Gaussian function centered on a training sample. In the summation layer, there is an output unit for each class, which is connected to all of the hidden units belonging to the same class, without connections to other neurons [4]. The outputs are proportional to the density functions of different classes and normalized to form a sum equal to 1.…”
Section: Pattern (Hidden) Layer (mentioning)
Confidence: 99%
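The PNN structure described in that statement — one Gaussian pattern unit per training sample, a per-class summation layer, and outputs normalized to sum to 1 — can be sketched as follows; the smoothing parameter `sigma` and the function name are illustrative choices.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Minimal probabilistic neural network classifier.

    Each hidden (pattern) node is a Gaussian centered on one training
    sample; the summation layer sums activations per class, connected
    only to hidden units of its own class; outputs are normalized so
    the class scores sum to 1.
    """
    classes = np.unique(y_train)
    probs = np.zeros((len(X_test), len(classes)))
    for j, c in enumerate(classes):
        Xc = X_train[y_train == c]          # pattern units of class c
        # squared distance from every test point to every pattern unit
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        # Gaussian activations, summed per class (summation layer)
        probs[:, j] = np.exp(-d2 / (2.0 * sigma ** 2)).sum(axis=1)
    probs /= probs.sum(axis=1, keepdims=True)  # normalize to sum to 1
    return classes[np.argmax(probs, axis=1)], probs
```

The decision layer is simply the argmax over the normalized class scores, which approximate posterior class probabilities when the class priors are equal.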