2016
DOI: 10.1080/08839514.2016.1193716

ParkinsoNET: Estimation of UPDRS Score Using Hubness-Aware Feedforward Neural Networks

Abstract: Parkinson's disease is a frequent neurodegenerative disorder worldwide, with increasing incidence. Speech disturbance appears as the disease progresses. The UPDRS is a gold-standard tool for diagnosis and follow-up of the disease. We aim to estimate the UPDRS score from biomedical voice recordings. In this paper, we study the hubness phenomenon in the context of UPDRS score estimation and propose hubness-aware error correction for feed-forward neural networks in order to increase the accuracy of …
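The following is a minimal sketch, not the authors' implementation, of the two ingredients the abstract names: quantifying hubness via k-occurrence counts of training instances, and fitting a plain feed-forward regressor for the UPDRS score. All data, feature dimensions, and the `k_occurrence` helper are hypothetical placeholders; the paper's hubness-aware error correction itself is not reproduced.

```python
# A minimal sketch (not the authors' implementation): measure hubness via
# k-occurrence counts, then fit a baseline feed-forward regressor for UPDRS.
# All data below are synthetic placeholders.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

def k_occurrence(X, k=5):
    """N_k(x): how often each instance appears among the k nearest neighbors
    of the other instances. A strongly skewed N_k distribution indicates
    hubness in the feature space."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)                      # column 0 is the point itself
    return np.bincount(idx[:, 1:].ravel(), minlength=len(X))

# Hypothetical stand-in for the voice-feature matrix and UPDRS targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
y = rng.uniform(0.0, 100.0, size=200)              # synthetic UPDRS-like scores

X_std = StandardScaler().fit_transform(X)
N_k = k_occurrence(X_std, k=5)
print("max / mean k-occurrence:", N_k.max(), N_k.mean())

# Baseline feed-forward network; a hubness-aware variant would reweight the
# training error of each instance using N_k.
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X_std, y)
print("in-sample R^2:", round(model.score(X_std, y), 3))
```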

Cited by 16 publications (3 citation statements); references 36 publications.
“…Usually, the first few PCs can capture most of the variations present in the dataset. The first principal component contains the maximum variability present in the data, and each succeeding component contains as much of the remaining variability as possible. Therefore, the least important PCs can be dropped to reduce the dimensionality of the dataset while retaining the most valuable parts of all the variables [46].…”
[Table residue omitted from the quote: a checklist of prior PD studies — Babu and Suresh [17], Buza and Varga [18], Al-Fatlawi et al. [19], Åström and Koker [20], Li et al. [21], Avci and Dogantekin [22], Peterek et al. [23], Froelich et al. [24], Guo et al. [25], Polat [26], Eskidere et al. [27], Chen et al. [28], Hariharan et al. [29], Jain and Shetty [30], Behroozi and Sami [31].]
Section: PCA (mentioning)
confidence: 99%
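As a brief illustrative sketch of the PCA step the quoted passage describes (not taken from the cited paper's pipeline): standardize the features, keep only the leading components that explain most of the variance, and drop the rest. The feature matrix below is a random placeholder.

```python
# Illustrative only: PCA dimensionality reduction as described in the quote
# above. The feature matrix is a random placeholder, not the real voice data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(195, 22))                 # placeholder feature matrix

X_std = StandardScaler().fit_transform(X)      # PCA is sensitive to scale
pca = PCA(n_components=0.95)                   # keep PCs explaining 95% of variance
X_reduced = pca.fit_transform(X_std)

print("components retained:", pca.n_components_)
print("cumulative explained variance:", pca.explained_variance_ratio_.sum().round(3))
```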
“…Many researchers have worked on the diagnosis of PD using machine learning (ML) techniques. Authors of [14,15] have used support vector machines (SVM), authors of [16–19] have used neural networks (NN), authors of [20,21] have used fuzzy logic (FL), authors of [22] have used genetic programming (GP), authors of [23] have used random forests (RF), authors of [24] have used decision trees (DT), authors of [25] have used GP and expectation maximization (EM), authors of [26] have used KNN, FL, and K-means (KM), authors of [27] have used SVM and NN, authors of [28] have used SVM, KNN, FL, and principal component analysis (PCA), authors of [29] have used NN, EM, PCA, and linear discriminant analysis (LDA), authors of [30] have used KNN, NN, and association rules (AR), and authors of [31] have used SVM, KNN, and naïve Bayes (NB). Table 1 shows the various studies conducted on the automated diagnosis of PD.…”
Section: Introduction (mentioning)
confidence: 99%
“…Furthermore, these features enhance the performance of the Machine Learning (ML) model [15–18]. For PD classification, Decision Tree [19], Random Forest [20], Logistic Regression [21], SVM [22,23], and Neural Network [24–26] models are employed. We accessed the PD dataset from the University of California, Irvine (UCI) ML repository [27] to conduct the experimental analysis.…”
Section: Introduction (mentioning)
confidence: 99%
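For orientation, a hedged sketch of the kind of PD classification pipeline the quoted passage lists (an SVM here). The data are synthetic stand-ins, not the UCI Parkinson's voice dataset the cited studies use.

```python
# Hedged illustration of a generic PD classification pipeline (SVM) of the
# kind listed in the quote above; synthetic stand-in data, not the UCI dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
X = rng.normal(size=(195, 22))                 # placeholder acoustic features
y = rng.integers(0, 2, size=195)               # 1 = PD, 0 = healthy (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```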