2017
DOI: 10.1016/j.patcog.2017.01.018

Granger Causality Driven AHP for Feature Weighted kNN

Cited by 32 publications (9 citation statements) · References 42 publications
“…It is often used to solve nonlinear problems, such as credit ratings and bank customer rankings, in which the collected data do not always follow the theoretical linear assumption; thus it should be one of the first choices when there is little or no prior knowledge about the data distribution. In addition, it can successfully reduce the influence of the variables on the experimental process [13]. It has higher forecasting accuracy, makes no assumptions about the collected data, and, in particular, is not sensitive to outliers.…”
Section: Introduction
confidence: 99%
“…Their proposed improved K-NN algorithm is applied to classification, regression, and missing-data imputation with superior results. Bhattacharya et al. [13] employ the weights obtained from the analytic hierarchy process (AHP) for the different features to propose a weighted distance function for the K-NN algorithm. Their results demonstrate that the performance of the proposed K-NN classifier improves when the features are weighted via pairwise comparison.…”
Section: Introduction
confidence: 99%
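
The feature-weighted distance described in the excerpt above is straightforward to sketch. The snippet below is a minimal illustration, not the authors' implementation: weighted_knn_predict and the weight vector w are hypothetical names, and the fixed weights merely stand in for the priority vector that an AHP pairwise-comparison step would produce.

import numpy as np
from collections import Counter

def weighted_knn_predict(X_train, y_train, x, weights, k=3):
    # Feature-weighted Euclidean distance:
    # d(a, b) = sqrt(sum_j w_j * (a_j - b_j)^2)
    diffs = X_train - x
    dists = np.sqrt((weights * diffs ** 2).sum(axis=1))
    # Majority vote among the k nearest training points
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest].tolist()).most_common(1)[0][0]

# Toy usage: w plays the role of an AHP priority vector (assumed values)
X = np.array([[1.0, 0.2], [0.9, 0.4], [0.1, 0.9], [0.2, 0.8]])
y = np.array([0, 0, 1, 1])
w = np.array([0.7, 0.3])  # hypothetical AHP-derived feature weights
print(weighted_knn_predict(X, y, np.array([0.8, 0.3]), w, k=3))  # -> 0
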
“…robustness [19–31]. In biological sciences, examples of such knowledge integration include inference of biological networks [24, 32] and causal pathway modelling [33, 34].…”
Section: Integration Of Mechanistic Immunological Knowledge Into a Ma…
confidence: 99%
“…For supervised learning, the training samples include both inputs and outputs (i.e., features and class labels), which yields better results than unsupervised learning in most cases [12]. Commonly used supervised algorithms include decision trees (DT) [13], naïve Bayes (NB) [14], k-nearest neighbor (kNN) [15–17], neural networks (NNs) [18, 19], and support vector machines (SVM) [20–22]. Among them, the SVM was first formally proposed by Cortes and Vapnik in 1995.…”
Section: Introduction
confidence: 99%