2021
DOI: 10.1016/j.patcog.2020.107526

A generalized weighted distance k-Nearest Neighbor for multi-label problems

Cited by 34 publications (12 citation statements)
References 27 publications
“…In recent years, map-reduce-based distributed and parallel machine learning algorithms have been a focus of the research community. The MapReduce framework has emerged as a powerful, robust, distributed parallel programming model [22][23][24][25] that provides good performance and efficient execution for large-scale data analytics applications, including data mining, web page access ranking, graph analysis, image classification, and bioinformatics [26][27][28][29][30][31][32][33][34][35][36][37][38][39][40][41][42][43][44]. Kolb et al. [45] investigated the use of the MapReduce programming model for parallel entity resolution with automated data partitioning on real-world datasets.…”
Section: Introduction
confidence: 99%
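The excerpt above describes the MapReduce programming model only in general terms. As a rough illustration, the sketch below shows the map/shuffle/reduce pattern in plain Python with a process pool; the word-count task, the three hand-made partitions, and the helper names are assumptions for illustration and are not taken from the cited works.

```python
# Minimal sketch of the map/shuffle/reduce pattern (toy example, no Hadoop/Spark).
from collections import defaultdict
from multiprocessing import Pool

def map_phase(partition):
    """Emit (key, 1) pairs for every token in one data partition."""
    return [(token, 1) for record in partition for token in record.split()]

def shuffle(mapped_lists):
    """Group intermediate pairs by key across all partitions."""
    groups = defaultdict(list)
    for pairs in mapped_lists:
        for key, value in pairs:
            groups[key].append(value)
    return groups

def reduce_phase(item):
    """Aggregate the values collected for one key."""
    key, values = item
    return key, sum(values)

if __name__ == "__main__":
    partitions = [["a b a"], ["b c"], ["a c c"]]   # stands in for automated data partitioning
    with Pool(processes=3) as pool:
        mapped = pool.map(map_phase, partitions)   # parallel map over partitions
        counts = dict(pool.map(reduce_phase, list(shuffle(mapped).items())))
    print(counts)                                  # {'a': 3, 'b': 2, 'c': 3}
```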
“…The nearest neighbor (NN) [48] rule is a nonparametric, instance-based method for classification. Taking into account the problem in Figure 2, a product p in the MDD-DS follows the definition {p₁, p₂, …”
Section: Nearest-neighbors Model
confidence: 99%
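As a rough illustration of the nearest-neighbor rule mentioned in this excerpt, here is a minimal 1-NN sketch: a query point inherits the label of its closest stored instance. The Euclidean metric, the toy feature vectors, and the category names are assumptions and do not come from the MDD-DS data of the citing paper.

```python
# Minimal sketch of the nearest-neighbor (NN) classification rule.
import numpy as np

def nn_predict(X_train, y_train, x_query):
    """Return the label of the training instance closest to x_query."""
    distances = np.linalg.norm(X_train - x_query, axis=1)   # Euclidean distances
    return y_train[int(np.argmin(distances))]

# Toy example: products described by two numeric features.
X_train = np.array([[0.1, 0.9], [0.8, 0.2], [0.2, 0.8]])
y_train = np.array(["category_A", "category_B", "category_A"])
print(nn_predict(X_train, y_train, np.array([0.15, 0.85])))  # -> "category_A"
```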
“…Algorithm adaptation methods adapt existing single-label classification algorithms to multi-label data [9]. For example, Rastin et al. [36] proposed a prototype weighting method that adapts the distance measure of the ML-kNN method [12]; the prototype weights are adjusted by gradient ascent to maximize the macro-F1 measure as the objective function. Kouchaki et al. [37] designed multi-label random forest (MLRF) models for tuberculosis resistance classification and mutation ranking in medicine.…”
Section: Related Work
confidence: 99%
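To make the algorithm-adaptation idea concrete, the sketch below shows a weighted-distance k-NN for multi-label data: a per-feature weight vector enters the distance, and each label is predicted by a vote among the k nearest neighbors. The uniform weights, k = 3, and the majority-vote rule are assumptions for illustration; the gradient-ascent weight learning that maximizes macro-F1 in Rastin et al. [36] is not reproduced here.

```python
# Minimal sketch of a weighted-distance k-NN for multi-label classification.
import numpy as np

def weighted_knn_multilabel(X_train, Y_train, x_query, w, k=3):
    """Predict a binary label vector for x_query from its k nearest neighbors.

    X_train: (n, d) feature matrix; Y_train: (n, L) binary label matrix;
    w: (d,) per-feature weights used in the distance.
    """
    # Weighted Euclidean distance to every stored instance.
    d = np.sqrt(((X_train - x_query) ** 2 * w).sum(axis=1))
    neighbors = np.argsort(d)[:k]
    # A label is predicted if more than half of the neighbors carry it.
    return (Y_train[neighbors].mean(axis=0) > 0.5).astype(int)

# Toy data: 4 instances, 2 features, 3 labels.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
Y = np.array([[1, 0, 1], [1, 0, 0], [0, 1, 1], [0, 1, 1]])
w = np.ones(X.shape[1])   # uniform weights; learned weights would replace these
print(weighted_knn_multilabel(X, Y, np.array([0.15, 0.15]), w))  # -> [1 0 1]
```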