Improved support vector machine algorithm for heterogeneous data (2015)
DOI: 10.1016/j.patcog.2014.12.015

Cited by 42 publications (19 citation statements) · References 27 publications

“…What can still be mentioned is the method of probabilistic classifiers [25,26], or the recently popular method of support vectors [27][28][29][30]. Each of these methods has its advantages, but also limitations.…”
Section: Discussion (mentioning, confidence: 99%)
“…In addition, another set of methods, e.g., [14], [16], [29], [30], [31], [32], [33], learns metrics by using side-information such as labels, to guide metric learning for categorical data. The work in [29] is the first to consider label information for categorical data similarity, which uses labels to divide data into subsets and considers the attribute value distribution within these subsets.…”
Section: Related Work (mentioning, confidence: 99%)
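
The label-informed similarity idea summarized in this excerpt can be illustrated with a small sketch. The Python code below is a hypothetical, simplified illustration in the spirit of a value-difference metric: it splits the data by class label and compares two categorical values through their class-conditional frequency distributions. It is not the exact algorithm of [29] or [33]; the function names and toy data are assumptions made only for illustration.

```python
from collections import Counter

def value_class_distribution(values, labels):
    """For each categorical value, return its frequency distribution over the class labels."""
    classes = sorted(set(labels))
    counts = {}
    for v, c in zip(values, labels):
        counts.setdefault(v, Counter())[c] += 1
    return {v: [cnt[c] / sum(cnt.values()) for c in classes]
            for v, cnt in counts.items()}

def value_distance(dist, a, b):
    """L1 distance between the class-conditional distributions of two categorical values."""
    return sum(abs(p - q) for p, q in zip(dist[a], dist[b]))

# Toy categorical attribute ("color") with binary class labels (illustrative only).
colors = ["red", "red", "blue", "blue", "green", "green", "red"]
labels = [0, 0, 1, 1, 0, 1, 1]
dist = value_class_distribution(colors, labels)
print(value_distance(dist, "red", "blue"))   # values used mostly under different labels -> larger distance
print(value_distance(dist, "red", "green"))  # values with similar label usage -> smaller distance
```

The point of the sketch is only that class labels give categorical values a distributional representation, so two values can be compared even though they have no inherent numeric order.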
“…However, KDML suffers from information loss due to only using the matching method to capture the relationships in the data. Instead of considering class as side information, the method in [33] captures the class information and classification model information simultaneously. However, it only maps a categorical value to a numerical value, which cannot well represent a categorical value when it has high-dimensional embedding.…”
Section: Related Work (mentioning, confidence: 99%)
“…The result of classification in one-versus-one solution is the label of this class, which has been selected most frequently by the binary classifiers. Support vector method, although relatively young, has already found a number of practical applications, like [36][37][38][39][40][41].…”
Section: Machine Learning Using KNN, CART, CHAID and ANN Classifiers (mentioning, confidence: 99%)
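
The one-versus-one majority-vote rule described in this excerpt can be sketched briefly. The following is a minimal illustration using scikit-learn's OneVsOneClassifier with an SVM base learner on the Iris dataset; the library, dataset, and parameter choices are assumptions for illustration, not details taken from the cited works. One binary SVM is trained per pair of classes, and a sample receives the label selected most frequently by those binary classifiers.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# One binary SVM per pair of classes; the predicted label is the one chosen
# most frequently by the binary classifiers (majority vote).
ovo = OneVsOneClassifier(SVC(kernel="rbf", C=1.0)).fit(X_tr, y_tr)
print("one-vs-one accuracy:", ovo.score(X_te, y_te))
```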