2009 International Conference on Machine Learning and Cybernetics
DOI: 10.1109/icmlc.2009.5212413
A novel learning model-Kernel Granular Support Vector Machine

Cited by 14 publications (6 citation statements)
References 4 publications
“…There are three main types of Machine Learning [60]–[66], represented in Figure 7: in supervised learning, the algorithms are based on already categorized datasets, in order to understand the criteria used for classification and reproduce them [67]–[70]. In unsupervised learning, algorithms are trained on raw data, from which they try to extract patterns [71]–[76]. Finally, in reinforcement learning, the algorithm functions as an autonomous agent, which observes its environment and learns as it interacts with it [73], [75].…”
Section: Machine Learning (mentioning)
Confidence: 99%
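The excerpt above contrasts supervised, unsupervised, and reinforcement learning. A minimal sketch of the first two paradigms, assuming a scikit-learn environment and a synthetic two-class dataset (none of which come from the cited papers):

# Minimal sketch (not from the cited papers): contrasting supervised and
# unsupervised learning on a toy two-cluster dataset with scikit-learn.
import numpy as np
from sklearn.svm import SVC
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),   # cluster around (0, 0)
               rng.normal(4.0, 1.0, (50, 2))])  # cluster around (4, 4)
y = np.array([0] * 50 + [1] * 50)

# Supervised: the labels y drive the decision boundary.
clf = SVC(kernel="rbf").fit(X, y)
print("supervised prediction:", clf.predict([[0.5, 0.5], [4.2, 3.8]]))

# Unsupervised: only X is given; structure is inferred as clusters.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("unsupervised cluster ids:", km.predict([[0.5, 0.5], [4.2, 3.8]]))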
“…SVM is widely used in text classification [33,42], marketing, pattern recognition, and medical diagnosis [43]. Much further research has been done: GSVM (granular support vector machines) [44]–[46], FSVM (fuzzy support vector machines) [47]–[49], TWSVM (twin support vector machines) [50]–[52], VaR-SVM (value-at-risk support vector machines) [53], and RSVM (ranking support vector machines) [54]. Clustering methods [55] divide data into meaningful groups (see Figure 3) so that patterns in the same group are similar in some sense and patterns in different groups are dissimilar in the same sense.…”
Section: Classification (mentioning)
Confidence: 99%
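The excerpt lists GSVM among several SVM variants. A rough sketch of one common granular-SVM-style workflow follows; the granulation rule used here (k-means granules per class, with a standard SVM trained on the granule centroids) is an illustrative assumption, not the specific algorithm of the indexed paper:

# Rough sketch of a granular-SVM-style workflow (illustrative assumption,
# not the indexed paper's algorithm): granulate each class with k-means,
# then train a standard SVM on the granule centroids.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def granulate(X, y, granules_per_class=5, random_state=0):
    """Replace each class by the centroids of its k-means granules."""
    centers, labels = [], []
    for cls in np.unique(y):
        km = KMeans(n_clusters=granules_per_class, n_init=10,
                    random_state=random_state).fit(X[y == cls])
        centers.append(km.cluster_centers_)
        labels.append(np.full(granules_per_class, cls))
    return np.vstack(centers), np.concatenate(labels)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

Xg, yg = granulate(X, y)              # far fewer points than X
svm = SVC(kernel="rbf").fit(Xg, yg)   # train on granule representatives only
print("accuracy on the full set:", svm.score(X, y))

Training on granule representatives is one way granular methods trade a small loss of detail for a much smaller training problem; the citing papers discuss how this trade-off can hurt generalization.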
“…Therefore, data distribution errors are inevitable. These two aspects may reduce the generalization ability of GSVM [17].…”
Section: Introduction (mentioning)
Confidence: 96%
“…It considers two geometric aspects simultaneously: the first is the distance between samples and the best approximate hyperplane, and the second is the distance between the best approximate hyperplane and the obtained hyperplane. Considering the difference between granulation in kernel space and in the original space, a GSVM model based on kernel space is proposed by Guo et al. [17], and the rules of granulation in kernel space were given through geometric analysis. However, these approaches may not be effective for some datasets, where the distance between data cannot be measured by Euclidean distance.…”
Section: Introduction (mentioning)
Confidence: 99%
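The kernel-space granulation this excerpt refers to rests on measuring distances in the feature space induced by the kernel rather than in the original input space. A minimal sketch of that induced distance, assuming an RBF kernel and scikit-learn's rbf_kernel helper (the specific granulation rules of Guo et al. are not reproduced here):

# Minimal sketch, assuming an RBF kernel: the squared distance between two
# points in the kernel-induced feature space is k(x,x) - 2*k(x,z) + k(z,z),
# so granulating in kernel space means clustering with this distance rather
# than the Euclidean distance of the original space.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_space_distances(X, gamma=0.5):
    """Pairwise squared distances in the feature space of an RBF kernel."""
    K = rbf_kernel(X, gamma=gamma)   # K[i, j] = k(x_i, x_j)
    diag = np.diag(K)
    return diag[:, None] - 2.0 * K + diag[None, :]

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
print("kernel-space distances:\n", np.round(kernel_space_distances(X), 3))
print("original-space distances:\n",
      np.round(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1), 3))

Because k(x,x) = 1 for an RBF kernel, the induced squared distance saturates at 2 for far-apart points, which illustrates how kernel-space geometry can differ sharply from the original Euclidean geometry.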