2008
DOI: 10.1109/icpr.2008.4761208
Signature verification based on fusion of on-line and off-line kernels

Cited by 8 publications (6 citation statements)
References 14 publications
“…Classification based on Support Vector Machines (SVMs) has been used widely in many pattern recognition applications, such as handwritten signature verification [8], [14]. The SVM is a learning method introduced by Vapnik et al. [15] which seeks an optimal hyperplane separating two classes.…”
Section: Review of SVMs (mentioning)
confidence: 99%
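The two-class separation idea described in the statement above can be sketched with a standard SVM implementation. This is a minimal illustration using scikit-learn on synthetic 2-D points standing in for signature feature vectors; it is not the authors' implementation, and the features and class layout are invented for the example.

```python
import numpy as np
from sklearn.svm import SVC

# Toy stand-in for signature feature vectors: two separable classes
# (genuine = +1, forgery = -1). The features are synthetic, not the
# on-line/off-line kernels from the paper.
rng = np.random.default_rng(0)
genuine = rng.normal(loc=2.0, scale=0.5, size=(20, 2))
forgery = rng.normal(loc=-2.0, scale=0.5, size=(20, 2))
X = np.vstack([genuine, forgery])
y = np.array([1] * 20 + [-1] * 20)

# A linear SVM finds the maximum-margin hyperplane separating the classes.
clf = SVC(kernel="linear", C=1.0).fit(X, y)
preds = clf.predict([[2.0, 2.0], [-2.0, -2.0]])
print(preds)
```

A point deep in the "genuine" region is assigned +1 and a point in the "forgery" region -1, reflecting the separating hyperplane the statement refers to.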
“…Nakanishi et al. proposed a parameter combination in the Dynamic Time Warping (DTW) domain [7] for on-line signature verification. Mottl et al. proposed an algorithm combining on-line and off-line kernels [8] for signature verification using an SVM. Recently, a combination of the off-line image and the dynamic information obtained from the same signature [9] has been proposed, exploiting both global and local information.…”
Section: Introduction (mentioning)
confidence: 99%
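The kernel-fusion idea mentioned in the statement above — combining an on-line kernel and an off-line kernel inside an SVM — can be sketched by summing two precomputed Gram matrices. The features, the linear kernels, and the mixing weight `alpha` below are all assumptions for illustration, not the kernels or weighting used in the cited paper.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical precomputed kernels over the same n training signatures:
# K_on from on-line (dynamic) features, K_off from off-line (image) features.
rng = np.random.default_rng(1)
n = 30
A = rng.normal(size=(n, 5))   # stand-in on-line features
B = rng.normal(size=(n, 8))   # stand-in off-line features
y = np.array([1] * 15 + [-1] * 15)
A[:15] += 1.5                 # shift one class so the data are separable
B[:15] += 1.5

K_on = A @ A.T                # linear kernels as a simple stand-in
K_off = B @ B.T

# Fusion by a convex combination of the two kernels; alpha is a
# hypothetical tuning parameter, not taken from the paper.
alpha = 0.5
K = alpha * K_on + (1 - alpha) * K_off

clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
acc = (clf.predict(K) == y).mean()   # training accuracy with the fused kernel
print(acc)
```

Because any convex combination of positive semi-definite kernels is again a valid kernel, the fused matrix can be passed directly to an SVM with `kernel="precomputed"`.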
“…Nakanishi et al. proposed a parameter combination in the Dynamic Time Warping (DTW) domain [4] for on-line signature verification. Mottl et al. [5] proposed an algorithm combining on-line and off-line kernels for signature verification using an SVM. Recently, a combination of the off-line image and the dynamic information obtained from the same signature [6] has been proposed, exploiting both global and local information.…”
Section: Introduction (mentioning)
confidence: 99%
“…The feature selectivity of this SVM generalisation is parametrically determined by $\mu$, $0 \le \mu < \infty$. As $\mu \to 0$, the variances tend toward unity (10) and the RKM degenerates to the classical SVM (2). Conversely, as $\mu \to \infty$, we have from (6) the criterion $\sum_{i=1}^{n}\left[(1/r_i)\,a_i^2 + (1+\mu)\ln r_i\right] + C\sum_{j=1}^{N}\delta_j \to \min$; a significantly more selective training criterion than the original RKM (without supervised selectivity):…”
Section: The Continuous Training Technique With Supervised Selectivity (mentioning)
confidence: 99%
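The selectivity effect of $\mu$ in the criterion quoted above can be checked numerically. The closed-form minimiser over $r_i$ used below (set the derivative $-a_i^2/r_i^2 + (1+\mu)/r_i$ to zero, giving $r_i = a_i^2/(1+\mu)$) is a one-line derivation of ours for illustration, not quoted from the paper; the sample values of $a_i$, $\mu$, and $C$ are arbitrary.

```python
import numpy as np

def rkm_criterion(a, r, delta, mu, C):
    """Supervised-selectivity RKM objective, in the notation of the
    quoted statement: sum_i [ a_i^2/r_i + (1+mu) ln r_i ] + C sum_j delta_j."""
    return np.sum(a**2 / r + (1 + mu) * np.log(r)) + C * np.sum(delta)

a = np.array([1.0, 0.5, 0.1])   # arbitrary hyperplane components

# Minimising each term over r_i gives r_i = a_i^2 / (1 + mu), so larger
# mu shrinks the kernel weights r_i toward zero: the selectivity effect.
for mu in (0.0, 10.0, 1000.0):
    r_opt = a**2 / (1 + mu)
    print(mu, r_opt)
```

As $\mu$ grows, every optimal $r_i$ shrinks toward zero, which is exactly the increased tendency to suppress constituent kernels that the statement describes.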
“…In the following paper, we show, following [9] and [10], how selectivity may be incorporated into the Relevance Kernel Machine (RKM) [4,5], a continuous wrapper FS method previously described by the authors. The desired selectivity is achieved through a meta-parameter that controls the tendency of the RKM to generate zero components in the orientation of the decision plane (and hence the degree of elimination of constituent kernels).…”
Section: Introduction (mentioning)
confidence: 99%