2016 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2016.7727534

Adaptive tangent distances in generalized learning vector quantization for transformation and distortion invariant classification learning

Cited by 14 publications (12 citation statements)
References 21 publications
“…This was also shown in [10], where the accuracy of the Madry model is significantly lower when considering a threshold t∞ > 0.3. Furthermore, the Madry model has outstanding robustness scores for gradient-based attacks in general. We attribute this effect to potential obfuscation of gradients as a side effect of the adversarial training procedure.…”
Section: Hypothesis Margin Maximization In the Input Space Produces R…
confidence: 99%
“…For the Generalized LVQ (GLVQ) [6], considered a differentiable cost-function-based variant of LVQ, robustness is theoretically anticipated because it maximizes the hypothesis margin in the input space [7]. This changes if the squared Euclidean distance in GLVQ is replaced by adaptive dissimilarity measures, such as in Generalized Matrix LVQ (GMLVQ) [8] or Generalized Tangent LVQ (GTLVQ) [9]. These first apply a projection and measure the dissimilarity in the corresponding projection space, also denoted as the feature space.…”
Section: Introduction
confidence: 99%
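The projection-then-measure idea described above can be sketched as a point-to-affine-subspace distance: with a prototype w and a matrix W of orthonormal tangent vectors, the tangent distance is the squared residual after projecting x − w onto the subspace. The function name and shapes below are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def tangent_distance(x, w, W):
    """Squared distance from x to the affine subspace {w + W @ theta}.

    Assumes the columns of W are orthonormal tangent vectors, so the
    optimal subspace coordinates are theta* = W.T @ (x - w) and the
    distance is the squared norm of the orthogonal residual.
    """
    r = x - w
    theta = W.T @ r            # best coordinates within the subspace
    return float(np.sum((r - W @ theta) ** 2))
```

Components of x − w that lie inside the tangent subspace (e.g. small rotations or distortions the subspace models) contribute nothing to the distance, which is what makes the measure transformation-invariant.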
“…Recent developments include tangent metrics to deal with invariances of data regarding classification, e.g. rotations of objects in images [51,52].…”
Section: Beyond the Euclidean World — GLVQ With Non-standard Dissimila…
confidence: 99%
“…Recently, GLVQ was also extended to deal with this problem. More specifically, GLVQ is provided with an adaptive tangent metric (GTLVQ), allowing it to determine the respective invariances/transformations automatically during classification learning, following the same principle as relevance learning [51,52].…”
Section: Relevance Learning and Related Variants
confidence: 99%
“…The Generalized Tangent Learning Vector Quantization (GTLVQ) [7] models the classification boundary implicitly by approximating the local data manifold structure via affine subspaces (tangent space approximations). Employing GTLVQ as a domain tangent discriminator is favorable for addressing the above issues because it (i) provides a locally invariant and reliable model in the adaptation process via subspace and online learning, (ii) can capture multi-modal manifold structures, and (iii) provides an interpretable model by visualizing points from the affine subspace to verify the adaptation process.…”
Section: Introduction
confidence: 99%
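The classification rule implied by this description — assign a sample to the class of the nearest affine subspace — can be sketched as follows. The data layout (a list of label/prototype/tangent-basis triples) is an illustrative assumption, not the cited implementation.

```python
import numpy as np

def classify(x, prototypes):
    """Label x by the nearest affine subspace (tangent-distance NN rule).

    `prototypes` is a list of (label, w, W) triples, where w is a
    prototype vector and W has orthonormal columns spanning the local
    tangent space around w.
    """
    def dist(w, W):
        r = x - w
        # squared residual orthogonal to the tangent subspace
        return np.sum((r - W @ (W.T @ r)) ** 2)
    return min(prototypes, key=lambda p: dist(p[1], p[2]))[0]
```

Because each class can own several (w, W) pairs, the rule captures multi-modal manifold structure, and plotting points w + W @ theta gives the interpretable visual check mentioned in the quote.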