2020
DOI: 10.48550/arxiv.2009.08136
Multidimensional Scaling, Sammon Mapping, and Isomap: Tutorial and Survey

Benyamin Ghojogh,
Ali Ghodsi,
Fakhri Karray
et al.

Abstract: To appear as a part of an upcoming academic book on dimensionality reduction and manifold learning.

Cited by 12 publications (18 citation statements)
References 31 publications
“…To conclude this experimental section while opening up some perspectives, we illustrate in Figure 20 (right-hand side) the output of non-metric multidimensional scaling (MDS), using Sammon's nonlinear mapping criterion [60,61], applied to the means found by GMMSEQ with 14 clusters on campaign B. MDS allows us to visualize, in a two-dimensional space, how close the means are to each other. It is computed from the pairwise Euclidean distances between the means.…”
Section: Results
Confidence: 99%
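The excerpt above applies Sammon's nonlinear mapping to the pairwise Euclidean distances between cluster means. A minimal numpy sketch of Sammon's iterative mapping follows; the 14 "means" are random stand-ins (the GMMSEQ means are not given in the excerpt), and the learning rate and iteration count are illustrative:

```python
import numpy as np

def pairwise_dist(X):
    # Full Euclidean distance matrix of a point set.
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def sammon(X, n_iter=500, lr=0.1, seed=0):
    """Map X to 2-D by gradient descent on Sammon's stress."""
    rng = np.random.default_rng(seed)
    D = pairwise_dist(X)
    c = D.sum()                # normalizer of the stress (only rescales the step)
    np.fill_diagonal(D, 1.0)   # avoid division by zero on the diagonal
    Y = rng.normal(scale=1e-2, size=(X.shape[0], 2))
    for _ in range(n_iter):
        d = pairwise_dist(Y)
        np.fill_diagonal(d, 1.0)
        ratio = (D - d) / (D * d)   # per-pair weight in the stress gradient
        np.fill_diagonal(ratio, 0.0)
        grad = (-2.0 / c) * (ratio[:, :, None] * (Y[:, None, :] - Y[None, :, :])).sum(1)
        Y -= lr * grad
    return Y

# Random stand-ins for the 14 GMMSEQ cluster means (hypothetical data).
rng = np.random.default_rng(1)
means = rng.normal(size=(14, 6))
Y = sammon(means)
```

Each low-dimensional point is nudged away from (or toward) the others in proportion to how much its current distance undershoots (or overshoots) the input distance, which is the essence of Sammon's criterion.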
“…Multidimensional Scaling (MDS) tries to preserve the distances after projection onto its subspace (Cox & Cox, 2008; Ghojogh et al., 2020b). We saw in Proposition 2 that metric learning can be seen as projection onto the column space of U, where W = UU^⊤.…”
Section: Relevant to Multidimensional Scaling
Confidence: 99%
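The identity quoted above — a metric W = UU^⊤ amounting to projection onto the column space of U — can be checked numerically: the Mahalanobis-style distance under W equals the Euclidean distance between the projected points. The random U here is a stand-in for a learned metric:

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.normal(size=(2, 5))   # two points in R^5
U = rng.normal(size=(5, 2))      # projection directions (stand-in for a learned U)
W = U @ U.T                      # induced metric W = U U^T

# Mahalanobis-style distance under W ...
d_metric = float(np.sqrt((x - y) @ W @ (x - y)))
# ... equals the Euclidean distance between the projected points U^T x and U^T y.
d_proj = float(np.linalg.norm(U.T @ (x - y)))
```

The two numbers agree because (x − y)^⊤ U U^⊤ (x − y) = ‖U^⊤(x − y)‖².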
“…where D ∈ R^{n×n} is the matrix of squared Euclidean distances between points and H := I − (1/n)11^⊤ ∈ R^{n×n} is the centering matrix. Isomap also applies multidimensional scaling with a geodesic kernel, which uses piece-wise Euclidean distances for computing D (Tenenbaum et al., 2000; Ghojogh et al., 2020b). The row summation of this kernel matrix is (Yan et al., 2005; 2006):…”
Section: Special Cases of Graph Embedding
Confidence: 99%
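The double-centering the excerpt refers to is the core of classical MDS: form K = −½ H D H from the squared-distance matrix D with the centering matrix H = I − (1/n)11^⊤, then embed with the top eigenvectors of K. A minimal numpy sketch, with a toy planar configuration as sanity check:

```python
import numpy as np

def classical_mds(D2, k=2):
    """Classical (Torgerson) MDS from a squared-distance matrix D2."""
    n = D2.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix H = I - (1/n) 1 1^T
    K = -0.5 * H @ D2 @ H                 # double-centered Gram matrix
    w, V = np.linalg.eigh(K)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]         # keep the top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Sanity check: a planar point set is recovered up to rotation/translation,
# so the pairwise squared distances are reproduced exactly.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Y = classical_mds(D2, k=2)
D2_rec = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
```

When D2 comes from true Euclidean points of dimension ≤ k, K is positive semidefinite of rank ≤ k, so the embedding is exact.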
“…(6), the kernel used in kernel classical MDS or the geodesic kernel used in Isomap can be interpreted as the Laplacian matrix of the graph of data, as it satisfies its row-sum property. The optimization of MDS or Isomap is (Ghojogh et al., 2020b):…”
Section: Special Cases of Graph Embedding
Confidence: 99%
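As the excerpts describe, Isomap is classical MDS applied to graph-geodesic (piece-wise Euclidean) distances rather than raw Euclidean ones. A sketch using scipy's shortest paths over a k-nearest-neighbor graph; the neighborhood size and the toy circle data are illustrative choices:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def isomap(X, n_neighbors=4, k=2):
    """Embed X by classical MDS on graph-geodesic distances (Isomap)."""
    n = X.shape[0]
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    # Keep only each point's n_neighbors nearest edges (inf = no edge).
    G = np.full((n, n), np.inf)
    nn = np.argsort(D, axis=1)[:, 1:n_neighbors + 1]
    rows = np.repeat(np.arange(n), n_neighbors)
    G[rows, nn.ravel()] = D[rows, nn.ravel()]
    # Geodesic distances = shortest paths through the neighborhood graph.
    geo = shortest_path(G, method="D", directed=False)
    # Classical MDS on the squared geodesic distances.
    H = np.eye(n) - np.ones((n, n)) / n
    K = -0.5 * H @ (geo ** 2) @ H
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:k]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy data: points on a circle, whose k-NN graph is a connected ring.
theta = np.linspace(0, 2 * np.pi, 20, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta)]
Y = isomap(X, n_neighbors=4, k=2)
```

This matches the excerpt's description: the only change from classical MDS is that D is built from piece-wise Euclidean (geodesic) distances, so the double-centered kernel plays the same role.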