1952
DOI: 10.1007/bf02288916
Multidimensional scaling: I. Theory and method

Cited by 1,806 publications (1,113 citation statements)
References 4 publications
“…combining those dimensions that are redundant in the restraints space. In the most simple implementation of the MDS algorithm, distance would be the minimum number of links between two nodes [26,27]. In contrast, here we give distances the meaning of a travelling time, so the MDS algorithm will lump together those distances that correspond to fast interconversions (i.e.…”
Section: The Conformation Space Network Of Water
confidence: 99%
“…Early research into visual exploration of data led to approaches such as multidimensional scaling [21,11] and projection pursuit [6,9]. Most recent research on this topic (also referred to as manifold learning) is still inspired by the aim of multi-dimensional scaling; find a low-dimensional embedding of points such that their distances in the high-dimensional space are…”
Section: Related Work
confidence: 99%
“…The foundational ideas behind multidimensional scaling were first proposed by Young and Householder [25], then further developed by Torgerson [24] and given the name of MDS. Considerable research has gone into devising faster and more robust solutions.…”
Section: Previous Work
confidence: 99%
“…These spectral methods find embedding coordinates by computing the top eigenvectors of a "double-centered" transformation of the distance matrix sorted by decreasing eigenvalue. The original algorithm, Classic MDS [24], [25], computed a costly O(N^3) singular value decomposition of this matrix. Modern classical scaling methods quickly estimate the eigenvectors using the power method or other more sophisticated iterative methods that employ O(N^2) matrix-vector products.…”
Section: A Classical Scaling
confidence: 99%
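The classical-scaling procedure described in the statement above — double-center the squared distance matrix, then take its top eigenvectors — can be sketched as follows. This is an illustrative NumPy implementation of Torgerson's classical MDS, not code from the cited works; the function name and the small 4-point example are our own.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed N points in k dimensions
    from an (N, N) symmetric matrix D of pairwise distances."""
    N = D.shape[0]
    # Double-centering: B = -1/2 * J D^2 J, with J = I - (1/N) * ones
    J = np.eye(N) - np.ones((N, N)) / N
    B = -0.5 * J @ (D ** 2) @ J
    # Eigendecomposition of the symmetric matrix B; sort eigenvalues descending
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:k]
    # Coordinates are eigenvectors scaled by sqrt of (non-negative) eigenvalues
    scale = np.sqrt(np.maximum(eigvals[idx], 0.0))
    return eigvecs[:, idx] * scale

# Usage: recover 2-D coordinates of four corner points from their distances
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D, k=2)
# The embedding reproduces the pairwise distances up to numerical error
D_hat = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
assert np.allclose(D, D_hat, atol=1e-8)
```

The dense `eigh` call here is the O(N^3) step the quoted passage refers to; the iterative variants it mentions replace it with repeated O(N^2) matrix-vector products against B.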