2010
DOI: 10.1016/j.neucom.2009.11.017

Adaptive local dissimilarity measures for discriminative dimension reduction of labeled data

Abstract: Due to the tremendous increase of electronic information with respect to the size of data sets as well as their dimension, dimension reduction and visualization of high-dimensional data has become one of the key problems of data mining. Since embedding in lower dimensions necessarily includes a loss of information, methods to explicitly control the information kept by a specific dimension reduction technique are highly desirable. The incorporation of supervised class information constitutes an important specif…

Cited by 43 publications (23 citation statements)
References 29 publications
“…The concept of LiRaM LVQ can also be expanded to the use of localized rectangular matrices, representing several local linear projections. The global combination of these local linear patches by means of charting is discussed in Brand (2003) and Bunte, Hammer, Wismüller, and Biehl (2010b).…”
Section: LiRaM LVQ with Localized Dissimilarities
Mentioning confidence: 99%
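To make the "localized rectangular matrices" mentioned above concrete, the following minimal sketch shows how, in the standard localized matrix-LVQ formulation, a rectangular matrix Omega_j attached to a prototype w_j simultaneously defines an adaptive dissimilarity and a local linear projection. This is an illustrative assumption based on that general formulation, not code from the cited paper; the function and variable names are hypothetical.

```python
import numpy as np

def local_dissimilarity(x, w, omega):
    """Localized adaptive dissimilarity d_j(x) = ||Omega_j (x - w_j)||^2.

    A rectangular matrix Omega_j of shape (M, N) with M << N acts as a
    local linear projection; the induced local metric is
    Lambda_j = Omega_j^T Omega_j.
    """
    diff = x - w
    proj = omega @ diff           # local linear projection to M dimensions
    return float(proj @ proj)     # squared Euclidean norm in projected space

# Illustrative usage with random values (shapes only, not trained quantities):
rng = np.random.default_rng(0)
N, M = 10, 2                      # data dimension N, reduced dimension M
x = rng.normal(size=N)            # a data point
w = rng.normal(size=N)            # a prototype w_j
omega = rng.normal(size=(M, N))   # local rectangular matrix Omega_j
print(local_dissimilarity(x, w, omega))
```

In this reading, each prototype carries its own low-dimensional projection `omega @ (x - w)`, and charting (as discussed in Brand, 2003) would then stitch these local linear patches into a single global embedding.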
“…These algorithms have been employed successfully in a variety of scientific and commercial applications, including image analysis, bioinformatics, robotics, etc. (Biehl, Ghosh, & Hammer, 2007; Bojer, Hammer, Schunk, & von Toschanowitz, 2001; Bunte, Biehl, Petkov, & Jonkman, 2009; Bunte, Hammer, Schneider, & Biehl, 2009; Bunte, Hammer, Wismüller, & Biehl, 2010a; Hammer, Strickert, & Villmann, 2005a; Hammer & Villmann, 2002; Villmann, Merenyi, & Hammer, 2003). The method is easy to implement and its complexity is controlled by the user in a straightforward way.…”
Section: Introduction
Mentioning confidence: 99%
“…Detailed information about the algorithm, its parameters, running time and complexity can be found, for example, in [11, 2, 30, 7].…”
Section: Feature Transformation Obtained by LiRaM LVQ
Mentioning confidence: 99%