2016
DOI: 10.1016/j.neunet.2016.08.007

Piece-wise quadratic approximations of arbitrary error functions for fast and robust machine learning

Abstract: Most machine learning approaches stem from applying the principle of minimizing the mean squared distance, built on computationally efficient quadratic optimization methods. However, when faced with high-dimensional and noisy data, quadratic error functionals exhibit many weaknesses, including high sensitivity to contaminating factors and the curse of dimensionality. Therefore, many recent machine learning applications exploit properties of non-quadratic error functionals based on L…
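The core idea of the paper — approximating a subquadratic error function by pieces of quadratics so that quadratic optimization machinery still applies — can be sketched as follows. This is a minimal illustration for the absolute-value error |x|: on each interval between thresholds the function is a quadratic chosen to agree with |x| at both interval ends, and beyond the last threshold it is constant ("trimmed"). The threshold grid and coefficient formulas here are our own illustrative choices, not the exact construction from the paper.

```python
import numpy as np

def pqsq_abs(x, thresholds):
    """Piecewise-quadratic (PQSQ-style) approximation of |x|.

    On each interval [r_k, r_{k+1}) the value is a_k*x**2 + b_k, with
    a_k = 1/(r_k + r_{k+1}) and b_k = r_k*r_{k+1}/(r_k + r_{k+1}), so the
    quadratic piece matches |x| exactly at both interval ends.
    Beyond the last threshold the value is constant (trimming), which
    caps the influence of outliers on the fit.
    """
    r = np.asarray(thresholds, dtype=float)   # r[0] must be 0, r increasing
    ax = np.abs(np.asarray(x, dtype=float))
    out = np.full_like(ax, r[-1])             # trimmed: constant past r[-1]
    for k in range(len(r) - 1):
        a = 1.0 / (r[k] + r[k + 1])
        b = r[k] * r[k + 1] / (r[k] + r[k + 1])
        mask = (ax >= r[k]) & (ax < r[k + 1])
        out[mask] = a * ax[mask] ** 2 + b
    return out
```

Because each piece is quadratic, minimization over data points reduces to a sequence of weighted least-squares steps, while the constant tail makes the resulting estimator insensitive to gross outliers.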

Cited by 10 publications (6 citation statements)
References 33 publications (61 reference statements)
“…The approach is similar to the ‘data-driven’ trimmed k-means clustering [50]. Alternative ways of constructing robust principal graphs include using piece-wise quadratic approximations of subquadratic error functions (PQSQ potentials) [51], which provide computationally efficient non-quadratic error functions for approximating a dataset. These two approaches will be implemented in future versions of ElPiGraph.…”
Section: Methods (mentioning)
confidence: 99%
“…The author's idea is to represent a digital database as a visualized computer model, enabling the detection of a non-formalized relationship between navigation parameters and a volumetric geometric image synthesized on their matrix basis [21]. This approach yields a compromise solution when formal criteria for a fuzzy search for errors cannot be formulated [22], [23].…”
Section: Discussion (unclassified)
“…[25,26]. Even non-convex quasinorms and their tropical approximations are used efficiently to provide sparse and robust learning results [32]. Vapnik [26] defined a formalized fragment of machine learning using minimization of a risk functional that is the mathematical expectation of a general loss function.…”
Section: F. I. Tyutchev, English Translation By F. Jude (mentioning)
confidence: 99%
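The risk functional mentioned in the last statement can be written in its standard form (this is the textbook formulation, not a quotation from [26]): the expected risk is the mean of a loss function over the unknown data distribution, and learning minimizes its empirical counterpart over a sample:

```latex
R(f) = \int L\bigl(y, f(x)\bigr)\, dP(x, y),
\qquad
R_{\mathrm{emp}}(f) = \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr)
```

Choosing $L$ quadratic recovers mean-squared-error learning; the PQSQ construction of the present paper replaces $L$ by a piecewise-quadratic approximation of a subquadratic loss while keeping the optimization steps quadratic.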