XXIV International Conference of the Chilean Computer Science Society
DOI: 10.1109/qest.2004.18

Robust Neural Gas for the Analysis of Data with Outliers

Abstract: The structure of real-world data is difficult both to recognize and to describe. It may contain high-dimensional clusters that are related in complex ways, and real data sets may also contain several outliers. Vector quantization techniques have been successfully applied as data mining tools. In particular, the Neural Gas (NG) is a variant of the Self-Organizing Map (SOM) in which the neighborhoods are adaptively defined during training through the ranking order of the distances of the prototypes fr…
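To make the rank-based neighborhood mentioned in the abstract concrete, here is a minimal sketch of a single Neural Gas adaptation step in Python/NumPy. It follows the standard Martinetz-Schulten rank-based rule rather than this paper's code; the step size `eps` and neighborhood range `lam` are hypothetical hyperparameters that a full implementation would anneal over training.

```python
import numpy as np

def neural_gas_step(prototypes, x, eps=0.1, lam=1.0):
    """One Neural Gas adaptation step (illustrative sketch).

    Every prototype w_j moves toward the sample x with a strength that
    decays exponentially with its rank in the sorted list of distances
    ||x - w_j||, i.e. the rank-based neighborhood the abstract describes.
    """
    dists = np.linalg.norm(prototypes - x, axis=1)  # distance of each prototype to x
    ranks = np.argsort(np.argsort(dists))           # rank 0 = closest prototype
    h = np.exp(-ranks / lam)                        # rank-based neighborhood function
    prototypes += eps * h[:, None] * (x - prototypes)
    return prototypes
```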

Cited by 3 publications (5 citation statements)
References 30 publications
“…Depending on the chosen penalization term, it is possible to achieve various effects such as sparsity or the grouping of coefficients for redundant variables. In its general form, regularized regression solves the following optimization problem:

$$\min_{\beta}\ \sum_{i=1}^{N}\Big(y_i-\sum_{k=1}^{m}\beta_k x_i^k\Big)^{2}+\lambda\sum_{k=1}^{m}f(\beta_k),\qquad(16)$$

where $N$ is the number of observations, $m$ is the number of independent variables in the matrix $\{x_i^k\}$, $\{y_i\}$ are the dependent variables (to be predicted), $\lambda$ is an internal parameter controlling the strength of regularization (the penalty on the amplitude of the regression coefficients $\beta$), and $f(z)$ is the regularizer function: $f(z)=\|z\|_{L_2}^{2}$ for ridge regression, $f(z)=\|z\|_{L_1}$ for lasso, and $f(z)=\frac{1-\alpha}{\alpha}\|z\|_{L_2}^{2}+\alpha\|z\|_{L_1}$ for the elastic net, respectively.…”
Section: PQSQ-based Regularized Regression (mentioning)
confidence: 99%
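As a concrete illustration of the three regularizers named in the quotation, here is a minimal NumPy sketch assuming the standard textbook forms; the exact elastic-net weighting in the cited paper may differ, and `regularized_loss` mirrors the generic objective (16) rather than reproducing the paper's implementation.

```python
import numpy as np

def ridge_penalty(beta):
    """f(z) = ||z||_{L2}^2: squared L2 norm (ridge)."""
    return np.sum(beta ** 2)

def lasso_penalty(beta):
    """f(z) = ||z||_{L1}: L1 norm (lasso)."""
    return np.sum(np.abs(beta))

def elastic_net_penalty(beta, alpha=0.5):
    """Mix of the L2^2 and L1 terms (standard elastic-net idea; the
    weighting of the L2 part in the cited paper may differ)."""
    return (1 - alpha) / alpha * np.sum(beta ** 2) + alpha * np.sum(np.abs(beta))

def regularized_loss(beta, X, y, lam, penalty):
    """Generic regularized regression objective in the spirit of (16):
    squared residuals plus lam times the chosen penalty on beta."""
    residuals = y - X @ beta
    return np.sum(residuals ** 2) + lam * penalty(beta)
```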
“…where the constant $a_{I(\beta_j)}$ (the index $I$ being defined by $r_I \le \beta_j < r_{I+1}$) is computed according to the definition of the function $u(x)$ (see (3)), given the estimates of the regression coefficients $\beta_k$ at the current iteration. In practice, iterating (18) converges within a few iterations; the algorithm can therefore run very fast and outperform the widely used least-angle regression algorithm for solving (16) in the case of $L_1$ penalties.…”
Section: PQSQ-based Regularized Regression (mentioning)
confidence: 99%
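The fixed-point iteration the quotation refers to can be sketched generically: at each pass, piecewise-constant coefficients are looked up from the current β, and a quadratic surrogate is solved in closed form. The helpers below are hypothetical stand-ins for the cited paper's equations (3) and (18), not a reproduction of them.

```python
import numpy as np

def pqsq_coefficient(beta_j, r, a):
    """Hypothetical stand-in for a_{I(beta_j)}: select the piecewise
    coefficient for the interval r_I <= |beta_j| < r_{I+1}."""
    I = np.searchsorted(r, abs(beta_j), side='right') - 1
    return a[min(max(I, 0), len(a) - 1)]

def pqsq_regression(X, y, lam, r, a, n_iter=10):
    """Iteratively reweighted sketch: each step solves a ridge-like
    surrogate  min_beta ||y - X beta||^2 + lam * sum_j w_j beta_j^2
    whose per-coefficient weights w_j come from the current beta.
    As the quotation notes, a few iterations typically suffice."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary least-squares start
    for _ in range(n_iter):
        w = np.array([pqsq_coefficient(b, r, a) for b in beta])
        beta = np.linalg.solve(X.T @ X + lam * np.diag(w), X.T @ y)
    return beta
```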
“…The learning process of the NG can be seen as a parameter estimation process, and its inference relies on the data [2]. When observations substantially different from the bulk of the data are present, they can badly influence the model structure and degrade the estimates.…”
Section: Robust M-Estimators for the Learning Process (mentioning)
confidence: 99%
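A common way to realize the robustness this passage calls for is an M-estimator weight that bounds the influence of far-away observations. The Huber weight below is a standard illustrative choice, not the authors' specific scheme; multiplying the adaptation strength of a prototype update (e.g. in the Neural Gas sketch above) by this weight of the sample-to-winner distance damps the pull of likely outliers.

```python
import numpy as np

def huber_weight(r, c=1.345):
    """Huber M-estimator weight: 1 for residuals within the cutoff c,
    decaying as c/|r| beyond it, so any single outlying observation
    exerts only a bounded influence on the estimated prototypes."""
    r = np.abs(np.asarray(r, dtype=float))
    return np.where(r <= c, 1.0, c / np.maximum(r, 1e-12))
```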
“…In [2] and [7] the authors empirically show that the Neural Gas lacks robustness, and they incorporate several robust strategies, such as an outlier-resistant scheme. In this paper we show that when deviations from idealized distributional assumptions occur, the behavior of the Neural Gas model can be drastically affected and will not preserve the topology of the feature space as desired.…”
Section: Introduction (mentioning)
confidence: 99%