2016
DOI: 10.1016/j.automatica.2015.12.024

Analysis of a nonsmooth optimization approach to robust estimation

Abstract: In this paper, we consider the problem of identifying a linear map from measurements which are subject to intermittent and arbitrarily large errors. This is a fundamental problem in many estimation-related applications such as fault detection, state estimation in lossy networks, hybrid system identification, robust estimation, etc. The problem is hard because it exhibits some intrinsic combinatorial features. Therefore, obtaining an effective solution necessitates relaxations that are both solvable at a reasona…
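
To make the setting concrete, the following is a minimal formalization of the estimation problem the abstract describes. The notation (A^o, x_t, f_t, e_t, N, ε) is ours rather than the paper's, and the combinatorial program below is only a sketch of the kind of problem the abstract calls hard; the paper's exact formulation sits behind the truncated abstract.

```latex
% Unknown linear map A^o observed through
%   y_t = A^o x_t + f_t + e_t,   t = 1, ..., N,
% where e_t is ordinary bounded noise and f_t is an intermittent error:
% zero for most t, but possibly arbitrarily large when it occurs.
% The ideal estimator minimizes the number of samples declared corrupted,
\begin{equation*}
  \min_{A,\,\{f_t\}} \ \#\{\, t : f_t \neq 0 \,\}
  \quad\text{s.t.}\quad \lVert y_t - A x_t - f_t \rVert \le \varepsilon
  \ \ \text{for all } t,
\end{equation*}
% a combinatorial problem, which is why tractable (convex, nonsmooth)
% relaxations are needed.
```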

Cited by 15 publications (29 citation statements) · References 42 publications
Citation statement types: 2 supporting, 27 mentioning, 0 contrasting

“…With the help of the device of self-decomposability amplitude (12), we can state a condition for exact recovery of the parameter matrix A_o by solving the optimization problem in (3). A similar result was proven in [3] for the Least Absolute Deviation (LAD) estimator.…”
Section: A. Exact Recoverability (supporting)
confidence: 73%
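
For reference, the Least Absolute Deviation (LAD) estimator this quote compares against has the generic form below. The notation is ours and problem (3) of the citing paper is not reproduced here; this is only the standard shape of an LAD fit.

```latex
\begin{equation*}
  \widehat{A}_{\mathrm{LAD}} \in \arg\min_{A} \ \sum_{t=1}^{N} \bigl\lVert y_t - A x_t \bigr\rVert_1
\end{equation*}
```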
“…Note that these problems are also of interest for the robust estimation of a single (non-hybrid) linear model in the presence of outliers, as will be considered in the examples of Sect. 5.3–5.4 (see also [4] for an analysis of the sparse optimization method of [3] in this context).…”
Section: Bounded-error Estimation (mentioning)
confidence: 99%
“…In this case, the problems (39) or (40) are solved only once to estimate a single model from the maximal number of points that can be considered as inliers. We compare the proposed algorithms with the standard ℓ_1-minimization in a setting similar to the one in [4]: an increasing fraction r of 500 data points in dimension 4 are corrupted by outliers ζ_i drawn from a Gaussian distribution with mean 100 and standard deviation 1000: y_i = θ^T x_i + ξ_i + ζ_i. The results are reported in Fig.…”
Section: Bounded-error Estimation in the Presence of Outliers (mentioning)
confidence: 99%
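
The data-generating protocol described in this quote is concrete enough to sketch in code. The snippet below is a minimal Python reconstruction under assumptions the quote does not fix (the inlier noise ξ_i is taken as small Gaussian noise; the outlier fraction r, the random seed, and the LP-based LAD solver are our choices): it draws 500 points in dimension 4, corrupts a fraction r of the responses with ζ_i ~ N(100, 1000²), and compares an ℓ_1 (LAD) fit with ordinary least squares.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical reconstruction of the experiment described above.
rng = np.random.default_rng(0)
n, d, r = 500, 4, 0.3                      # 500 points, dimension 4, outlier fraction r (assumed)
theta_true = rng.standard_normal(d)

X = rng.standard_normal((n, d))
xi = 0.1 * rng.standard_normal(n)          # small inlier noise xi_i (level assumed)
zeta = np.zeros(n)
idx = rng.choice(n, size=int(r * n), replace=False)
zeta[idx] = rng.normal(100.0, 1000.0, size=idx.size)  # outliers: mean 100, std 1000
y = X @ theta_true + xi + zeta             # y_i = theta^T x_i + xi_i + zeta_i

# LAD / ell_1 fit:  min_theta  sum_i |y_i - x_i^T theta|,
# cast as a linear program with slacks t_i >= |y_i - x_i^T theta|.
c = np.concatenate([np.zeros(d), np.ones(n)])
A_ub = np.block([[X, -np.eye(n)],
                 [-X, -np.eye(n)]])
b_ub = np.concatenate([y, -y])
bounds = [(None, None)] * d + [(0, None)] * n
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
theta_l1 = res.x[:d]

theta_ls = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary least squares, for contrast
print("ell_1 error:", np.linalg.norm(theta_l1 - theta_true))
print("LS    error:", np.linalg.norm(theta_ls - theta_true))
```

With a moderate outlier fraction the ℓ_1 estimate typically stays close to the true θ, while the least-squares estimate is pulled far off by the gross errors, which is the qualitative behaviour the quoted experiment probes as r increases.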
“…In this context, relying on suboptimal solutions can lead to highly unsatisfactory results with many misclassifications of data points. Robust methods based on convex relaxations (Liu et al., 2013; Bako, 2014; Bako and Ohlsson, 2016) or iterative hard-thresholding (Bhatia et al., 2015) offer some guarantees but are only optimal under particular conditions on the data. Instead, in this paper, we aim at unconditional optimality and discuss the computational complexity of globally minimizing a saturated loss function for the robust estimation of linear models, be they regression models or subspaces.…”
Section: Introduction (mentioning)
confidence: 99%