2018
DOI: 10.1016/j.patrec.2018.08.004

On the exact minimization of saturated loss functions for robust regression and subspace estimation

Abstract: This paper deals with robust regression and subspace estimation and more precisely with the problem of minimizing a saturated loss function. In particular, we focus on computational complexity issues and show that an exact algorithm with polynomial time-complexity with respect to the number of data can be devised for robust regression and subspace estimation. This result is obtained by adopting a classification point of view and relating the problems to the search for a linear model that can approximate the ma…
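The saturated loss referred to in the abstract is typically a truncated squared loss, in which any residual beyond a user-chosen threshold contributes only a constant, so gross outliers cannot dominate the fit. The sketch below is a minimal illustration of that objective, assuming a threshold eps and hypothetical function names (saturated_loss, fit_saturated); the brute-force subset enumeration it uses is only a stand-in and is not the exact classification-based polynomial-time algorithm described in the paper.

import numpy as np
from itertools import combinations

def saturated_loss(w, X, y, eps):
    # Truncated squared loss: residuals larger than eps contribute only eps**2.
    r = y - X @ w
    return float(np.sum(np.minimum(r ** 2, eps ** 2)))

def fit_saturated(X, y, eps):
    # Illustrative brute-force search (not the paper's exact algorithm):
    # fit least squares on every subset of d points, refit each candidate
    # on its inliers, and keep the model with the smallest saturated loss.
    n, d = X.shape
    best_w, best_J = None, np.inf
    for idx in combinations(range(n), d):
        Xs, ys = X[list(idx)], y[list(idx)]
        w, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        inliers = np.abs(y - X @ w) <= eps
        if inliers.sum() >= d:
            w, *_ = np.linalg.lstsq(X[inliers], y[inliers], rcond=None)
        J = saturated_loss(w, X, y, eps)
        if J < best_J:
            best_w, best_J = w, J
    return best_w, best_J

# Toy usage: a line y = 2x + 1 with 20% of the targets replaced by outliers.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50)
X = np.column_stack([x, np.ones_like(x)])
y = 2 * x + 1 + 0.05 * rng.standard_normal(50)
y[:10] += 5.0  # gross outliers
w_hat, J_hat = fit_saturated(X, y, eps=0.3)
print(w_hat, J_hat)

In this toy run the shifted points saturate the loss, so the recovered coefficients should stay close to (2, 1), whereas an ordinary least-squares fit on the same data would be pulled toward the outliers.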

Cited by 4 publications (2 citation statements)
References 16 publications (25 reference statements)
“…The related studies and applications of truncated least squares regression, to name a few, can be found in Hinich and Talwar (1975), Yang and Ness (1995), Ikami, Yamasaki, and Aizawa (2018), Lauer (2018), and Liu and Jiang (2019).…”
Section: Motivating Scenarios
confidence: 99%
“…The related studies and applications of truncated least squares regression, to name a few, can be found in [29,59,33,36,39].…”
Section: Motivating Scenarios
confidence: 99%