2006
DOI: 10.1002/nav.20139

A mathematical programming approach for improving the robustness of least sum of absolute deviations regression

Abstract: This paper discusses a novel application of mathematical programming techniques to a regression problem. While least squares regression techniques have been used for a long time, it is known that their robustness properties are not desirable. Specifically, the estimators are known to be too sensitive to data contamination. In this paper we examine regressions based on least sum of absolute deviations (LAD) and show that the robustness of the estimator can be improved significantly through a judicious choice of…
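Although the report page carries no code, the LAD fit the abstract refers to can be computed as a linear program. The following is a minimal sketch, assuming the standard reformulation that splits each residual into nonnegative parts u_i - v_i; the function name lad_fit and the use of scipy.optimize.linprog are illustrative choices, not part of the paper.

import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    # Minimize sum_i |y_i - x_i' beta| via the LP:
    #   min 1'u + 1'v  s.t.  X beta + u - v = y,  u >= 0, v >= 0.
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])  # beta is free; u and v each cost 1
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Toy check: one gross vertical outlier barely moves the LAD fit.
rng = np.random.default_rng(0)
x = rng.normal(size=30)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=30)
y[0] += 50.0
print(lad_fit(np.column_stack([np.ones(30), x]), y))  # close to [2, 3]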

Cited by 10 publications (11 citation statements), published 2006–2018
References 15 publications
“…Thus, the ℓ1 estimator mitigates the effect of the largest outliers. They can also be generalized to weighted ℓ1 regression (Giloni et al. 2006a, b) by adding weights to the absolute values in the definition of the leverage constants. The use of the leverage constants to improve the RBP of weighted ℓ1 regression is the subject of current research.…”
Section: Characterization of the Behaviour of the ℓ1-Estimator
Mentioning (confidence: 99%)
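The weighted generalization this snippet mentions changes only the cost vector in the LP sketch given earlier: each pair (u_i, v_i) is charged w_i instead of 1. A minimal sketch, again with illustrative names:

import numpy as np
from scipy.optimize import linprog

def wlad_fit(X, y, w):
    # Minimize sum_i w_i |y_i - x_i' beta| for nonnegative weights w_i.
    n, p = X.shape
    w = np.asarray(w, dtype=float)
    c = np.concatenate([np.zeros(p), w, w])  # weights replace the unit costs on u and v
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

Setting all w_i = 1 recovers ordinary LAD; the cited work chooses the weights to raise the breakdown point, a choice this sketch does not attempt.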
“…Mizera and Müller (2001) and Giloni and Padberg (2004) showed that the finite sample breakdown point of LAD regression can be greater than 1/n, depending on the predictor values, with the former authors discussing how X can be chosen to increase the breakdown point of LAD. Giloni and Padberg (2004) showed that the breakdown can be calculated using a mixed integer program, and described an algorithm for solving it; Giloni, Sengupta, and Simonoff (2004) proposed an alternative algorithm similar to that of Mizera and Müller (2001) that is very efficient for large samples when p is small. Ellis and Morgenthaler (1992) appear to be the first to mention that the introduction of weights can improve the finite sample breakdown point of LAD regression (by downweighting observations that are far from the bulk of the data), but they only show this for very small data sets.…”
Section: LAD Regression and Breakdown
Mentioning (confidence: 99%)
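For context, the finite sample breakdown point in this snippet is the usual replacement breakdown point; one standard formulation (the cited papers differ in which parts of the data may be contaminated) is

\varepsilon^{*}(\hat{\beta}; Z) = \min \left\{ \frac{m}{n} : \sup_{Z_m} \lVert \hat{\beta}(Z_m) \rVert = \infty \right\},

where Z = (X, y) is the sample of size n and Z_m ranges over all samples obtained from Z by replacing m observations. A value of 1/n thus means a single suitably placed observation can drive the estimate arbitrarily far.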
“…Ellis and Morgenthaler (1992) appear to be the first to mention that the introduction of weights can improve the finite sample breakdown point of LAD regression (by downweighting observations that are far from the bulk of the data), but they only show this for very small data sets. Giloni, Sengupta, and Simonoff (2004) examined this question in more detail. The weighted LAD (WLAD) regression estimator is defined to be the minimizer of…”
Section: LAD Regression and Breakdown
Mentioning (confidence: 99%)
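The definition truncated in the snippet above is the standard one: given nonnegative weights w_1, …, w_n, the WLAD estimator is

\hat{\beta}_{\mathrm{WLAD}} = \arg\min_{\beta} \sum_{i=1}^{n} w_i \, \lvert y_i - x_i^{\top} \beta \rvert,

which reduces to ordinary LAD when every w_i = 1. Downweighting observations far from the bulk of the data, as the snippet describes, is what raises the finite sample breakdown point.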
“…Reference [4] mainly studied the errors between LAD and the least squares method. Reference [5] improved the robustness of LAD regression through a judicious choice of weights. It also provided an efficient algorithm to solve the general nonlinear, mixed integer programming problem when the number of predictors is small.…”
Section: Introduction
Mentioning (confidence: 99%)