2011
DOI: 10.1198/tech.2010.10111
Nearly-Isotonic Regression

Abstract: We consider the problem of approximating a sequence of data points with a "nearly-isotonic", or nearly-monotone, function. This is formulated as a convex optimization problem that yields a family of solutions, with one extreme member being the standard isotonic regression fit. We devise a simple algorithm to solve for the path of solutions, which can be viewed as a modified version of the well-known pool adjacent violators algorithm, and computes the entire path in O(n log n) operations (n being the number of …
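The standard isotonic fit mentioned in the abstract (the extreme member of the solution family) can be computed with the classical pool adjacent violators algorithm. The sketch below is a minimal pure-Python version of that classical algorithm, not the paper's modified path algorithm; the function name `pava` is ours.

```python
def pava(y):
    """Least-squares fit to y under the constraint beta_1 <= ... <= beta_n,
    via pool adjacent violators (PAV)."""
    # Each block stores [mean, weight]; adjacent violating blocks are pooled.
    blocks = []
    for v in y:
        blocks.append([float(v), 1])
        # Merge backwards while monotonicity is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            w = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / w, w])
    # Expand pooled blocks back to a length-n fitted sequence.
    out = []
    for m, w in blocks:
        out.extend([m] * w)
    return out
```

For example, `pava([1, 3, 2, 4])` pools the violating pair (3, 2) into their average, yielding the monotone fit `[1.0, 2.5, 2.5, 4.0]`.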

Cited by 101 publications (110 citation statements)
References 18 publications
“…where λ is the tuning parameter on the sum of absolute values for the coefficients |β j | (Tibshirani, 1996). Larger values for λ provide more regularization, whereas λ = 0 results in a nonpenalized model.…”
Section: Gaussian Graphical Model
confidence: 99%
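The role of λ in this excerpt can be made concrete with the soft-thresholding operator: for a single standardized coefficient, the lasso estimate shrinks the least-squares value z toward zero by λ, so λ = 0 leaves it unpenalized and large λ sets it exactly to zero. A minimal sketch (the function name `soft_threshold` is ours):

```python
def soft_threshold(z, lam):
    """Lasso solution for one standardized coefficient:
    sign(z) * max(|z| - lam, 0)."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0  # coefficient shrunk exactly to zero
```

With `lam = 0` the least-squares value is returned unchanged; once `lam` exceeds `|z|` the coefficient is zeroed out, which is the source of the lasso's sparsity.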
“…By recognizing the constraints as block-simplex constraints, we apply a standard equality constraint elimination technique (Boyd and Vandenberghe, 2004, Section 4.2.4) with a particular change of variables to convert the non-negativity constraints on the variables into ordering constraints. In the new space induced by the change of variables, we show that the projection on the feasible set (characterized by the ordering constraints) can be performed in linear time via bounded isotonic regression (see (Tibshirani et al, 2011) for a short survey on isotonic regression), where n is the number of routes per OD pair. This is an improvement over the O(n log n) time required by the projection onto the simplex (Wang and Carreira-Perpiñán, 2013).…”
Section: Contributions Of This Article
confidence: 99%
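For contrast with the O(n log n) simplex projection this excerpt cites, here is a minimal sketch of the sort-based Euclidean projection onto the probability simplex in the spirit of Wang and Carreira-Perpiñán (2013); the function name is ours, and this is not the article's block-elimination method.

```python
def project_simplex(v):
    """Euclidean projection of v onto {x : x_i >= 0, sum(x) = 1},
    via the sort-based O(n log n) method."""
    u = sorted(v, reverse=True)
    css = 0.0
    theta = 0.0
    # Find the largest j with u_j - (cumsum_j - 1)/j > 0; that ratio is theta.
    for j, uj in enumerate(u, start=1):
        css += uj
        t = (css - 1.0) / j
        if uj - t > 0:
            theta = t
    # Shift by theta and clip at zero.
    return [max(x - theta, 0.0) for x in v]
```

For example, `project_simplex([2.0, 0.0])` returns `[1.0, 0.0]`: the result lies on the simplex and is the closest such point to the input.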
“…Consider the special case of the problem (8) such that the expert estimations of the objects y 0 are linearly scaled and the expert estimations of the criteria weights are ordinal-scaled. In this case, the problem can be formulated in terms of the well-known isotonic regression problem [15,16]. Let w = X + y 0 .…”
Section: The Algorithm Of Minimizing Distance Between Vectors In Cones
confidence: 99%