2015
DOI: 10.1007/s10898-015-0304-5
An extension of the αBB-type underestimation to linear parametric Hessian matrices

Abstract: The classical αBB method is a global optimization method whose key step is to determine a convex underestimator of the objective function on an interval domain. Its distinguishing feature is that it encloses the range of the Hessian matrix in an interval matrix. To obtain a tighter enclosure of the Hessian matrices, we investigate a linear parametric form enclosure in this paper. One way to obtain this form is by using a slope extension of the Hessian entries. Numerical examples indicate that our approach can som…
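The underestimation step the abstract describes can be sketched in the univariate case. This is a minimal illustration, not the paper's method: the lower bound `d2f_lo` on the second derivative is supplied by hand here, whereas the paper's contribution is tightening exactly this kind of Hessian bound via a linear parametric (slope-based) enclosure instead of a plain interval one.

```python
import numpy as np

def alpha_bb_underestimator(f, d2f_lo, xl, xu):
    """Classical univariate alpha-BB underestimator of f on [xl, xu].

    d2f_lo: a valid lower bound on f'' over [xl, xu] (in the alpha-BB
    literature this bound is derived from an interval enclosure of the
    Hessian; the paper tightens it with a linear parametric enclosure).
    """
    # alpha must offset the worst negative curvature: alpha >= -d2f_lo / 2.
    alpha = max(0.0, -0.5 * d2f_lo)
    # The quadratic term is <= 0 on [xl, xu] and vanishes at the endpoints,
    # so phi underestimates f and coincides with it at the interval bounds.
    return lambda x: f(x) + alpha * (x - xl) * (x - xu)

# Illustration with f(x) = sin(x) on [0, 2*pi]: f''(x) = -sin(x) >= -1.
f = np.sin
phi = alpha_bb_underestimator(f, d2f_lo=-1.0, xl=0.0, xu=2 * np.pi)

xs = np.linspace(0.0, 2 * np.pi, 1001)
# phi is convex (phi'' = -sin(x) + 1 >= 0) and underestimates f everywhere.
assert np.all(phi(xs) <= f(xs) + 1e-12)
```

A tighter bound `d2f_lo` (closer to the true minimum of f'' on the box) yields a smaller alpha and hence a less conservative convex relaxation, which is precisely why tighter Hessian enclosures matter for αBB.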

Cited by 6 publications (2 citation statements)
References 35 publications (48 reference statements)
“…Up to now, many global approximation algorithms have been developed, such as convex relaxation-based methods; see [5,6,7,8,9,10,11]. In particular, the αBB method, a global optimization method based on the idea of convex relaxation, plays a substantial role in the design of efficient and computationally tractable numerical algorithms for non-convex optimization problems; see, e.g., [12,13,14,15]. It is worth remarking that these algorithms generally aim at finding a single globally optimal solution, whereas many application problems may have many or even infinitely many globally optimal solutions; see, e.g., [16,17].…”
Section: Introductionmentioning
confidence: 99%
“…So far, the majority of global approximation algorithms have been designed; see [9,10,14,20,11,16,18,19]. In particular, the αBB method has become an increasingly important global optimization method in the design of efficient and computationally tractable numerical algorithms for non-convex optimization problems; see [3,12,6,8]. It is worth noticing that these algorithms generally aim at determining a single globally optimal solution, whereas the majority of application problems may have many or even infinitely many globally optimal solutions.…”
Section: Introductionmentioning
confidence: 99%