2017
DOI: 10.1007/s10957-017-1203-3
Conditional Gradient Method for Double-Convex Fractional Programming Matrix Problems

Cited by 4 publications (8 citation statements)
References 21 publications
“…For this problem, we first provide local necessary conditions and then obtain a verifiable sufficient condition for identifying global minimizer among the local ones. In this regard, our problem generalizes the cases considered in [20,23] and [2] in a considerable way.…”
Section: Introduction
confidence: 85%
“…In [23] a quadratic fractional optimization problem has been studied with two quadratic convex constraints using the classical Dinkelbach approach with no global convergence guarantee. In [2], using the conditional gradient method the ratio of two convex functions over a closed and convex set has been studied numerically. In this article we consider the problem MP when f j : j = 0, 1, 2, .…”
Section: Introduction
confidence: 99%
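The conditional gradient (Frank–Wolfe) scheme named in the statement above can be sketched as follows. This is a minimal illustration, not the method of the cited paper: the quadratic objective, the box constraint set, and the classical open-loop step size are all illustrative assumptions.

```python
# Minimal conditional gradient (Frank-Wolfe) sketch: minimize the smooth
# convex function f(x) = 0.5*||x - c||^2 over the box [-1, 1]^n.
# Objective, constraint set, and step rule are illustrative choices only.
import numpy as np

def conditional_gradient(grad_f, x0, lmo, iters=200):
    """At each step, minimize the linearization of f over the feasible
    set via the linear minimization oracle `lmo`, then move toward that
    vertex with the classical open-loop step gamma_k = 2/(k+2)."""
    x = x0.copy()
    for k in range(iters):
        g = grad_f(x)
        s = lmo(g)                  # argmin over the set of <g, s>
        gamma = 2.0 / (k + 2.0)
        x = (1.0 - gamma) * x + gamma * s
    return x

c = np.array([0.5, -2.0, 0.3])
grad_f = lambda x: x - c            # gradient of 0.5*||x - c||^2
box_lmo = lambda g: -np.sign(g)     # vertex of [-1,1]^n minimizing <g, .>
x_star = conditional_gradient(grad_f, np.zeros(3), box_lmo)
print(np.round(x_star, 2))          # approaches clip(c, -1, 1)
```

Each iterate is a convex combination of feasible points, so the method is projection-free; only the linear oracle touches the constraint set, which is why it suits sets where projection is expensive.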
“…In this subsection, we present a noteworthy extension of the gradient descent method tailored for addressing the general tensorial convex minimization problem (1.1). The literature has seen diverse adaptations of the gradient descent technique to tackle various minimization problems, including nonlinear minimization problems [16], fractional optimization problems [4], among others. The proximal gradient method serves as a generalized version of the gradient descent method, particularly adept at handling non-differentiability in the cost function; see [2, 3, 20, 32].…”
Section: Preliminaries and Notation
confidence: 99%
“…where Ω is a convex nonempty bounded set. As F and G are proper lower semicontinuous convex functions, F is Gâteaux differentiable and uniformly convex on X and if it is further assumed that Ω is closed, then there exists a unique solution of the minimization problem (5), see [2,41,13,21,15] for a deeper discussion.…”
Section: Problem
confidence: 99%
“…fractional optimization problems [5] and others. The proximal gradient method represents a generalized form of the gradient descent method in the presence of non differentiability in the cost function [1,2,40,41].…”
Section: Tensorial Double Proximal Gradient Algorithm
confidence: 99%
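The proximal gradient method mentioned in the statements above can be sketched with the standard ISTA iteration; this is a generic illustration under assumed problem data, not the tensorial algorithm of the citing paper. The composite objective 0.5·||Ax − b||² + λ·||x||₁ and its soft-threshold proximal operator are illustrative choices.

```python
# Minimal proximal gradient (ISTA) sketch for the composite problem
#   min_x 0.5*||A x - b||^2 + lam*||x||_1,
# where the smooth part is handled by a gradient step and the
# non-differentiable l1 term by its proximal (soft-threshold) operator.
# Problem data below are illustrative, not from the cited works.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: shrink each entry toward 0 by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)         # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -0.5, 0.0, 0.0])
b = A @ x_true                           # noiseless observations
x_hat = proximal_gradient(A, b, lam=0.1)
print(np.round(x_hat, 2))
```

With λ = 0 the proximal step is the identity and the iteration reduces to plain gradient descent, which is the sense in which the quotes call it a generalization of that method.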