2022
DOI: 10.1007/s10589-022-00378-8

An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function

Abstract: In this work we aim to solve a convex-concave saddle point problem, where the convex-concave coupling function is smooth in one variable and nonsmooth in the other and not assumed to be linear in either. The problem is augmented by a nonsmooth regulariser in the smooth component. We propose and investigate a novel algorithm under the name of OGAProx, consisting of an optimistic gradient ascent step in the smooth variable coupled with a proximal step of the regulariser, and which is alternated with a proximal s…
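To make the truncated description concrete, here is a sketch of the OGAProx scheme as suggested by the abstract, for a saddle point problem of the form min_x max_y Φ(x, y) − g(y) with Φ convex in x, concave and smooth in y, and g a nonsmooth convex regulariser; the step sizes σ_k, τ_k and the extrapolation ("optimism") parameter θ_k are labels of mine and should be checked against the authors' notation:

\[
y_{k+1} = \mathrm{prox}_{\sigma_k g}\Big( y_k + \sigma_k \big[ (1+\theta_k)\,\nabla_y \Phi(x_k, y_k) - \theta_k\,\nabla_y \Phi(x_{k-1}, y_{k-1}) \big] \Big),
\]
\[
x_{k+1} = \operatorname*{arg\,min}_{x} \Big\{ \Phi(x, y_{k+1}) + \frac{1}{2\tau_k}\,\|x - x_k\|^2 \Big\}.
\]

The optimistic ascent direction reuses the previous gradient ∇_y Φ(x_{k−1}, y_{k−1}) as an extrapolation, so only one new gradient evaluation is needed per iteration; the x-update is a proximal step of the coupling function itself, which is what accommodates nonsmoothness in x.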

Cited by 6 publications (3 citation statements)
References 9 publications
“…However, the convergence of GPDPS depends on both Lipschitz gradients and bounded gradient of K. Recently, Hamedani [9] proposed a primal-dual algorithm for problem (1.1) and achieved an ergodic O(1/k) convergence rate in function value. To deal with the nonsmooth term in the coupling function K(x, y), Boţ et al. [3] designed an optimistic gradient ascent-proximal point algorithm and obtained a convergence rate of order O(1/K) for the convex-concave saddle point problem. Distinct from the above research, in this paper we build a semi-proximal alternating coordinate method whose convergence of the iterates (x_k, y_k) depends only on Lipschitz gradients of K. Moreover, we establish a linear convergence rate of (x_k, y_k) under local metric subregularity, which, as far as we know, is not provided in other work for the general problem (1.1).…”
Section: Problem Setting
confidence: 99%
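Problem (1.1) of the citing work is not reproduced in this excerpt; judging from the surrounding discussion it presumably has the standard composite saddle point form

\[
\min_{x}\max_{y}\; f(x) + K(x, y) - g(y),
\]

with f, g proper convex (possibly nonsmooth) regularisers and K the convex-concave coupling; under this reading, the O(1/k) rates quoted above are ergodic, i.e. measured along averaged iterates.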
“…This choice is usually inspired by a primal-dual problem, a min-max problem where the objective function is the Lagrangian of a convex optimization problem of the form min_{x ∈ X, Ax = b} φ(x), where y represents the dual variable. In general, saddle-point problems such as (3.14) (also called min-max problems) have a large presence in the literature, with several variations on the convexity or smoothness of the functions φ, ψ (see [38, 8]). These problems also pop up in various applications, like image processing [8] and resource allocation [20].…”
Section: Theorem 3.6 (Bilinear Saddle-Point Representation) The MMK Pr...
confidence: 99%
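The primal-dual construction mentioned in this excerpt is the standard Lagrangian one: for the linearly constrained problem min_{x ∈ X, Ax = b} φ(x), the Lagrangian

\[
\mathcal{L}(x, y) = \varphi(x) + \langle y,\, Ax - b \rangle
\]

turns the constrained minimization into the saddle point problem min_{x ∈ X} max_y L(x, y), whose coupling term ⟨y, Ax⟩ is bilinear; its saddle points pair primal solutions x* with dual multipliers y*.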
“…In general, saddle-point problems such as (3.14) (also called min-max problems) have a large presence in the literature, with several variations on the convexity or smoothness of the functions φ, ψ (see [38, 8]). These problems also pop up in various applications, like image processing [8] and resource allocation [20].…”
Section: Theorem 3.6 (Bilinear Saddle-Point Representation) The MMK Pr...
confidence: 99%