2019
DOI: 10.1007/s11590-019-01432-x

Computing the resolvent of the sum of operators with application to best approximation problems

Abstract: We propose a flexible approach for computing the resolvent of the sum of weakly monotone operators in real Hilbert spaces. The approach relies on splitting methods for which strong convergence is guaranteed, and we also prove linear convergence under a Lipschitz continuity assumption. The approach is then applied to computing the proximity operator of the sum of weakly convex functions and, in particular, to finding the best approximation to the intersection of convex sets.
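To make the abstract's central task concrete, the Python sketch below approximates the proximity operator of a sum of two convex functions. This is not the authors' method: it applies the classical Douglas-Rachford splitting to the strongly convex reformulation min_x f(x) + g(x) + (1/2)||x - z||^2, and the particular choice of f (a box indicator), g (an l1 term), and every function name and parameter value are illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): approximate prox_{f+g}(z) by running
# Douglas-Rachford on the strongly convex problem  min_x f(x) + g(x) + 0.5*||x - z||^2.
# f = indicator of the box [-1, 1]^n, g = lam*||.||_1; both prox operators are known.
import numpy as np

def prox_box(v, lo=-1.0, hi=1.0):           # projection onto the box [lo, hi]^n
    return np.clip(v, lo, hi)

def prox_l1(v, t):                          # soft-thresholding: prox of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sum(z, gamma=1.0, lam=0.5, iters=500):
    """Approximate prox_{f+g}(z) with f = box indicator, g = lam*||.||_1."""
    # prox of f_tilde(x) = f(x) + 0.5*||x - z||^2 reduces to a prox of f at a shifted point;
    # for an indicator function the extra scaling of gamma is irrelevant.
    def prox_f_tilde(v):
        return prox_box((v + gamma * z) / (1.0 + gamma))
    y = np.zeros_like(z)
    for _ in range(iters):
        x = prox_f_tilde(y)
        v = prox_l1(2.0 * x - y, gamma * lam)
        y = y + v - x                       # Douglas-Rachford update
    return prox_f_tilde(y)

z = np.array([2.0, -0.3, 0.8, -3.0])
print(prox_sum(z))                          # ~ clip(soft_threshold(z, 0.5), -1, 1)
```

For this separable toy choice the output can be checked against the closed form clip(soft_threshold(z, 0.5), -1, 1), i.e. approximately [1.0, 0.0, 0.3, -1.0].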

Cited by 10 publications (10 citation statements). References 15 publications (22 reference statements).

“…Intuitively, in the case A and B are maximally monotone, one would expect the use of equal resolvent parameters γ = δ, and in other cases, γ and δ are no longer the same. This phenomenon was initially observed in [11,12]. Although the imbalance of monotonicity can be resolved by shifting the identity between the operators as in [11,Remark 4.15], our plan is to conduct the convergence analysis of the algorithm applied to the original operators.…”
Section: The Algorithm
Citation type: mentioning (confidence: 98%)
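For readers skimming this excerpt, a brief worked equation may help. The parameters γ and δ enter through the resolvents of the two operators, and the "shift of the identity" mentioned above redistributes monotonicity between A and B; the shift symbol σ below is illustrative notation, not taken from [11] or [12].

```latex
% Resolvents with possibly different parameters:
\[
  J_{\gamma A} = (\mathrm{Id} + \gamma A)^{-1}, \qquad
  J_{\delta B} = (\mathrm{Id} + \delta B)^{-1}.
\]
% "Shifting the identity between the operators" (illustrative shift \sigma):
\[
  A + B = (A + \sigma\,\mathrm{Id}) + (B - \sigma\,\mathrm{Id}),
\]
% which transfers monotonicity from one operator to the other and is one reason
% the natural resolvent parameters \gamma and \delta need not coincide.
```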
“…Therefore, every weak sequential cluster point of (x_k)_{k∈ℕ} is contained in Ω_γ, and Proposition 2 implies that (x_k)_{k∈ℕ} is weakly convergent to a point x̄ ∈ Ω_γ. Then (15) shows that ū = J_{γA}(x̄) and z̄ = γT(ū) are the unique cluster points of (u_k)_{k∈ℕ} and (z_k)_{k∈ℕ}, respectively, and hence u_k ⇀ ū, v_k ⇀ ū and z_k ⇀ z̄. Moreover, since x was arbitrarily chosen in Ω_γ, (11) and (12) also hold with u replaced by ū and x replaced by x̄.…”
Section: Davis-Yin Splitting Algorithm
Citation type: mentioning (confidence: 99%)
“…The latter inclusion is equivalent to ū = J_{γA}(x̄), z̄ = γT(ū) and ū = J_{γB}(2ū − x̄ − z̄) (15), which implies x̄ ∈ Ω_γ. Therefore, every weak sequential cluster point of (x_k)_{k∈ℕ} is contained in Ω_γ, and Proposition 2 implies that (x_k)_{k∈ℕ} is weakly convergent to a point x̄ ∈ Ω_γ.…”
Section: Davis-Yin Splitting Algorithm
Citation type: mentioning (confidence: 99%)
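Both excerpts above refer to one iteration of a three-operator splitting scheme built from u_k = J_{γA}(x_k), z_k = γT(u_k) and v_k = J_{γB}(2u_k − x_k − z_k). The sketch below implements the standard Davis-Yin iteration on a small toy problem; the governing update x_{k+1} = x_k + λ(v_k − u_k), the concrete operators, and the step sizes are assumptions made for illustration and are not taken from the cited paper.

```python
# Minimal sketch of a Davis-Yin type three-operator splitting iteration, written to
# match the variables u_k, z_k, v_k, x_k used in the excerpts above.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

# A = subdifferential of the box indicator, B = subdifferential of 0.1*||.||_1,
# T = gradient of 0.5*||Mx - b||^2 (Lipschitz, hence cocoercive).
def J_gamma_A(x, gamma):                   # resolvent of A: projection onto [-1, 1]^n
    return np.clip(x, -1.0, 1.0)

def J_gamma_B(x, gamma, lam_l1=0.1):       # resolvent of B: soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - gamma * lam_l1, 0.0)

def T(x):                                  # single-valued, cocoercive operator
    return M.T @ (M @ x - b)

L = np.linalg.norm(M, 2) ** 2              # Lipschitz constant of T
gamma, lam = 1.0 / L, 1.0                  # step sizes in the usual convergence regime

x = np.zeros(5)
for k in range(2000):
    u = J_gamma_A(x, gamma)                # u_k = J_{gamma A}(x_k)
    z = gamma * T(u)                       # z_k = gamma * T(u_k)
    v = J_gamma_B(2.0 * u - x - z, gamma)  # v_k = J_{gamma B}(2u_k - x_k - z_k)
    x = x + lam * (v - u)                  # assumed governing update x_{k+1}

print(J_gamma_A(x, gamma))                 # approximates u_bar = J_{gamma A}(x_bar)
```

With this choice of T, taking γ < 2/L (here γ = 1/L) keeps the iteration within the step-size range usually required for convergence of Davis-Yin splitting.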