2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr42600.2020.00772

Learning to Optimize on SPD Manifolds

Cited by 15 publications (36 citation statements)
References 27 publications
“…When this is not possible, a budding area of research investigates more directly including the manifold structure into the amortization process. Gao et al (2020) amortize optimization problems over SPD spaces.…”
Section: Euclidean → Non-Euclidean Optimization
confidence: 99%
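To make the quoted idea of optimizing over SPD spaces concrete, here is a minimal sketch of a single Riemannian gradient-descent step on the SPD manifold. This is not the authors' method: it assumes the standard affine-invariant metric, under which the exponential map at X applied to the scaled Riemannian gradient keeps every iterate symmetric positive definite by construction.

```python
import jax.numpy as jnp

def sym_fun(S, f):
    # apply a scalar function to a symmetric matrix via eigendecomposition
    w, V = jnp.linalg.eigh(S)
    return (V * f(w)) @ V.T

def spd_step(X, euclid_grad, lr):
    # one Riemannian gradient-descent step on the SPD manifold under the
    # affine-invariant metric: move along the exponential map of the
    # negative Riemannian gradient, X^{1/2} expm(-lr X^{1/2} G X^{1/2}) X^{1/2},
    # which stays symmetric positive definite by construction
    G = 0.5 * (euclid_grad + euclid_grad.T)   # symmetrize the Euclidean gradient
    Xh = sym_fun(X, jnp.sqrt)                 # X^{1/2}
    return Xh @ sym_fun(-lr * (Xh @ G @ Xh), jnp.exp) @ Xh

# toy problem: pull X toward an SPD target A under f(X) = ||X - A||_F^2
A = jnp.diag(jnp.array([2.0, 0.5]))
X = jnp.eye(2)
for _ in range(300):
    X = spd_step(X, 2.0 * (X - A), 0.05)
```

The eigendecomposition route (`sym_fun`) is just a convenient way to compute matrix square roots and exponentials of symmetric matrices; a library such as geoopt or pymanopt would package this as a manifold object.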
“…Figure 1: We measure the time and memory consumption of the Riemannian meta-optimization method (Gao et al 2020) and our method. The method of Gao et al (2020) incurs heavy computational and memory burdens as the number of inner-loop steps increases. In contrast, our method significantly reduces computational cost and memory footprint.…”
Section: Legend
confidence: 99%
“…In contrast, our method significantly reduces computational cost and memory footprint. This is because the method of Gao et al (2020) differentiates through the whole inner-loop optimization to compute the meta-gradient, which involves a series of time-consuming derivatives and stores a large computation graph, whereas our meta-gradient computation involves only the final two iterations.…”
Section: Legend
confidence: 99%
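The contrast the citing authors draw, differentiating through the whole unrolled inner loop versus only the final iterations, can be sketched on a toy problem. This is an illustrative Euclidean stand-in, not either paper's algorithm: the inner objective, step counts, and the `keep` parameter are all assumptions, and `stop_gradient` plays the role of detaching early iterates from the computation graph.

```python
import jax
import jax.numpy as jnp

def inner_loss(x):
    # toy inner objective standing in for the task loss
    return jnp.sum((x - 3.0) ** 2)

def meta_loss(lr, steps=20, keep=None):
    # run `steps` gradient-descent iterations with step size `lr`;
    # if `keep` is set, detach all but the last `keep` iterates, so the
    # meta-gradient flows only through the final iterations and the
    # stored computation graph stays small
    x = jnp.zeros(2)
    for t in range(steps):
        x = x - lr * jax.grad(inner_loss)(x)
        if keep is not None and t < steps - keep:
            x = jax.lax.stop_gradient(x)
    return inner_loss(x)

# full unrolled meta-gradient vs. truncated (last two iterations only)
g_full = jax.grad(meta_loss)(0.1)
g_trunc = jax.grad(lambda lr: meta_loss(lr, keep=2))(0.1)
```

Both gradients point the same way here (a larger step size still lowers the final loss), but the truncated version never has to retain the first eighteen updates on the autodiff tape, which is the memory saving the citation statement describes.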