Spectral coarsening of geometric operators
2019
DOI: 10.1145/3306346.3322953

Abstract: …for sharing implementations and results; Zih-Yin Chen for early discussions.

Cited by 19 publications (45 citation statements). References 69 publications (74 reference statements).
“…The goal of spectral coarsening is to reduce the size of a discrete operator, derived from a 3D shape, while preserving its spectral properties. Liu et al. [2019] show that it is possible to have a significant reduction without affecting the low-frequency eigenvectors and eigenvalues. They visualize the preservation of spectral properties with the inner product matrix between eigenvectors (see Fig.…”
Section: Methods (mentioning)
confidence: 93%
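As a rough illustration of that visualization (a sketch under assumptions, not the authors' code): given the lowest eigenpairs of a fine operator pair (L_f, M_f) and of a coarsened pair (L_c, M_c), plus a hypothetical restriction matrix P from the fine domain to the coarse one, the matrix of M_c-inner products between coarse eigenvectors and restricted fine eigenvectors should be close to diagonal (up to sign) when the low-frequency spectrum is preserved. All names below are placeholders.

```python
# Minimal sketch (assumed names, not the paper's implementation):
#   L_f, M_f  -- fine stiffness and mass matrices (sparse, symmetric)
#   L_c, M_c  -- coarsened stiffness and mass matrices
#   P         -- hypothetical restriction matrix mapping fine functions to the coarse domain
import numpy as np
import scipy.sparse.linalg as spla

def low_frequency_eigs(L, M, k=20):
    """k smallest eigenpairs of the generalized problem L x = lambda M x."""
    # A small negative shift keeps the shift-invert factorization nonsingular
    # even when L has a zero eigenvalue (e.g. a Laplacian).
    vals, vecs = spla.eigsh(L, k=k, M=M, sigma=-1e-8, which="LM")
    order = np.argsort(vals)
    return vals[order], vecs[:, order]

def eigenvector_inner_products(L_f, M_f, L_c, M_c, P, k=20):
    """C[i, j] = <phi_c_i, P phi_f_j>_{M_c}.  A near-diagonal |C| (up to sign)
    indicates the coarse operator preserves the low-frequency eigenvectors."""
    _, Phi_f = low_frequency_eigs(L_f, M_f, k)
    _, Phi_c = low_frequency_eigs(L_c, M_c, k)
    return Phi_c.T @ (M_c @ (P @ Phi_f))

# Usage, once the operators are available:
#   C = eigenvector_inner_products(L_f, M_f, L_c, M_c, P, k=30)
#   import matplotlib.pyplot as plt
#   plt.imshow(np.abs(C)); plt.colorbar(); plt.show()
```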
“…However, their method only supports matrices that already have a chordal sparsity pattern, which is not applicable to our problem because most discrete operators are not chordal. In contrast, we utilize the ideas from [Sun and Vandenberghe 2015] to handle any sparsity pattern of choice, and the strategies in [Zheng et al. 2017b, 2020] to develop a chordal ADMM solver for the spectral coarsening energy [Liu et al. 2019]. We exploit the fact that many discrete operators are sparse and symmetric to perform a change of variables that significantly reduces the computational cost.…”
Section: Chordal Graphs in Sparse Matrix Optimization (mentioning)
confidence: 99%
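To make the chordality remark concrete, here is a small hedged sketch (not the paper's solver): it builds the off-diagonal sparsity graph of a symmetric operator with NetworkX, tests whether it is chordal, and otherwise computes a chordal extension and its maximal cliques, the structure a chordal decomposition would operate on.

```python
# Hedged sketch (illustrative only): chordality of a sparsity pattern.
import networkx as nx
import scipy.sparse as sp

def sparsity_graph(A):
    """Undirected graph of the off-diagonal nonzero pattern of a symmetric sparse matrix."""
    A = sp.coo_matrix(A)
    G = nx.Graph()
    G.add_nodes_from(range(A.shape[0]))
    G.add_edges_from((int(i), int(j)) for i, j in zip(A.row, A.col) if i < j)
    return G

def chordal_cliques(A):
    """Maximal cliques of a chordal extension of A's sparsity pattern."""
    G = sparsity_graph(A)
    if not nx.is_chordal(G):                    # typical discrete operators are not chordal
        G, _ = nx.complete_to_chordal_graph(G)  # add fill-in edges
    return [sorted(c) for c in nx.chordal_graph_cliques(G)]

# Example: the pattern of a Laplacian on a closed n-cycle is not chordal for n >= 4.
n = 6
L = sp.diags([2.0] * n) - sp.diags([1.0] * (n - 1), 1) - sp.diags([1.0] * (n - 1), -1)
L = L.tolil()
L[0, n - 1] = L[n - 1, 0] = -1.0   # close the ring
print(chordal_cliques(L.tocsr()))
```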
“…Schemes for the approximate solution of eigenproblems include static condensation [Bathe 2014] in engineering, and the Nyström method [Williams and Seeger 2001] and random projections [Halko et al. 2011] in machine learning. Approximation schemes for the Laplace-Beltrami eigenproblem on surfaces have been introduced in Chuang et al. [2009], Lescoat et al. [2020], Liu et al. [2019], and Nasikun et al. [2018]. In contrast to the eigensolvers we consider in this work, these schemes do not provide any guarantee on the approximation quality of the eigenpairs.…”
Section: Related Work (mentioning)
confidence: 99%
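As background on one of the methods named above, the following is a minimal NumPy sketch of the classic Nyström approximation of Williams and Seeger [2001] for the leading eigenpairs of a symmetric positive semidefinite matrix. It illustrates the general technique only, not the approximation schemes evaluated in the cited work; the sampling and scaling are the textbook choices.

```python
# Hedged sketch of the Nystrom approximation for the top eigenpairs of a PSD matrix.
import numpy as np

def nystrom_eigs(K, m, k, rng=None):
    """Approximate the k largest eigenpairs of the n x n PSD matrix K
    from m uniformly sampled landmark columns (k <= m <= n)."""
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = K[:, idx]                      # n x m sampled columns
    W = C[idx, :]                      # m x m landmark block
    lam_w, U_w = np.linalg.eigh(W)     # ascending order
    lam_w, U_w = lam_w[::-1][:k], U_w[:, ::-1][:, :k]
    lam = (n / m) * lam_w                    # scaled eigenvalue estimates
    U = np.sqrt(m / n) * C @ U_w / lam_w     # extend landmark eigenvectors to all points
    return lam, U

# Compare against a dense eigensolver on a low-rank PSD test matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))
K = X @ X.T                                       # rank-20 PSD Gram matrix
lam, U = nystrom_eigs(K, m=100, k=5, rng=0)
print(lam)                                        # Nystrom estimates
print(np.sort(np.linalg.eigvalsh(K))[::-1][:5])   # reference values
```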