2018
DOI: 10.3389/fams.2018.00059

A Variational Model for Data Fitting on Manifolds by Minimizing the Acceleration of a Bézier Curve

Abstract: We derive a variational model to fit a composite Bézier curve to a set of data points on a Riemannian manifold. The resulting curve is obtained in such a way that its mean squared acceleration is minimal in addition to remaining close to the data points. We approximate the acceleration by discretizing the squared second-order derivative along the curve. We derive a closed-form, numerically stable and efficient algorithm to compute the gradient of a Bézier curve on manifolds with respect to its control points, exp…
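As a Euclidean illustration of the discretization described in the abstract (not the paper's manifold algorithm — on a manifold, straight-line interpolation would be replaced by geodesics), the following sketch evaluates a Bézier curve by De Casteljau's algorithm and approximates its mean squared acceleration with second-order finite differences:

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at t by repeated linear interpolation.
    On a manifold, each interpolation step would follow a geodesic."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def mean_squared_acceleration(control_points, n=100):
    """Approximate the mean squared acceleration by central second-order
    finite differences along the discretized curve."""
    ts = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    samples = np.array([de_casteljau(control_points, t) for t in ts])
    # central second difference: (x_{i-1} - 2 x_i + x_{i+1}) / h^2
    second_diff = (samples[:-2] - 2 * samples[1:-1] + samples[2:]) / h**2
    return np.mean(np.sum(second_diff**2, axis=1))

# A Bezier curve with collinear, equally spaced control points is a
# straight line, so its acceleration vanishes.
line = [[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]]
print(mean_squared_acceleration(line))  # ≈ 0
```

In the Euclidean quadratic case the central difference is exact, so a curved example such as control points (0,0), (1,0), (1,1) yields the constant squared acceleration 8.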


Cited by 17 publications (9 citation statements). References 35 publications.
“…Another example is fixed-rank matrices, which appear in matrix completion (Vandereycken, 2013). Working with these data, for example for data interpolation and approximation (Bergmann & Gousenbourger, 2018), denoising (Lellmann et al., 2013), inpainting (Bergmann, Chan, et al., 2016), or matrix completion (Gao & Absil, 2021), can usually be phrased as an optimization problem: minimize f(x) subject to x ∈ M, where M is a Riemannian manifold.…”
Section: Statement of Need
confidence: 99%
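The optimization problem quoted above, minimize f(x) for x ∈ M, can be illustrated with a minimal Riemannian gradient descent on the unit sphere. This is a generic sketch, not code from the cited packages; for simplicity it uses normalization as a retraction in place of the exponential map:

```python
import numpy as np

def riemannian_gradient_descent(egrad, p, step=0.1, iters=200):
    """Minimize f on the unit sphere: project the Euclidean gradient onto
    the tangent space at p, take a step, and retract back by normalization."""
    for _ in range(iters):
        g = egrad(p)
        g_tan = g - np.dot(g, p) * p   # Riemannian gradient on the sphere
        p = p - step * g_tan           # step in the tangent space
        p = p / np.linalg.norm(p)      # retraction back onto the sphere
    return p

# Example: f(x) = <x, a> restricted to the sphere is minimized at -a/||a||.
a = np.array([1.0, 2.0, 2.0])
x_star = riemannian_gradient_descent(lambda x: a, np.array([1.0, 0.0, 0.0]))
print(x_star)  # close to -a/3 = (-1/3, -2/3, -2/3)
```

The tangent-space projection is what distinguishes the Riemannian gradient from the Euclidean one; libraries such as Manopt.jl provide these operations for many manifolds.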
“…Based on this theory and algorithm, a higher-order algorithm was introduced in Diepeveen & Lellmann (2021). Optimized examples from Bergmann & Gousenbourger (2018) performing data interpolation and approximation with manifold-valued Bézier curves are also included in Manopt.jl.…”
Section: Related Research and Software
confidence: 99%
“…With (D∇)(·)[·]: TM^{d₁×d₂} → TN given by Jacobi fields, its adjoint can be computed using the so-called adjoint Jacobi fields, see e.g., [11, Sect. 4.2].…”
Section: ROF Models on Manifolds
confidence: 99%
“…its starting point p in direction X ∈ T_pM is given by J_X, i.e., d_p γ(t; ·, q)(X) = J_X(t) for all t ∈ [0, 1]. Furthermore, since γ(t; p, q) = γ(1 − t; q, p), endpoint variations are given analogously [6, Sec. 3.1].…”
Section: Spline Regression
confidence: 99%
“…Therefore, the introduction of flexible, intrinsic splines is a key contribution of this work. Our model features closed-form, numerically stable and efficient expressions for the gradient of the regression objective in terms of concatenated adjoint Jacobi fields [6]. In particular, we derive an algorithm that only requires basic Riemannian operations: the exponential and logarithmic map as well as certain Jacobi fields.…”
Section: Introduction
confidence: 99%
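The "basic Riemannian operations" named in the last statement, the exponential and logarithmic map, can be written down in closed form for the unit sphere. The following is a standard illustration under that assumption, not the paper's implementation:

```python
import numpy as np

def sphere_exp(p, X):
    """Exponential map on the unit sphere: follow the geodesic from p
    in tangent direction X for arc length ||X||."""
    t = np.linalg.norm(X)
    if t < 1e-16:
        return p
    return np.cos(t) * p + np.sin(t) * X / t

def sphere_log(p, q):
    """Logarithmic map: the tangent vector at p whose geodesic reaches q."""
    d = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))  # geodesic distance
    if d < 1e-16:
        return np.zeros_like(p)
    v = q - np.dot(p, q) * p  # project q onto the tangent space at p
    return d * v / np.linalg.norm(v)

# The two maps are inverse to each other along a geodesic.
p = np.array([1.0, 0.0, 0.0])
q = np.array([0.0, 1.0, 0.0])
print(sphere_exp(p, sphere_log(p, q)))  # recovers q (up to floating point)
```

Together with Jacobi fields and their adjoints, these two maps are the only manifold-specific ingredients the quoted gradient algorithm requires.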