Keywords:

  • nonlinear optimization;
  • gradient preconditioning;
  • geometric flow;
  • G.1.5: Roots of Nonlinear Equations;
  • G.1.6: Optimization;
  • I.3.5: Computational Geometry and Object Modeling

Abstract

We present a method for accelerating the convergence of continuous non-linear shape optimization algorithms. We start with a general method for constructing gradient vector fields on a manifold, and we analyse this method from a signal processing viewpoint. This analysis reveals that we can use the Laplace–Beltrami operator of the shape to construct filters that effectively separate the components of the gradient at different scales. We use this idea to adaptively change the scale of the features being optimized, arriving at a solution that is optimal across multiple scales. This is in contrast to traditional descent-based methods, whose rate of convergence often stalls early once the high-frequency components have been optimized. We demonstrate how our method can be easily integrated into existing non-linear optimization frameworks such as gradient descent, Broyden–Fletcher–Goldfarb–Shanno (BFGS) and the non-linear conjugate gradient method. We show significant performance improvements for shape optimization in variational shape modelling and parameterization, and we also demonstrate the use of our method for efficient physical simulation.
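To make the idea concrete, here is a minimal sketch of how a Laplace–Beltrami low-pass filter could precondition a descent step. Everything in it is an assumption for illustration rather than the paper's construction: the implicit filter (I + tL)^{-1}, the helper names filtered_gradient and descent_step, and the 1-D chain Laplacian standing in for a mesh Laplace–Beltrami operator are all hypothetical.

```python
# Minimal sketch of scale-controlled gradient filtering, assuming a sparse
# positive semi-definite Laplacian L. The implicit low-pass filter
# (I + t L)^{-1} and the names `filtered_gradient` / `descent_step` are
# illustrative assumptions, not the paper's actual filters or API.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def filtered_gradient(L, grad, t):
    """Damp fine-scale components of `grad` by solving (I + t L) g = grad.

    L    : (n, n) sparse PSD Laplacian (Laplace-Beltrami on a mesh)
    grad : (n,) or (n, d) per-vertex gradient values
    t    : filter scale; t = 0 returns the raw, fine-scale gradient
    """
    n = L.shape[0]
    A = (sp.identity(n, format="csc") + t * L.tocsc()).tocsc()
    solve = spla.factorized(A)          # sparse LU, reusable across columns
    if grad.ndim == 1:
        return solve(grad)
    return np.column_stack([solve(grad[:, j]) for j in range(grad.shape[1])])

def descent_step(x, grad_fn, L, t, step):
    """One gradient-descent step taken along the scale-t filtered gradient."""
    return x - step * filtered_gradient(L, grad_fn(x), t)

# Toy example: a 1-D chain-graph Laplacian stands in for Laplace-Beltrami.
n = 100
main_diag = 2.0 * np.ones(n)
main_diag[[0, -1]] = 1.0                # boundary rows still sum to zero
L = sp.diags([-np.ones(n - 1), main_diag, -np.ones(n - 1)], [-1, 0, 1])
g = np.random.randn(n)
g_coarse = filtered_gradient(L, g, t=50.0)   # large t keeps only coarse scales
```

Decreasing t over the course of the optimization shifts effort from coarse to fine scales, which mimics the adaptive scale schedule the abstract describes; under the same assumptions, the filtered gradient could equally serve as the search direction inside BFGS or the non-linear conjugate gradient method.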