Optimization on Manifolds
Recently Published Documents


TOTAL DOCUMENTS: 20 (five years: 6)

H-INDEX: 7 (five years: 1)

2021 · Vol 17 (4) · pp. 1791-1821
Author(s): Pierre-Antoine Absil, Roland Herzog, Gabriele Steidl

2020 · Vol 40 (4) · pp. 2940-2940
Author(s): Nicolas Boumal, P-A Absil, Coralia Cartis

2020 · Vol 39 (2-3) · pp. 303-320
Author(s): Michael Watterson, Sikang Liu, Ke Sun, Trey Smith, Vijay Kumar

Manifolds are used in almost all robotics applications, even when they are not modeled explicitly. We propose a differential geometric approach for optimizing trajectories on a Riemannian manifold with obstacles. The optimization problem depends on a metric and a collision function specific to the manifold. We then propose our safe corridor on manifolds (SCM) method for computationally optimizing trajectories in robotics applications via a constrained optimization problem. Our method needs no equality constraints, which eliminates the need to project back onto a feasible manifold during optimization. We demonstrate how this algorithm works on an example problem on [Formula: see text] and on a perception-aware planning example for visual–inertially guided robots navigating in three dimensions. Formulating field-of-view constraints naturally results in modeling with the manifold [Formula: see text], which cannot be modeled as a Lie group. We also demonstrate planning trajectories on [Formula: see text] for a formation of quadrotors in an obstacle-filled environment.
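The SCM algorithm itself is not reproduced here, but the property the abstract highlights — keeping iterates on the manifold via a retraction, so the manifold never has to be encoded as equality constraints — can be sketched minimally. The sketch below is not the authors' method: it assumes the unit sphere S^2 as the manifold, a chordal smoothness cost over waypoints, and a made-up spherical-cap obstacle penalty.

```python
import numpy as np

def retract(X):
    """Map ambient-space points back onto the unit sphere S^2 by normalization."""
    return X / np.linalg.norm(X, axis=-1, keepdims=True)

def tangent_project(X, G):
    """Project ambient gradients G onto the tangent spaces of S^2 at the points X."""
    return G - np.sum(G * X, axis=-1, keepdims=True) * X

def cost_grad(X, obs_c, obs_r, w=10.0):
    """Gradient of: sum of squared chordal steps + penalty for entering a
    spherical-cap 'obstacle' with center obs_c and angular radius obs_r (made up)."""
    diff = X[1:] - X[:-1]
    G = np.zeros_like(X)
    G[:-1] -= 2.0 * diff          # d/dX[i] of ||X[i+1] - X[i]||^2
    G[1:]  += 2.0 * diff          # d/dX[i] of ||X[i] - X[i-1]||^2
    viol = np.maximum(0.0, X @ obs_c - np.cos(obs_r))  # > 0 inside the cap
    G += 2.0 * w * viol[:, None] * obs_c               # gradient of w * viol^2
    return G

# Waypoints on S^2 between two fixed endpoints.
X = retract(np.linspace([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], 12))
obs_c = retract(np.array([1.0, 1.0, 0.2]))   # obstacle cap center (made up)
obs_r = 0.4                                  # obstacle angular radius (made up)

step = 0.05
for _ in range(500):
    G = tangent_project(X, cost_grad(X, obs_c, obs_r))
    G[0] = G[-1] = 0.0            # endpoints stay fixed
    # The retraction keeps every iterate on S^2, so the sphere never
    # appears as an equality constraint in the optimization.
    X = retract(X - step * G)
```

Because the retraction is applied after every step, feasibility holds by construction, which is the point of avoiding equality constraints in the abstract.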


2018 · Vol 39 (1) · pp. 1-33
Author(s): Nicolas Boumal, P-A Absil, Coralia Cartis

Abstract We consider the minimization of a cost function f on a manifold $\mathcal{M}$ using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance ε. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of f to the tangent spaces of $\mathcal{M}$, both of these algorithms produce points with Riemannian gradient smaller than ε in $\mathcal{O}\big(1/\varepsilon^{2}\big)$ iterations. Furthermore, RTR returns a point where, in addition, the least eigenvalue of the Riemannian Hessian is larger than −ε, in $\mathcal{O}\big(1/\varepsilon^{3}\big)$ iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy ε (up to constants) and hence are sharp in that sense. These are the first deterministic results for global rates of convergence to approximate first- and second-order Karush-Kuhn-Tucker points on manifolds. In particular, they apply to optimization constrained to compact submanifolds of ${\mathbb{R}^{n}}$, under simpler assumptions.
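As a concrete, non-authoritative illustration of the first-order guarantee, here is a minimal numpy sketch of Riemannian gradient descent on the unit sphere for the Rayleigh quotient f(x) = xᵀAx, terminating at the first iterate whose Riemannian gradient norm falls below ε. The test matrix, step size, and tolerance are placeholder choices, not taken from the paper.

```python
import numpy as np

def riemannian_gd_sphere(A, x0, eps=1e-6, max_iter=100_000):
    """Riemannian gradient descent for f(x) = x^T A x on the unit sphere.

    Returns the first iterate whose Riemannian gradient norm is below eps,
    i.e. an approximate first-order critical point in the sense above."""
    step = 1.0 / (4.0 * np.linalg.norm(A, 2))  # crude Lipschitz-based step size
    x = x0 / np.linalg.norm(x0)
    for k in range(max_iter):
        egrad = 2.0 * A @ x                 # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x     # project onto the tangent space at x
        if np.linalg.norm(rgrad) < eps:
            return x, k                     # ||grad f(x)|| < eps achieved
        x = x - step * rgrad
        x = x / np.linalg.norm(x)           # retraction: renormalize onto the sphere
    return x, max_iter

# Example: critical points are eigenvectors; descent tends toward a smallest one.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = (M + M.T) / 2.0
x, iters = riemannian_gd_sphere(A, rng.standard_normal(50))
print(iters, x @ A @ x, np.linalg.eigvalsh(A)[0])
```

The tangent-space projection of the Euclidean gradient and the renormalization retraction are the only sphere-specific ingredients; the rest is ordinary gradient descent, which is why the ε-dependence of the iteration count mirrors the unconstrained $\mathcal{O}(1/\varepsilon^{2})$ rate.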

