Implementing Sparse Estimation: Cyclic Coordinate Descent vs Linearized Bregman Iterations

Author(s):  
Yuneisy Garcia Guzman ◽  
Michael Lunglmayr
2019 ◽  
Vol 9 (24) ◽  
pp. 5461
Author(s):  
Yuhan Chen ◽  
Xiao Luo ◽  
Baoling Han ◽  
Yan Jia ◽  
Guanhao Liang ◽  
...  

The inverse kinematics of robot manipulators is a crucial problem in the automatic control of robots. In this work, a Newton-improved cyclic coordinate descent (NICCD) method is proposed, which is suitable for robots with revolute or prismatic joints and any number of degrees of freedom. Firstly, the inverse kinematics problem is transformed into an objective-function optimization problem based on the least-squares form of the angle error and the position error expressed by the product-of-exponentials formula. Thereafter, the optimization problem is solved by combining Newton’s method with the improved cyclic coordinate descent (ICCD) method. The difference between the proposed ICCD method and the traditional cyclic coordinate descent method is that the former treats consecutive prismatic joints and consecutive parallel revolute joints as a whole for the purposes of optimization; the ICCD algorithm has a convenient iterative formula for these two cases. To illustrate the performance of the NICCD method, its simulation results are compared with those of the well-known Newton–Raphson method on six different robot manipulators. The results suggest that, overall, the NICCD method is effective, accurate, robust, and generalizable. Moreover, it has advantages for inverse kinematics calculations along continuous trajectories.
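The traditional cyclic coordinate descent baseline that the abstract contrasts with NICCD can be sketched for a planar revolute chain as follows. This is an illustrative sketch only, not the authors' implementation: the function names (`fk`, `ccd_ik`) and the planar geometry are assumptions, and the actual NICCD method works with the product-of-exponentials formula and Newton steps rather than the closed-form angle update used here.

```python
import math

def fk(lengths, angles):
    """Forward kinematics: positions of each joint (and the end effector)
    of a planar chain of revolute joints starting at the origin."""
    pts = [(0.0, 0.0)]
    a = 0.0
    for l, t in zip(lengths, angles):
        a += t
        x, y = pts[-1]
        pts.append((x + l * math.cos(a), y + l * math.sin(a)))
    return pts

def ccd_ik(lengths, target, cycles=100, tol=1e-6):
    """Traditional CCD: sweep the joints from the end effector back to the
    base, rotating each joint so the effector swings toward the target."""
    angles = [0.0] * len(lengths)
    for _ in range(cycles):
        for j in reversed(range(len(lengths))):
            pts = fk(lengths, angles)
            jx, jy = pts[j]        # position of joint j
            ex, ey = pts[-1]       # current effector position
            # rotate joint j by the angle between the joint-to-effector
            # ray and the joint-to-target ray (the 1-D optimum for joint j)
            angles[j] += (math.atan2(target[1] - jy, target[0] - jx)
                          - math.atan2(ey - jy, ex - jx))
        ex, ey = fk(lengths, angles)[-1]
        if math.hypot(ex - target[0], ey - target[1]) < tol:
            break
    return angles
```

Because each joint update is the exact minimizer for that single coordinate, a sweep never increases the effector error, but convergence is typically only linear; treating groups of parallel revolute joints as a single optimization block, as ICCD does, is one way to accelerate it.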


2018 ◽  
Vol 39 (3) ◽  
pp. 1246-1275 ◽  
Author(s):  
Ching-pei Lee ◽  
Stephen J Wright

Abstract Variants of the coordinate descent approach for minimizing a nonlinear function are distinguished in part by the order in which coordinates are considered for relaxation. Three common orderings are cyclic (CCD), in which we cycle through the components of $x$ in order; randomized (RCD), in which the component to update is selected randomly and independently at each iteration; and random-permutations cyclic (RPCD), which differs from CCD only in that a random permutation is applied to the variables at the start of each cycle. Known convergence guarantees are weaker for CCD and RPCD than for RCD, though in most practical cases, computational performance is similar among all these variants. There is a certain type of quadratic function for which CCD is significantly slower than RCD; a recent paper by Sun & Ye (2016, Worst-case complexity of cyclic coordinate descent: $O(n^2)$ gap with randomized version. Technical Report. Stanford, CA: Department of Management Science and Engineering, Stanford University. arXiv:1604.07130) has explored the poor behavior of CCD on functions of this type. The RPCD approach performs well on these functions, even better than RCD in a certain regime. This paper explains the good behavior of RPCD with a tight analysis.
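The three orderings can be sketched with exact coordinatewise minimization of a convex quadratic $f(x) = \tfrac{1}{2}x^TAx - b^Tx$, where minimizing over coordinate $i$ gives $x_i = (b_i - \sum_{j\neq i} A_{ij}x_j)/A_{ii}$. This is a minimal illustration under those assumptions; the function and variable names are not from the paper:

```python
import random

def coordinate_descent(A, b, order_fn, cycles=50):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by exact 1-D minimization along one coordinate at a time.
    order_fn(n, cycle) yields the coordinate order used in each cycle."""
    n = len(b)
    x = [0.0] * n
    for c in range(cycles):
        for i in order_fn(n, c):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]   # exact minimizer in coordinate i
    return x

# CCD: the same fixed order every cycle.
ccd = lambda n, c: range(n)

# RCD: each update picks a coordinate independently at random.
rcd = lambda n, c: [random.randrange(n) for _ in range(n)]

# RPCD: a fresh random permutation at the start of each cycle.
def rpcd(n, c):
    p = list(range(n))
    random.shuffle(p)
    return p
```

On well-conditioned problems all three orderings behave similarly, as the abstract notes; the adversarial quadratics studied by Sun & Ye are those where the fixed order of `ccd` interacts badly with the structure of $A$, while the per-cycle reshuffling of `rpcd` avoids that worst case.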


2011 ◽  
Vol 59 (2) ◽  
pp. 227-247 ◽  
Author(s):  
Luis Bayón ◽  
Jose M. Grau ◽  
Maria M. Ruiz ◽  
Pedro M. Suárez
