An efficient computational procedure for solving entropy optimization problems with infinitely many linear constraints

1996
Vol 72 (1)
pp. 127-139
Author(s):  
Shu-Cherng Fang ◽  
H.-S. Jacob Tsao


Author(s):  
S Yoo ◽  
C-G Park ◽  
S-H You ◽  
B Lim

This article presents a new methodology for generating optimal trajectories to control an automated excavator. By parameterizing all the actuator displacements with B-splines of the same order and with the same number of control points, the coupled actuator limits associated with the maximum pump flowrate are described as a finite-dimensional set of linear constraints for the motion optimization problem. Several weighting functions are introduced on the generalized actuator torque so that the solution to each optimization problem has a physical meaning. Numerical results are presented showing that the generated motions of the excavator are fairly smooth and effectively save energy, which can prevent mechanical wear and possibly reduce fuel consumption. A typical operator's manoeuvre, taken from experiments, is used as a reference to highlight the distinguishing features of the optimized motion.
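
A minimal Python sketch of this parameterization follows (our construction for illustration, not the authors' code; the spline order, collocation grid, piston areas A1 and A2, flow limit Q_max, and boundary displacements are assumed placeholder values). It represents two actuator displacements as cubic B-splines on a shared knot vector, so that positions and velocities are linear in the control points, and the coupled pump-flowrate limit becomes a finite set of linear inequalities:

```python
# Hedged sketch: B-spline motion parameterization with a coupled linear
# flow-rate constraint.  All physical numbers below are assumptions.
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize, LinearConstraint

k, n, T = 3, 10, 5.0                         # cubic splines, 10 control pts, 5 s
t = np.r_[[0.0] * k, np.linspace(0.0, T, n - k + 1), [T] * k]  # clamped knots
ts = np.linspace(0.05, T - 0.05, 80)         # collocation times
h = 1e-4                                     # finite-difference half-step

B = BSpline.design_matrix(ts, t, k).toarray()                    # position basis
V = (BSpline.design_matrix(ts + h, t, k).toarray()
     - BSpline.design_matrix(ts - h, t, k).toarray()) / (2 * h)  # velocity basis
Bend = BSpline.design_matrix(np.array([0.0, T]), t, k).toarray()

A1, A2, Q_max = 2e-3, 3e-3, 1.5e-3           # piston areas [m^2], flow cap [m^3/s]

def effort(c):                               # quadratic effort surrogate
    c1, c2 = c[:n], c[n:]
    return np.sum((V @ c1) ** 2) + np.sum((V @ c2) ** 2)

# coupled limit |A1*v1| + |A2*v2| <= Q_max, linearized by sign enumeration
rows = [np.hstack([s1 * A1 * V, s2 * A2 * V])
        for s1 in (-1, 1) for s2 in (-1, 1)]
flow = LinearConstraint(np.vstack(rows), -np.inf, Q_max)

# boundary displacements for both actuators (illustrative values)
Aeq = np.block([[Bend, np.zeros_like(Bend)], [np.zeros_like(Bend), Bend]])
bc = LinearConstraint(Aeq, np.array([0.0, 0.4, 0.1, 0.6]),
                      np.array([0.0, 0.4, 0.1, 0.6]))

c0 = np.tile(np.linspace(0.0, 0.5, n), 2)    # initial guess
res = minimize(effort, c0, constraints=[flow, bc], method="SLSQP")
```

Because the spline and its derivative are both linear in the control points, the coupled flowrate bound reduces to finitely many linear inequalities, which is what makes the motion optimization finite-dimensional.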


Author(s):  
T. E. Potter ◽  
K. D. Willmert ◽  
M. Sathyamoorthy

Abstract Mechanism path generation problems which use link deformations to improve the design lead to optimization problems involving a nonlinear sum-of-squares objective function subject to a set of linear and nonlinear constraints. Inclusion of the deformation analysis makes each objective function evaluation computationally expensive. An optimization method is presented which requires relatively few objective function evaluations. The algorithm, based on the Gauss method for unconstrained problems, is developed as an extension of the Gauss constrained technique for linear constraints and revises the Gauss nonlinearly constrained method for quadratic constraints. The derivation of the algorithm, using a Lagrange multiplier approach, is based on the Kuhn-Tucker conditions, so that when the iteration process terminates these conditions are automatically satisfied. Although the technique was developed for mechanism problems, it is applicable to any optimization problem having a sum-of-squares objective function subject to nonlinear constraints.
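
As a concrete, hedged illustration of such a step, the sketch below solves a toy problem of the same form, min ||r(x)||^2 subject to Ax = b, by coupling the Gauss-step normal equations with the Lagrange multipliers in a single KKT system (our reconstruction for the linear-constraint case only; the residuals and constraint data are toy assumptions, not the paper's mechanism model):

```python
# Hedged sketch: Gauss steps for min ||r(x)||^2 s.t. Ax = b.  Solving the
# KKT system each iteration means the Kuhn-Tucker conditions hold when the
# step size reaches the tolerance.
import numpy as np

def gauss_linearly_constrained(r, J, A, b, x0, tol=1e-10, max_iter=50):
    x = x0.astype(float)                      # x0 must satisfy Ax0 = b
    m = A.shape[0]
    for _ in range(max_iter):
        Jx, rx = J(x), r(x)
        # [J'J  A'] [d  ]   [-J'r]
        # [A    0 ] [lam] = [ 0  ]   (step d stays in the null space of A)
        K = np.block([[Jx.T @ Jx, A.T], [A, np.zeros((m, m))]])
        rhs = np.concatenate([-Jx.T @ rx, np.zeros(m)])
        d = np.linalg.solve(K, rhs)[:len(x)]
        x = x + d
        if np.linalg.norm(d) < tol:
            break
    return x

# toy residuals with a single linear constraint x1 + x2 + x3 = 1
r = lambda x: np.array([x[0]**2 - 0.2, x[1] - 0.5, x[2] * x[0] - 0.1])
J = lambda x: np.array([[2 * x[0], 0, 0], [0, 1, 0], [x[2], 0, x[0]]])
A, b = np.ones((1, 3)), np.array([1.0])
x = gauss_linearly_constrained(r, J, A, b, np.array([0.3, 0.4, 0.3]))
```

Each iteration costs one residual and one Jacobian evaluation, which is what keeps the number of expensive objective evaluations low.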


2021
Vol 78 (1)
pp. 139-156
Author(s):  
Antonio Boccuto

Abstract We give versions of the Hahn-Banach, sandwich, duality and Moreau-Rockafellar-type theorems, optimality conditions, and a formula for the subdifferential of composite functions for order-continuous, vector lattice-valued operators that are invariant or equivariant with respect to a fixed group G of homomorphisms. As applications to optimization problems with both convex and linear constraints, we present some Farkas- and Kuhn-Tucker-type results.
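
For orientation, the classical scalar-valued Farkas lemma, of which the results above are vector lattice-valued, G-invariant generalizations, can be stated as follows:

```latex
% Classical (scalar) Farkas lemma, stated for orientation only.
% For $A \in \mathbb{R}^{m \times n}$ and $c \in \mathbb{R}^{n}$:
\[
\bigl(\forall x \in \mathbb{R}^{n}:\; Ax \le 0 \implies c^{\top}x \le 0\bigr)
\iff
\bigl(\exists\, y \in \mathbb{R}^{m},\; y \ge 0,\; A^{\top}y = c\bigr).
\]
```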


Author(s):  
Ion Necoara ◽  
Martin Takáč

Abstract In this paper we consider large-scale smooth optimization problems with multiple linear coupled constraints. Due to the non-separability of the constraints, arbitrary random sketching is not guaranteed to work. Thus, we first investigate necessary and sufficient conditions for the sketch sampling to yield well-defined algorithms. Based on these sampling conditions we develop new sketch descent methods for solving general smooth linearly constrained problems, in particular, random sketch descent (RSD) and accelerated random sketch descent (A-RSD) methods. To our knowledge, this is the first convergence analysis of RSD algorithms for optimization problems with multiple non-separable linear constraints. For the general case, when the objective function is smooth and non-convex, we prove a sublinear rate in expectation for the non-accelerated variant, measured by an appropriate optimality criterion. In the smooth convex case, we derive for both algorithms, non-accelerated and A-RSD, sublinear convergence rates in the expected values of the objective function. Additionally, if the objective function satisfies a strong convexity type condition, both algorithms converge linearly in expectation. In special cases, where complexity bounds are known for some particular sketching algorithms, such as coordinate descent methods for optimization problems with a single linear coupled constraint, our theory recovers the best known bounds. Finally, we present several numerical examples to illustrate the performance of our new algorithms.
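
A minimal instance of the sketch-descent idea, for the single linear coupled constraint case mentioned at the end of the abstract, looks as follows (our illustration with assumed toy data, not the authors' implementation). Sampling two coordinates and stepping along e_i - e_j keeps the iterate exactly feasible, since that direction lies in the null space of the coupling constraint; well-definedness requires the sampling to touch every coordinate:

```python
# Hedged sketch: two-coordinate random sketch descent for min f(x) s.t.
# sum(x) = const, with f assumed L-smooth.
import numpy as np

rng = np.random.default_rng(0)

def rsd_single_constraint(grad, x0, L, n_iter=5000):
    x = x0.copy()
    n = len(x)
    for _ in range(n_iter):
        i, j = rng.choice(n, size=2, replace=False)
        g = grad(x)
        d = g[i] - g[j]               # derivative along e_i - e_j
        x[i] -= d / (2 * L)           # minimizes the L-smooth upper model
        x[j] += d / (2 * L)           # along the feasible direction
    return x

# toy smooth objective f(x) = 0.5 * ||x - a||^2, minimized over sum(x) = 1
a = np.array([0.9, 0.1, 0.5, 0.3])
x = rsd_single_constraint(lambda x: x - a, np.full(4, 0.25), L=1.0)
```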


Author(s):  
Abdelkrim El Mouatasim ◽  
Rachid Ellaia ◽  
Eduardo de Cursi

Random perturbation of the projected variable metric method for nonsmooth nonconvex optimization problems with linear constraints

We present a random perturbation of the projected variable metric method for solving linearly constrained nonsmooth (i.e., nondifferentiable) nonconvex optimization problems, and we establish convergence to a global minimum for a locally Lipschitz continuous objective function that may be nondifferentiable on a countable set of points. Numerical results show the effectiveness of the proposed approach.
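
The general idea can be sketched as follows (a minimal illustration under assumed parameter choices and a toy objective, not the authors' algorithm): take a projected descent step in a variable metric, then add a Gaussian perturbation whose variance decreases over the iterations, so the sequence can escape nonglobal minima while remaining feasible:

```python
# Hedged sketch: projected variable metric descent with decaying random
# perturbations; improving feasible points are accepted.
import numpy as np

rng = np.random.default_rng(1)

def perturbed_projected_descent(f, subgrad, x0, A, b, n_iter=400):
    AAinv = np.linalg.inv(A @ A.T)
    proj = lambda z: z - A.T @ (AAinv @ (A @ z - b))   # affine projection
    x = proj(x0)
    H = np.eye(len(x0))          # variable metric; identity for simplicity
    for k in range(1, n_iter + 1):
        step = 1.0 / k
        xi = rng.normal(scale=1.0 / np.sqrt(k), size=len(x))  # perturbation
        cand = proj(x - step * (H @ subgrad(x)) + step * xi)
        if f(cand) < f(x):       # keep the best feasible point so far
            x = cand
    return x

# toy nonsmooth nonconvex objective on the line x1 + x2 = 1
f = lambda x: abs(x[0] - 0.8) + 0.5 * np.cos(3.0 * x[1])
sg = lambda x: np.array([np.sign(x[0] - 0.8), -1.5 * np.sin(3.0 * x[1])])
A, b = np.ones((1, 2)), np.array([1.0])
x = perturbed_projected_descent(f, sg, np.array([0.0, 1.0]), A, b)
```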


2020
Vol 2020
pp. 1-13
Author(s):  
Sha Lu ◽  
Zengxin Wei

The proximal point algorithm is a method widely used in recent years for solving optimization problems and practical problems such as machine learning. In this paper, a framework of accelerated proximal point algorithms is presented for convex minimization with linear constraints. The algorithm can be seen as an extension of Güler's methods for unconstrained optimization and linear programming problems. We prove that the sequence generated by the algorithm converges to a KKT solution of the original problem under appropriate conditions, with a convergence rate of O(1/k^2).
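
The unconstrained acceleration that the paper extends can be sketched as follows (our illustration on a toy quadratic; the constrained algorithm in the paper differs): an exact proximal step combined with Nesterov-style extrapolation, which is what yields the O(1/k^2) rate for convex objectives:

```python
# Hedged sketch: accelerated proximal point iterations (Gueler-type) for a
# convex f whose proximal map is available in closed form.
import numpy as np

def accelerated_ppa(prox, x0, n_iter=100):
    """prox(y, lam) returns argmin_x f(x) + ||x - y||^2 / (2 * lam)."""
    x_prev = y = x0.copy()
    t, lam = 1.0, 1.0
    for _ in range(n_iter):
        x = prox(y, lam)                             # exact proximal step
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)  # extrapolation
        x_prev, t = x, t_next
    return x_prev

# toy convex quadratic f(x) = 0.5 x'Qx - c'x, whose prox is a linear solve
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
c = np.array([1.0, 1.0])
prox = lambda y, lam: np.linalg.solve(Q + np.eye(2) / lam, c + y / lam)
x_star = accelerated_ppa(prox, np.zeros(2))
```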

