Inexact descent methods for convex minimization problems in Banach spaces

2020 ◽  
Vol 36 (1) ◽  
pp. 141-146
Author(s):  
SIMEON REICH ◽  
ALEXANDER J. ZASLAVSKI

Given a Lipschitz, convex objective function of an unconstrained optimization problem defined on a Banach space, we revisit the class of regular vector fields introduced in our previous work on descent methods. In particular, we study the asymptotic behavior of the sequence of objective-function values for a certain inexact process generated by a regular vector field when the sequence of computational errors converges to zero, and we show that this sequence of values converges to the infimum of the given objective function.
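The convergence phenomenon described above can be illustrated with a minimal numerical sketch (an assumption for illustration only, not the paper's construction): gradient descent on a simple convex function where each step is perturbed by a computational error that tends to zero. The function values still approach the infimum.

```python
import numpy as np

# Hypothetical illustration: inexact descent on the convex function
# f(x) = ||x||^2 / 2, whose infimum is 0. Each step is perturbed by an
# error e_k -> 0; the objective values still converge toward inf f.

def inexact_descent(x0, steps=2000, lr=0.1):
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        grad = x                              # exact gradient of ||x||^2 / 2
        error = (1.0 / k) * np.ones_like(x)   # computational error, vanishing
        x = x - lr * (grad + error)           # inexact descent step
    return x

x_final = inexact_descent([5.0, -3.0])
f_final = 0.5 * np.dot(x_final, x_final)
print(f_final)  # small: close to the infimum 0
```

Because the errors are summable against the geometric contraction of the descent step, the residual they leave behind is dominated by the last few (tiny) error terms.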

2020 ◽  
Vol 34 (09) ◽  
pp. 13620-13621
Author(s):  
Sören Laue ◽  
Matthias Mitterreiter ◽  
Joachim Giesen

Most problems from classical machine learning can be cast as optimization problems. We introduce GENO (GENeric Optimization), a framework that lets the user specify a constrained or unconstrained optimization problem in an easy-to-read modeling language. GENO then generates a solver, i.e., Python code, that can solve this class of optimization problems. The generated solver is usually as fast as hand-written, problem-specific, well-engineered solvers, and often faster by a large margin than recently developed solvers tailored to a specific problem class. An online interface to our framework can be found at http://www.geno-project.org.
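The "specify a problem, receive a generated solver" workflow can be sketched as follows. This is NOT GENO's actual modeling language or API (see geno-project.org for that); it is a hypothetical, minimal stand-in in which the user supplies an objective and its gradient and gets back a solver function.

```python
import numpy as np

# Hypothetical sketch of solver generation (not GENO's real interface):
# given an objective and gradient, return a ready-to-call solver that
# minimizes the problem with plain gradient descent.

def make_solver(objective, gradient, lr=0.1, iters=500):
    def solve(x0):
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            x = x - lr * gradient(x)   # unconstrained descent step
        return x, objective(x)
    return solve

# Example problem class: least squares, min_x ||A x - b||^2.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([4.0, 3.0])
solver = make_solver(lambda x: np.sum((A @ x - b) ** 2),
                     lambda x: 2 * A.T @ (A @ x - b))
x_opt, f_opt = solver(np.zeros(2))
print(x_opt, f_opt)
```

The generated `solver` can then be reused for any instance of the same problem class, which is the property that lets a code generator compete with hand-written, problem-specific solvers.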


Author(s):  
K. J. KACHIASHVILI

There are different methods of statistical hypothesis testing [1–4]; among them is the Bayesian approach. A generalization of the Bayesian rule for testing many hypotheses is given below. It extends the dimensionality of the decision rule with respect to the number of tested hypotheses, which allows decisions to be made in a more differentiated way than in the classical case, and it states the problem as a constrained rather than an unconstrained optimization problem. This makes it possible to guarantee bounds on the errors of rejecting true hypotheses, which is the key point in solving a number of practical problems. These generalizations are given both for sets of simple hypotheses, each containing one space point, and for hypotheses containing a finite set of separated space points.
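A toy sketch of the "more differentiated than classical" idea (an illustrative assumption, not the paper's exact constrained rule): instead of always accepting the hypothesis with the largest posterior, accept it only when the posterior clears a threshold chosen to control the error of rejecting a true hypothesis, and otherwise withhold a decision.

```python
import numpy as np

# Hypothetical differentiated Bayesian decision rule (illustration only).
# Classical rule: always pick argmax posterior. Differentiated rule: pick
# argmax only if its posterior exceeds a threshold; otherwise return None,
# i.e., withhold a decision rather than risk rejecting a true hypothesis.

def bayes_decision(priors, likelihoods, threshold=0.9):
    priors = np.asarray(priors, dtype=float)
    likelihoods = np.asarray(likelihoods, dtype=float)
    post = priors * likelihoods
    post = post / post.sum()       # posterior probabilities of the hypotheses
    i = int(np.argmax(post))
    if post[i] >= threshold:
        return i, post             # confident acceptance of hypothesis i
    return None, post              # ambiguous evidence: no decision

decision, post = bayes_decision([0.5, 0.5], [0.9, 0.02])
ambiguous, _ = bayes_decision([0.5, 0.5], [0.5, 0.5])
print(decision, ambiguous)
```

The threshold plays the role of the constraint: it caps how often a true hypothesis can be rejected, at the cost of sometimes making no decision at all.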


2019 ◽  
Vol 16 (3) ◽  
pp. 0661
Author(s):  
Mahmood et al.

The Broyden update is a rank-one update used to solve unconstrained optimization problems, but it does not guarantee the symmetry or the positive definiteness of the Hessian approximation. In this paper, symmetry and positive definiteness of the Hessian approximation are established by updating the vector representing the difference between the next gradient and the current gradient of the objective function, which is assumed to be twice continuously differentiable. Numerical results on standard problems are reported to compare the proposed method with the Broyden method.
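The defect the abstract refers to is easy to exhibit. Below is the standard ("good") Broyden rank-one update: it satisfies the secant condition B_{k+1} s = y, but even starting from a symmetric positive definite matrix, the updated matrix is generally not symmetric.

```python
import numpy as np

# Standard Broyden rank-one update: B_{k+1} = B + (y - B s) s^T / (s^T s).
# It enforces the secant condition B_{k+1} s = y, but does not preserve
# symmetry (and hence not positive definiteness) of the approximation.

def broyden_update(B, s, y):
    return B + np.outer(y - B @ s, s) / (s @ s)

B = np.eye(2)                    # symmetric, positive definite start
s = np.array([1.0, 0.0])         # step:  x_{k+1} - x_k
y = np.array([2.0, 1.0])         # gradient difference:  g_{k+1} - g_k
B1 = broyden_update(B, s, y)

print(np.allclose(B1 @ s, y))    # secant condition holds
print(np.allclose(B1, B1.T))     # symmetry is lost
```

Symmetric updates such as SR1 or BFGS repair exactly this, which is the context for the modification proposed in the paper.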

