A Geometric Integration Approach to Nonsmooth, Nonconvex Optimisation

Author(s):  
Erlend S. Riis ◽  
Matthias J. Ehrhardt ◽  
G. R. W. Quispel ◽  
Carola-Bibiane Schönlieb

The optimisation of nonsmooth, nonconvex functions without access to gradients is a particularly challenging problem that is frequently encountered, for example in model parameter optimisation problems. Bilevel optimisation of parameters is a standard setting in areas such as variational regularisation problems and supervised machine learning. We present efficient and robust derivative-free methods called randomised Itoh–Abe methods. These are generalisations of the Itoh–Abe discrete gradient method, a well-known scheme from geometric integration, which has previously only been considered in the smooth setting. We demonstrate that the method and its favourable energy dissipation properties are well defined in the nonsmooth setting. Furthermore, we prove that whenever the objective function is locally Lipschitz continuous, the iterates almost surely converge to a connected set of Clarke stationary points. We present an implementation of the methods and apply them to various test problems. The numerical results indicate that the randomised Itoh–Abe methods can be superior to state-of-the-art derivative-free optimisation methods in solving nonsmooth problems while still remaining competitive in terms of efficiency.
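A minimal sketch of one randomised Itoh–Abe step, assuming a simple fixed-point solver for the scalar discrete gradient equation (the paper's actual scalar solver and safeguards are not specified in the abstract; `tau`, `n_outer`, and `n_inner` are illustrative parameters):

```python
import numpy as np

def randomised_itoh_abe(f, x0, tau=1.0, n_outer=500, n_inner=20, rng=None):
    """Sketch of a randomised Itoh-Abe discrete gradient iteration.

    Each step draws a random direction d and seeks a nonzero scalar
    alpha solving the discrete gradient equation
        (f(x + alpha*d) - f(x)) / alpha = -alpha / tau,
    which forces f(x + alpha*d) <= f(x) (energy dissipation).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(n_outer):
        d = rng.standard_normal(x.shape)
        d /= np.linalg.norm(d)
        alpha = 1e-2  # small nonzero initial guess
        for _ in range(n_inner):
            diff = f(x + alpha * d) - fx
            if alpha == 0.0 or abs(diff) < 1e-14:
                break
            alpha = -tau * diff / alpha  # fixed-point map for the scalar equation
        f_trial = f(x + alpha * d)
        if f_trial < fx:  # accept only dissipative steps
            x, fx = x + alpha * d, f_trial
    return x
```

Note that the method is derivative-free: each iteration touches `f` only through function values, e.g. `randomised_itoh_abe(lambda v: np.abs(v).sum(), np.ones(5))` for a simple nonsmooth test.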

Author(s):  
Auwal Bala Abubakar ◽  
Kanikar Muangchoo ◽  
Abdulkarim Hassan Ibrahim ◽  
Jamilu Abubakar ◽  
Sadiya Ali Rano

This paper focuses on nonlinear equations with convex constraints involving monotone operators in Euclidean space. A Fletcher–Reeves-type derivative-free conjugate gradient method is proposed, designed to ensure the descent property of the search direction at each iteration. Furthermore, the convergence of the proposed method is proved under the assumption that the underlying operator is monotone and Lipschitz continuous. The numerical results show that the method is efficient for the given test problems.
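A sketch of how such a scheme typically looks, assuming a Solodov–Svaiter-style hyperplane projection step and a standard derivative-free backtracking line search (the names `project`, `t0`, `sigma`, and `r` are illustrative; the paper's exact direction safeguards may differ):

```python
import numpy as np

def fr_projection_method(F, project, x0, t0=1.0, sigma=1e-4, r=0.5,
                         tol=1e-6, max_iter=1000):
    """Fletcher-Reeves-type derivative-free projection sketch for
    monotone equations F(x) = 0 over a convex set C, where `project`
    is the Euclidean projection onto C."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        t = t0  # derivative-free backtracking line search along d
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= r
        z = x + t * d
        Fz = F(z)
        # Hyperplane projection step, then projection onto the feasible set.
        lam = (Fz @ (x - z)) / (Fz @ Fz)
        x = project(x - lam * Fz)
        Fx_new = F(x)
        beta = (Fx_new @ Fx_new) / (Fx @ Fx)  # Fletcher-Reeves coefficient
        d = -Fx_new + beta * d
        Fx = Fx_new
    return x
```

For an unconstrained problem one can pass `project=lambda v: v`; monotonicity of `F` is what makes the hyperplane step separate the solution set from the current iterate.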


2019 ◽  
Vol 28 (1) ◽  
pp. 19-26
Author(s):  
Ioannis K. Argyros ◽  
Santhosh George

We present the local as well as the semi-local convergence of some derivative-free iterative methods for Banach space valued operators. These methods contain the secant and the Kurchatov method as special cases. The convergence is based on weak hypotheses specialising to Lipschitz continuous or Hölder continuous hypotheses. The results are of theoretical and practical interest. In particular, the method is compared favorably to other methods using concrete numerical examples to solve systems of equations containing a nondifferentiable term.
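As an illustration of the derivative-free ingredient these methods share, here is a sketch of the secant iteration x_{k+1} = x_k − [x_{k−1}, x_k; F]^{−1} F(x_k) in R^n, using one common construction of the divided difference operator (the paper works in general Banach spaces under weaker hypotheses):

```python
import numpy as np

def divided_difference(F, u, v):
    """One standard first-order divided difference [u, v; F] in R^n:
    column j mixes the leading components of u with the tail of v.
    Assumes u[j] != v[j] for all j."""
    n = u.size
    M = np.empty((n, n))
    for j in range(n):
        a = np.concatenate([u[:j + 1], v[j + 1:]])
        b = np.concatenate([u[:j], v[j:]])
        M[:, j] = (F(a) - F(b)) / (u[j] - v[j])
    return M

def secant(F, x0, x1, tol=1e-10, max_iter=50):
    """Derivative-free secant iteration for F(x) = 0."""
    xm = np.asarray(x0, dtype=float)
    x = np.asarray(x1, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        xm, x = x, x - np.linalg.solve(divided_difference(F, xm, x), Fx)
    return x
```

The Kurchatov variant replaces [x_{k−1}, x_k; F] by [2x_k − x_{k−1}, x_{k−1}; F], which restores second-order convergence while remaining derivative-free.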


2011 ◽  
Vol 52-54 ◽  
pp. 926-931
Author(s):  
Qing Hua Zhou ◽  
Feng Xia Xu ◽  
Yan Geng ◽  
Ya Rui Zhang

The wedge trust region method, built on the traditional trust region framework, is designed for derivative-free optimization problems. It adds an extra constraint, the "wedge" constraint, to the trust region subproblem. However, the existing strategy for updating the wedge trust region radius is somewhat simple. In this paper, we develop a new radius updating rule and combine it with the method. For most test problems, the number of function evaluations is reduced significantly, and the experiments demonstrate the effectiveness of the improved algorithm.
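The abstract does not spell out the new rule, but the traditional baseline it improves on is the standard ratio-based radius update; a minimal sketch, with illustrative threshold values:

```python
def update_radius(rho, delta, step_norm, eta1=0.1, eta2=0.75,
                  gamma_dec=0.5, gamma_inc=2.0, delta_max=1e3):
    """Classical trust-region radius update driven by the agreement ratio
    rho = (actual reduction) / (predicted reduction). The wedge method
    solves the subproblem with an extra 'wedge' constraint, but the radius
    logic itself is traditionally this simple."""
    if rho < eta1:
        return gamma_dec * step_norm  # poor agreement: shrink
    if rho > eta2 and step_norm >= 0.99 * delta:
        return min(gamma_inc * delta, delta_max)  # good agreement at the boundary: expand
    return delta  # otherwise keep the radius
```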


Author(s):  
Kashif Memon

In this research paper, a new derivative-free Simpson 1/3-type quadrature scheme has been proposed for the approximation of the Riemann–Stieltjes integral (RSI). The composite form of the proposed scheme on the RSI has been derived using the concept of precision. The theorems concerning the basic form, composite form, and local and global errors of the new scheme have been proved theoretically. For the trivial case of the integrator, the proposed RS scheme is shown to reduce to the corresponding Riemann scheme. The performance of the proposed scheme has been tested by numerical experiments in MATLAB on test problems of RS integrals from the literature against some existing schemes. The computational cost, order of accuracy, and average CPU times (in seconds) of the discussed rules have been computed to demonstrate the cost-effectiveness, time-efficiency, and rapid convergence of the proposed scheme under similar conditions.
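One plausible composite Simpson-1/3-type rule for the RS integral ∫ f dg is sketched below; the paper's exact weights are not given in the abstract, but this version exhibits the reduction property: for the trivial integrator g(x) = x it collapses to the ordinary composite Simpson rule.

```python
def rs_simpson(f, g, a, b, n):
    """Composite Simpson-1/3-type sketch for the Riemann-Stieltjes
    integral of f with respect to the integrator g over [a, b],
    using n subintervals (n must be even)."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    total = 0.0
    for i in range(0, n, 2):
        x0, x1, x2 = a + i * h, a + (i + 1) * h, a + (i + 2) * h
        # Simpson weights on f, paired with the increment of g over the panel.
        total += (f(x0) + 4.0 * f(x1) + f(x2)) / 6.0 * (g(x2) - g(x0))
    return total
```

With `g = lambda x: x` each panel contributes (h/3)(f0 + 4f1 + f2), i.e. exactly the Riemann composite Simpson rule.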


2021 ◽  
Vol 78 (3) ◽  
pp. 705-740
Author(s):  
Caroline Geiersbach ◽  
Teresa Scarinci

For finite-dimensional problems, stochastic approximation methods have long been used to solve stochastic optimization problems. Their application to infinite-dimensional problems is less understood, particularly for nonconvex objectives. This paper presents convergence results for the stochastic proximal gradient method applied to Hilbert spaces, motivated by optimization problems with partial differential equation (PDE) constraints with random inputs and coefficients. We study stochastic algorithms for nonconvex and nonsmooth problems, where the nonsmooth part is convex and the nonconvex part is the expectation, which is assumed to have a Lipschitz continuous gradient. The optimization variable is an element of a Hilbert space. We show almost sure convergence of strong limit points of the random sequence generated by the algorithm to stationary points. We demonstrate the stochastic proximal gradient algorithm on a tracking-type functional with an $L^1$-penalty term constrained by a semilinear PDE and box constraints, where input terms and coefficients are subject to uncertainty. We verify conditions for ensuring convergence of the algorithm and show a simulation.
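The iteration itself is compact; a finite-dimensional sketch assuming the nonsmooth part is λ‖·‖₁ plus a box indicator (for which the proximal map is soft-thresholding followed by clipping), with `grad_sample` returning a stochastic gradient of the expectation part:

```python
import numpy as np

def stochastic_prox_grad(grad_sample, lam, lower, upper, x0, steps,
                         step_size=lambda k: 1.0 / (k + 1)):
    """Stochastic proximal gradient sketch:
    x_{k+1} = prox_{t_k (lam*||.||_1 + box indicator)}(x_k - t_k G(x_k, xi_k))."""
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        t = step_size(k)                  # diminishing step rule
        v = x - t * grad_sample(x)        # stochastic gradient step
        v = np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)  # L1 prox (soft-threshold)
        x = np.clip(v, lower, upper)      # project onto the box constraints
    return x
```

In the paper the variable lives in a Hilbert space and the smooth part is an expectation constrained by a semilinear PDE; the sketch above is the discretised skeleton of the same update.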


2007 ◽  
Vol 12 (3) ◽  
pp. 277-289 ◽  
Author(s):  
Milda Baravykaitė ◽  
Raimondas Čiegis

Branch and bound (BnB) is a general algorithm for solving optimization problems. We present a template implementation of the BnB paradigm. The BnB template is implemented using the C++ object-oriented paradigm, with MPI used for the underlying communications. A domain decomposition (data parallelization) paradigm is used to construct the parallel algorithm. To obtain better load balancing, the BnB template has a load balancing module that allows the redistribution of search spaces among the processors at run time. A parallel version of the user's algorithm is obtained automatically. A new derivative-free global optimization algorithm is proposed for solving nonlinear global optimization problems. It is based on the BnB algorithm and is implemented using the developed BnB algorithm template library. The robustness of the new algorithm is demonstrated by solving a selection of test problems.
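A minimal serial skeleton of the BnB paradigm the template encapsulates (the template itself adds the C++/MPI parallelisation and load balancing; `bound_lower`, `evaluate_center`, and `split` are the user-supplied hooks assumed here):

```python
import heapq

def branch_and_bound(bound_lower, evaluate_center, split, root, tol=1e-6):
    """Best-first branch and bound for minimisation. bound_lower(box)
    underestimates f on a box, evaluate_center(box) returns a feasible
    objective value (incumbent candidate), split(box) yields subboxes."""
    best = evaluate_center(root)
    heap = [(bound_lower(root), 0, root)]
    counter = 1  # tie-breaker so boxes are never compared directly
    while heap:
        lb, _, box = heapq.heappop(heap)
        if lb >= best - tol:
            continue  # prune: this box cannot improve the incumbent
        for child in split(box):
            clb = bound_lower(child)
            if clb < best - tol:
                best = min(best, evaluate_center(child))
                heapq.heappush(heap, (clb, counter, child))
                counter += 1
    return best
```

The template's load balancing module redistributes the contents of such a work queue among MPI processes at run time, which the serial sketch above has no need for.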


2017 ◽  
Vol 95 (3) ◽  
pp. 500-511 ◽  
Author(s):  
XIAOWEI FANG ◽  
QIN NI

We propose a new derivative-free conjugate gradient method for large-scale nonlinear systems of equations. The method combines the Rivaie–Mustafa–Ismail–Leong conjugate gradient method for unconstrained optimisation problems with a new nonmonotone line-search method. The global convergence of the proposed method is established under some mild assumptions. Numerical results using 104 test problems from the CUTEst test problem library show that the proposed method is promising.
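For reference, the RMIL-flavoured direction with F taking the role of the gradient can be sketched as follows (the clipping at zero is a common safeguard; the paper's exact formula and nonmonotone line search are not reproduced here):

```python
import numpy as np

def rmil_direction(Fk, Fk_prev, d_prev):
    """RMIL-type conjugate gradient direction in the derivative-free
    setting: beta_k = F_k^T (F_k - F_{k-1}) / ||d_{k-1}||^2."""
    beta = Fk @ (Fk - Fk_prev) / (d_prev @ d_prev)
    return -Fk + max(beta, 0.0) * d_prev
```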


2018 ◽  
Vol 2018 ◽  
pp. 1-13
Author(s):  
Jing Gao ◽  
Jian Cao ◽  
Yueting Yang

We propose a derivative-free trust region algorithm with a nonmonotone filter technique for bound constrained optimization. The derivative-free strategy is aimed at minimization problems in which not all derivatives are available. The nonmonotone filter technique preserves the trust region feature while ensuring global convergence under reasonable assumptions. Numerical experiments demonstrate that the new algorithm is effective for bound constrained optimization. Locally optimal parameters with respect to overall computational time on a set of test problems are identified. The best parameter values found for the algorithm differ from the traditionally used values, indicating that the proposed algorithm has an advantage on nondifferentiable optimization problems.
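The filter mechanism at the heart of the algorithm admits a compact statement; a sketch of one common acceptance test, where h measures infeasibility (e.g. bound violation) and f the objective (the paper's filter entries and margins may differ):

```python
def filter_accepts(filter_set, h_new, f_new, gamma=1e-5):
    """A trial point is acceptable if no stored pair (h, f) dominates it,
    up to a small envelope proportional to h."""
    return all(h_new <= h - gamma * h or f_new <= f - gamma * h
               for (h, f) in filter_set)
```

Accepted pairs are added to `filter_set` and dominated entries removed, which is what lets the trust region iteration be nonmonotone in f while still converging globally.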


2020 ◽  
Vol 3 (1) ◽  
pp. 43-49
Author(s):  
M K Dauda

In this study, a fully derivative-free method for solving large-scale nonlinear systems of equations via a memoryless DFP (Davidon–Fletcher–Powell) update is presented. The proposed method is an enhanced DFP update that is both matrix-free and derivative-free, and therefore requires little memory storage. Under suitable conditions, the proposed method converges globally. Numerical comparisons using a set of large-scale test problems show that the proposed method is efficient.
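The memoryless DFP direction can be formed without storing any matrix, which is where the low memory footprint comes from; a sketch (with s = x_k − x_{k−1} and y = F(x_k) − F(x_{k−1}), F standing in for the gradient in the derivative-free setting):

```python
import numpy as np

def memoryless_dfp_direction(Fx, s, y):
    """Search direction d = -H Fx with the memoryless DFP matrix
        H = I - (y y^T)/(y^T y) + (s s^T)/(s^T y),
    applied implicitly through inner products only."""
    return -(Fx - (y @ Fx) / (y @ y) * y + (s @ Fx) / (s @ y) * s)
```

Only three vectors are touched per iteration, so storage is O(n) rather than the O(n²) of a full quasi-Newton matrix.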


Symmetry ◽  
2019 ◽  
Vol 11 (12) ◽  
pp. 1452 ◽  
Author(s):  
Janak Raj Sharma ◽  
Sunil Kumar ◽  
Lorentz Jäntschi

Many optimal-order multiple-root techniques involving derivatives have been proposed in the literature. By contrast, optimal-order multiple-root techniques without derivatives are almost nonexistent. Motivated by this, we develop a family of optimal fourth-order derivative-free iterative schemes for computing multiple roots. The procedure is based on two steps, of which the first is a Traub–Steffensen iteration and the second a Traub–Steffensen-like iteration. Theoretical results proved for particular cases of the family are symmetric to each other, a feature that leads us to a general result establishing fourth-order convergence. Efficacy is demonstrated on different test problems, which verify the efficient convergent nature of the new methods. Moreover, a comparison of performance shows the presented derivative-free techniques to be good competitors to existing optimal fourth-order methods that use derivatives.
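The first substep of the family is the classical Traub–Steffensen iteration, whose derivative-free character comes from replacing f′ with a divided difference; a sketch for a root of known multiplicity m (the family's fourth-order second substep involves weight functions not given in the abstract):

```python
def traub_steffensen_step(f, x, m, beta=0.01):
    """One Traub-Steffensen substep for a multiple root: the derivative
    is replaced by the divided difference f[w, x] = (f(w) - f(x))/(w - x)
    with w = x + beta * f(x)."""
    fx = f(x)
    w = x + beta * fx
    slope = (f(w) - fx) / (w - x)  # derivative-free slope estimate
    return x - m * fx / slope
```

Note w − x = beta·f(x), so each substep costs two function evaluations and no derivatives.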

