A DIRECT SEARCH QUASI-NEWTON METHOD FOR NONSMOOTH UNCONSTRAINED OPTIMIZATION

2017 ◽  
Vol 59 (2) ◽  
pp. 215-231 ◽  
Author(s):  
C. J. PRICE

A direct search quasi-Newton algorithm is presented for local minimization of Lipschitz continuous black-box functions. The method estimates the gradient via central differences using a maximal frame around each iterate. When nonsmoothness prevents progress, a global direction search is used to locate a descent direction. Almost sure convergence to Clarke stationary points is shown, and this convergence is independent of the accuracy of the gradient estimates. Numerical results show that the method is effective in practice.
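The central-difference gradient estimate at the core of such methods can be sketched as follows. This is a generic illustration, not Price's maximal-frame construction; `f`, `x`, and the step `h` are placeholders:

```python
import numpy as np

def central_difference_gradient(f, x, h=1e-5):
    """Estimate the gradient of f at x via central differences.

    Each coordinate is perturbed by +h and -h, so the estimate costs
    2n function evaluations for an n-dimensional x.
    """
    x = np.asarray(x, dtype=float)
    grad = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return grad

# Example: for f(x) = x0^2 + 3*x1, the gradient at (1, 2) is (2, 3).
g = central_difference_gradient(lambda x: x[0]**2 + 3*x[1], [1.0, 2.0])
```

For smooth functions the central difference is second-order accurate in `h`; the paper's point is that convergence holds even when nonsmoothness makes such estimates inaccurate.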

Author(s):  
Marcus Pettersson ◽  
Johan Ölvander

Box’s Complex method for direct search has shown promise when applied to simulation-based optimization. In direct search methods such as Box’s Complex method, the search starts with a set of points, each of which is a solution to the optimization problem. In the Complex method the number of points must be at least one plus the number of variables. However, to avoid premature termination and to increase the likelihood of finding the global optimum, more points are often used, at the expense of a larger number of evaluations. The idea in this paper is to gradually remove points during the optimization in order to obtain an adaptive Complex method for more efficient design optimization. The proposed method shows encouraging results when compared to the Complex method with a fixed number of points and to a quasi-Newton method.
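The core reflection step of Box's Complex method, together with the gradual point removal proposed here, can be sketched as follows. This is a simplified illustration: the reflection factor `alpha=1.3` is Box's usual choice, but the removal schedule in `adaptive_shrink` is a placeholder, not the paper's actual criterion:

```python
import numpy as np

def complex_step(points, f, alpha=1.3):
    """One reflection step of Box's Complex method: reflect the worst
    point through the centroid of the remaining points by alpha > 1,
    keeping the reflected point only if it improves on the worst."""
    values = np.array([f(p) for p in points])
    worst = int(np.argmax(values))
    others = np.delete(points, worst, axis=0)
    centroid = others.mean(axis=0)
    reflected = centroid + alpha * (centroid - points[worst])
    if f(reflected) < values[worst]:
        points[worst] = reflected
    return points

def adaptive_shrink(points, f, min_points):
    """Drop the current worst point, emulating the adaptive Complex
    method's gradual reduction of the point set size (the real
    trigger for removal is more elaborate)."""
    if len(points) > min_points:
        values = np.array([f(p) for p in points])
        points = np.delete(points, int(np.argmax(values)), axis=0)
    return points
```

Shrinking the set toward the minimum of one-plus-dimension points trades global exploration for fewer evaluations, which is the efficiency gain the abstract describes.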


2019 ◽  
Vol 31 (4) ◽  
pp. 689-702 ◽  
Author(s):  
Juliane Müller ◽  
Marcus Day

We introduce the algorithm SHEBO (surrogate optimization of problems with hidden constraints and expensive black-box objectives), an efficient optimization algorithm that employs surrogate models to solve computationally expensive black-box simulation optimization problems that have hidden constraints. Hidden constraints are encountered when the objective function evaluation does not return a value for a parameter vector. These constraints are often encountered in optimization problems in which the objective function is computed by a black-box simulation code. SHEBO uses a combination of local and global search strategies together with an evaluability prediction function and a dynamically adjusted evaluability threshold to iteratively select new sample points. We compare the performance of our algorithm with that of the mesh adaptive direct search algorithm (MADS, in its NOMAD implementation), implicit filtering, and SNOBFIT (stable noisy optimization by branch and fit), which assigns artificial function values to points that violate the hidden constraints. Our numerical experiments for a large set of test problems with 2–30 dimensions and a 31-dimensional real-world application problem arising in combustion simulation show that SHEBO is an efficient solver that outperforms the other methods for many test problems.
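The idea of an evaluability prediction with a threshold can be sketched with a toy nearest-neighbour vote over past samples. This is a stand-in of my own, not SHEBO's actual prediction function; `threshold` and `k` are placeholder parameters:

```python
import numpy as np

def evaluability_filter(candidates, X, evaluable, threshold=0.5, k=3):
    """Keep only candidate points whose predicted evaluability meets a
    threshold, using a k-nearest-neighbour vote over past samples.

    X: previously sampled points; evaluable: 1 where the simulation
    returned a value, 0 where it failed (a hidden-constraint violation).
    """
    kept = []
    for c in candidates:
        dists = np.linalg.norm(X - c, axis=1)
        nearest = np.argsort(dists)[:k]
        if evaluable[nearest].mean() >= threshold:
            kept.append(c)
    return np.array(kept)
```

Filtering candidates this way avoids the alternative the abstract attributes to SNOBFIT, namely assigning artificial objective values to non-evaluable points.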


2020 ◽  
Vol 34 (06) ◽  
pp. 10126-10135
Author(s):  
Artyom Gadetsky ◽  
Kirill Struminsky ◽  
Christopher Robinson ◽  
Novi Quadrianto ◽  
Dmitry Vetrov

Learning models with discrete latent variables using stochastic gradient descent remains a challenge due to the high variance of gradient estimates. Modern variance reduction techniques mostly consider categorical distributions and have limited applicability when the number of possible outcomes becomes large. In this work, we consider models with latent permutations and propose control variates for the Plackett-Luce distribution. In particular, the control variates allow us to optimize black-box functions over permutations using stochastic gradient descent. To illustrate the approach, we consider a variety of causal structure learning tasks for continuous and discrete data. We show that our method outperforms competitive relaxation-based optimization methods and is also applicable to non-differentiable score functions.
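Sampling and scoring under the Plackett-Luce distribution, the two primitives any such stochastic-gradient scheme needs, can be sketched as follows. This shows only the distribution itself, not the paper's control variates:

```python
import numpy as np

def sample_plackett_luce(scores, rng):
    """Sample a permutation from the Plackett-Luce distribution with
    the given log-scores via the Gumbel-argsort trick: adding i.i.d.
    Gumbel noise to the scores and sorting in descending order yields
    an exact Plackett-Luce sample."""
    gumbel = -np.log(-np.log(rng.uniform(size=scores.shape)))
    return np.argsort(-(scores + gumbel))

def log_prob(perm, scores):
    """Log-probability of a permutation under Plackett-Luce: items are
    drawn one at a time with softmax weights over the remaining items."""
    s = scores[perm]
    lp = 0.0
    for i in range(len(s)):
        lp += s[i] - np.log(np.sum(np.exp(s[i:])))
    return lp
```

With equal scores every permutation is equally likely, so `log_prob` returns `-log(n!)`; the score-function gradient of an expectation over permutations is then formed from `log_prob`, which is where variance reduction via control variates enters.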


1998 ◽  
Vol 50 (6) ◽  
pp. 1163-1175 ◽  
Author(s):  
Jingyi Chen ◽  
Elton P. Hsu

Abstract We introduce a distributional Ricci curvature on complete smooth manifolds with Lipschitz continuous metrics. Under an assumption on the volume growth of geodesic balls, we obtain a gradient estimate for weakly harmonic functions if the distributional Ricci curvature is bounded below.


2021 ◽  
Vol 78 (3) ◽  
pp. 705-740
Author(s):  
Caroline Geiersbach ◽  
Teresa Scarinci

Abstract For finite-dimensional problems, stochastic approximation methods have long been used to solve stochastic optimization problems. Their application to infinite-dimensional problems is less understood, particularly for nonconvex objectives. This paper presents convergence results for the stochastic proximal gradient method applied to Hilbert spaces, motivated by optimization problems with partial differential equation (PDE) constraints with random inputs and coefficients. We study stochastic algorithms for nonconvex and nonsmooth problems, where the nonsmooth part is convex and the nonconvex part is the expectation, which is assumed to have a Lipschitz continuous gradient. The optimization variable is an element of a Hilbert space. We show almost sure convergence of strong limit points of the random sequence generated by the algorithm to stationary points. We demonstrate the stochastic proximal gradient algorithm on a tracking-type functional with an $L^1$-penalty term constrained by a semilinear PDE and box constraints, where input terms and coefficients are subject to uncertainty. We verify conditions for ensuring convergence of the algorithm and show a simulation.
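The iteration studied here can be sketched in a finite-dimensional toy version: a stochastic gradient step on the smooth expectation followed by the proximal operator of the convex nonsmooth part, here an $L^1$ penalty whose prox is soft-thresholding. The diminishing step-size rule below is a placeholder, not the paper's condition:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def stochastic_prox_grad(x0, stoch_grad, lam, steps, step_size, rng):
    """Minimise E[F(x, xi)] + lam * ||x||_1 by iterating
    x <- prox_{t * lam * ||.||_1}(x - t * g), where g is a stochastic
    gradient sample of the smooth part and t is a diminishing step."""
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        t = step_size / np.sqrt(k + 1)  # placeholder diminishing rule
        g = stoch_grad(x, rng)
        x = soft_threshold(x - t * g, t * lam)
    return x
```

For example, minimising `E[0.5*||x - xi||^2] + lam*||x||_1` with `xi` drawn around a mean `b` drives the iterates toward the soft-thresholded mean; coordinates of `b` smaller than `lam` in magnitude are pushed to zero, which is the sparsifying effect of the $L^1$ term.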


2018 ◽  
Vol 6 (3) ◽  
pp. 414-428 ◽  
Author(s):  
Thomas Wortmann

Abstract This article presents benchmark results from seven simulation-based problems from structural, building energy, and daylight optimization. Growing applications of parametric design and performance simulations in architecture, engineering, and construction allow the harnessing of simulation-based, or black-box, optimization in the search for less resource- and/or energy-consuming designs. In architectural design optimization (ADO) practice and research, the most commonly applied black-box algorithms are genetic algorithms or other metaheuristics, to the neglect of more current global direct search or model-based methods. Model-based methods construct a surrogate model (i.e., an approximation of a fitness landscape) that they refine during the optimization process. This benchmark compares metaheuristic, direct search, and model-based methods, and concludes that, for the given evaluation budget and problems, the model-based method (RBFOpt) is the most efficient and robust, while the tested genetic algorithms perform poorly. As such, this article challenges the popularity of genetic algorithms in ADO, as well as the practice of using them for one-to-one comparisons to justify algorithmic innovations.
Highlights
- Benchmarks optimization algorithms on structural, energy, and daylighting problems.
- Benchmarks metaheuristic, direct search, and model-based optimization methods.
- Challenges the popularity of genetic algorithms in architectural design optimization.
- Presents model-based methods as a more efficient and reliable alternative.
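The surrogate model at the core of model-based methods can be sketched with a minimal radial basis function interpolant. This is my own stand-in, far simpler than the machinery RBFOpt actually uses; the Gaussian kernel and `eps` are assumptions:

```python
import numpy as np

def rbf_fit(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant through the samples (X, y),
    a minimal stand-in for the surrogate (approximate fitness
    landscape) that model-based methods refine as they sample."""
    K = np.exp(-eps * np.linalg.norm(X[:, None] - X[None, :], axis=-1) ** 2)
    w = np.linalg.solve(K + 1e-10 * np.eye(len(X)), y)  # tiny ridge for stability
    return lambda x: float(np.exp(-eps * np.linalg.norm(X - x, axis=1) ** 2) @ w)
```

The interpolant reproduces the sampled values exactly (up to the regularization) and is cheap to evaluate, so candidate designs can be screened on the surrogate instead of the expensive simulation.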


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Pierre Bousquet ◽  
Lorenzo Brasco ◽  
Chiara Leone ◽  
Anna Verde

Abstract We consider a quasilinear degenerate parabolic equation driven by the orthotropic p-Laplacian. We prove that local weak solutions are locally Lipschitz continuous in the spatial variable, uniformly in time.


2014 ◽  
Vol 279 ◽  
pp. 113-132 ◽  
Author(s):  
A.E.J. Bogaers ◽  
S. Kok ◽  
B.D. Reddy ◽  
T. Franz
