SGN: Sparse Gauss-Newton for Accelerated Sensitivity Analysis

2022 ◽  
Vol 41 (1) ◽  
pp. 1-10
Author(s):  
Jonas Zehnder ◽  
Stelian Coros ◽  
Bernhard Thomaszewski

We present a sparse Gauss-Newton solver for accelerated sensitivity analysis with applications to a wide range of equilibrium-constrained optimization problems. Dense Gauss-Newton solvers have shown promising convergence rates for inverse problems, but the cost of assembling and factorizing the associated matrices has so far been a major stumbling block. In this work, we show how the dense Gauss-Newton Hessian can be transformed into an equivalent sparse matrix that can be assembled and factorized much more efficiently. This leads to drastically reduced computation times for many inverse problems, which we demonstrate on a diverse set of examples. We furthermore show links between sensitivity analysis and nonlinear programming approaches based on Lagrange multipliers and prove equivalence under specific assumptions that apply for our problem setting.
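To see where the dense bottleneck arises, consider a minimal Python/SciPy sketch (ours, not the paper's SGN solver; all names are placeholders): the equilibrium constraint g(x, p) = 0 defines x(p) implicitly, the sensitivity matrix S = dx/dp = -g_x^{-1} g_p is dense even when g_x and g_p are sparse, and the Gauss-Newton Hessian assembled from it is dense as well. The paper's contribution is to replace this dense assembly and factorization with an equivalent sparse system.

    import numpy as np
    import scipy.sparse.linalg as spla

    def dense_gauss_newton_step(g_x, g_p, r, J_r, damping=1e-8):
        """One dense Gauss-Newton step on a least-squares objective r(x(p)).

        g_x : sparse (n x n) Jacobian of the equilibrium constraint w.r.t. x
        g_p : sparse (n x m) Jacobian of the constraint w.r.t. the parameters p
        r   : (k,) residual vector; J_r : sparse (k x n) Jacobian of r w.r.t. x
        """
        # Implicit function theorem: g(x(p), p) = 0  =>  S = -g_x^{-1} g_p.
        # S is dense; materializing it is the bottleneck the paper removes.
        lu = spla.splu(g_x.tocsc())
        S = -lu.solve(g_p.toarray())            # (n x m), dense
        J = J_r @ S                             # (k x m), reduced Jacobian
        H = J.T @ J                             # dense Gauss-Newton Hessian
        return np.linalg.solve(H + damping * np.eye(H.shape[0]), -J.T @ r)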

2020 ◽  
Vol 30 (6) ◽  
pp. 1645-1663
Author(s):  
Ömer Deniz Akyildiz ◽  
Dan Crisan ◽  
Joaquín Míguez

Abstract: We introduce and analyze a parallel sequential Monte Carlo methodology for the numerical solution of optimization problems that involve the minimization of a cost function consisting of the sum of many individual components. The proposed scheme is a stochastic zeroth-order optimization algorithm which demands only the capability to evaluate small subsets of components of the cost function. It can be viewed as a bank of samplers that generate particle approximations of several sequences of probability measures. These measures are constructed in such a way that they have associated probability density functions whose global maxima coincide with the global minima of the original cost function. The algorithm selects the best performing sampler and uses it to approximate a global minimum of the cost function. We prove analytically that the resulting estimator converges to a global minimum of the cost function almost surely and provide explicit convergence rates in terms of the number of generated Monte Carlo samples and the dimension of the search space. We show, by way of numerical examples, that the algorithm can tackle cost functions with multiple minima or with broad “flat” regions which are hard to minimize using gradient-based techniques.
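To make the construction concrete, here is a minimal single-sampler sketch of the underlying idea (ours, for orientation; the paper's scheme runs a bank of such samplers in parallel and evaluates only subsets of components): the cost f is mapped to a sequence of tempered densities proportional to exp(-beta_t * f), whose modes sit at the minima of f, and a particle population tracks these densities by weighted resampling.

    import numpy as np

    def smc_minimize(f, dim, n_particles=500, n_steps=100, seed=0):
        """Zeroth-order minimization via tempered particle approximations."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5.0, 5.0, size=(n_particles, dim))  # initial cloud
        for t in range(1, n_steps + 1):
            beta = 0.1 * t                                # tempering schedule
            cost = np.apply_along_axis(f, 1, x)
            w = np.exp(-beta * (cost - cost.min()))       # stabilized weights
            w /= w.sum()
            idx = rng.choice(n_particles, size=n_particles, p=w)  # resample
            x = x[idx] + rng.normal(scale=1.0 / np.sqrt(t), size=x.shape)
        cost = np.apply_along_axis(f, 1, x)
        return x[np.argmin(cost)]                         # best particle found

    # A multimodal cost that defeats plain gradient descent:
    x_min = smc_minimize(lambda z: np.sum(z**2 + 3.0 * np.sin(2.0 * z)), dim=2)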


2016 ◽  
Author(s):  
Vineet Yadav ◽  
Anna M. Michalak

Abstract. Matrix multiplication of two sparse matrices is a fundamental operation in linear Bayesian inverse problems for computing covariance matrices of observations and a posteriori uncertainties. Applications of sparse-sparse matrix multiplication algorithms for specific use-cases in such inverse problems remain unexplored. Here we present a hybrid-parallel sparse-sparse matrix multiplication approach that reduces execution time and operation count by roughly a third relative to the standard sparse matrix multiplication algorithms available in most libraries. Two modifications of this hybrid-parallel algorithm are also proposed for the types of operations typical of atmospheric inverse problems, which further reduce the cost of sparse matrix multiplication by yielding only upper triangular and/or dense matrices.
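The symmetry argument can be illustrated with SciPy (a sketch of the shape of the computation only; a real implementation, like the authors', skips the redundant lower-triangle work inside the multiplication kernel rather than discarding it afterwards). Linear Bayesian atmospheric inversions repeatedly form products such as H Q H^T, which are symmetric, so only the upper triangle ever needs to be computed and stored:

    import scipy.sparse as sp

    def hqht_upper(H, Q):
        """Upper triangle of the symmetric product H @ Q @ H.T (H, Q sparse)."""
        HQ = H @ Q                         # sparse-sparse product
        return sp.triu(HQ @ H.T).tocsr()   # keep the upper triangle only

    H = sp.random(50, 400, density=0.02, format="csr", random_state=0)
    Q = sp.random(400, 400, density=0.01, format="csr", random_state=1)
    Q = Q + Q.T                            # symmetric prior covariance
    S = hqht_upper(H, Q)                   # 50 x 50, upper triangular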


Author(s):  
Miguel Terra-Neves ◽  
Inês Lynce ◽  
Vasco Manquinho

A Minimal Correction Subset (MCS) of an unsatisfiable constraint set is a minimal subset of constraints that, if removed, makes the constraint set satisfiable. MCSs enjoy a wide range of applications, such as finding approximate solutions to constrained optimization problems. However, existing work on applying MCS enumeration to optimization problems focuses on the single-objective case. In this work, Pareto Minimal Correction Subsets (Pareto-MCSs) are proposed for approximating the Pareto-optimal solution set of multi-objective constrained optimization problems. We formalize and prove an equivalence relationship between Pareto-optimal solutions and Pareto-MCSs. Moreover, Pareto-MCSs and MCSs can be connected in such a way that existing state-of-the-art MCS enumeration algorithms can be used to enumerate Pareto-MCSs. Finally, experimental results on the multi-objective virtual machine consolidation problem show that the Pareto-MCS approach is competitive with state-of-the-art algorithms.
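For context, a single MCS can be computed with a simple linear search over the constraints: grow a Maximal Satisfiable Subset (MSS) and return its complement. The sketch below (ours; `sat` stands in for any satisfiability oracle) shows this basic building block that enumeration algorithms refine:

    def one_mcs(constraints, sat):
        """Return one Minimal Correction Subset of an unsatisfiable set.

        Grows a Maximal Satisfiable Subset (MSS); the complement of an
        MSS is an MCS.
        """
        mss = []
        for c in constraints:
            if sat(mss + [c]):           # c fits with everything kept so far
                mss.append(c)
        return [c for c in constraints if c not in mss]

    # Toy oracle: bound constraints on a single real variable x.
    def sat(cs):
        lo = max((v for op, v in cs if op == ">="), default=float("-inf"))
        hi = min((v for op, v in cs if op == "<="), default=float("inf"))
        return lo <= hi

    print(one_mcs([(">=", 3), ("<=", 1), (">=", 0)], sat))  # [('<=', 1)]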


2017 ◽  
Vol 139 (4) ◽  
Author(s):  
Kalyan Shankar Bhattacharjee ◽  
Hemant Kumar Singh ◽  
Tapabrata Ray

In recent years, evolutionary algorithms based on the concept of “decomposition” have gained significant attention for solving multi-objective optimization problems. They have been particularly instrumental in solving problems with four or more objectives, which are classified as many-objective optimization problems. In this paper, we first review the cause-effect relationships introduced by commonly adopted schemes in such algorithms. Thereafter, we introduce a decomposition-based evolutionary algorithm with a novel assignment scheme. The scheme eliminates the need for any additional replacement scheme, while ensuring diversity among the population of candidate solutions. Furthermore, to deal with constrained optimization problems efficiently, marginally infeasible solutions are preserved to aid search in promising regions of interest. The performance of the algorithm is objectively evaluated using a number of benchmark and practical problems, and compared with a number of recent algorithms. Finally, we also formulate a practical many-objective problem related to wind-farm layout optimization and illustrate the performance of the proposed approach on it. The numerical experiments clearly highlight the ability of the proposed algorithm to deliver competitive results across a wide range of multi-/many-objective design optimization problems.
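The “decomposition” idea itself is easy to state: a set of weight vectors splits the multi-objective problem into scalar subproblems, and candidate solutions are assigned to the subproblem they serve best. Below is a generic Tchebycheff-based assignment sketch (ours, for orientation; the paper's novel scheme differs in how it enforces diversity without a replacement step):

    import numpy as np

    def tchebycheff(F, w, z_star):
        """Scalarize objective vectors F (n x m) along weight vector w."""
        return np.max(w * np.abs(F - z_star), axis=1)

    def assign(F, weights):
        """Keep, per reference direction, the candidate that serves it best."""
        z_star = F.min(axis=0)                       # estimated ideal point
        keep = [np.argmin(tchebycheff(F, w, z_star)) for w in weights]
        return np.unique(keep)

    F = np.random.default_rng(1).random((20, 3))     # 20 candidates, 3 objectives
    weights = np.eye(3) * 0.8 + 0.1                  # 3 toy reference directions
    survivors = assign(F, weights)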


2013 ◽  
Vol 457-458 ◽  
pp. 1283-1287 ◽  
Author(s):  
Si Ling Feng ◽  
Qing Xin Zhu ◽  
Sheng Zhong ◽  
Xiu Jun Gong

Biogeography-based optimization (BBO) is a new biogeography-inspired algorithm. It mainly uses the biogeography-based migration operator to share information among solutions. Differential evolution (DE) is a fast and robust evolutionary algorithm for global optimization. In this paper, we apply a hybridization of adaptive BBO with DE, namely ABBO/DE/GEN, to global numerical optimization problems. ABBO/DE/GEN adaptively changes the migration probability and mutation probability based on the relation between each solution's cost and the average cost in every generation; in addition, the mutation operator of BBO is modified based on the DE algorithm, and the migration operator of BBO is modified based on the iteration number, to improve performance. Hence it can generate promising candidate solutions. To verify the performance of the proposed ABBO/DE/GEN, 9 benchmark functions with a wide range of dimensions and diverse complexities are employed. Experimental results indicate that our approach is effective and efficient. Compared with the BBO/DE/GEN approach, ABBO/DE/GEN performs better, or at least comparably, in terms of the quality of the final solutions and the convergence rate.
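A schematic generation of the underlying hybrid (generic BBO migration plus a DE/rand/1-style mutation; the adaptive probability updates specific to ABBO/DE/GEN are omitted) might look as follows:

    import numpy as np

    def bbo_de_generation(pop, fitness, F=0.5, seed=0):
        """One generation of BBO migration hybridized with DE mutation."""
        rng = np.random.default_rng(seed)
        n, d = pop.shape
        order = np.argsort(fitness)              # best first (minimization)
        rank = np.empty(n)
        rank[order] = np.arange(n)
        mu = (n - rank) / n                      # emigration: high for fit solutions
        lam = 1.0 - mu                           # immigration: high for unfit ones
        new = pop.copy()
        for i in range(n):
            for j in range(d):
                if rng.random() < lam[i]:        # habitat i imports feature j
                    if rng.random() < 0.5:       # BBO migration from a fit habitat
                        src = rng.choice(n, p=mu / mu.sum())
                        new[i, j] = pop[src, j]
                    else:                        # DE/rand/1-style mutation
                        a, b, c = rng.choice(n, size=3, replace=False)
                        new[i, j] = pop[a, j] + F * (pop[b, j] - pop[c, j])
        return new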


2014 ◽  
Vol 2014 ◽  
pp. 1-7
Author(s):  
Yong-Hong Ren

Nonlinear Lagrangian algorithms play an important role in solving constrained optimization problems. It is known that, under appropriate conditions, the sequence generated by the first-order multiplier iteration converges superlinearly. This paper analyzes the second-order multiplier iteration based on a class of nonlinear Lagrangians for solving nonlinear programming problems with inequality constraints. It is proved that the sequence generated by the second-order multiplier iteration converges superlinearly, with order at least two, if in addition the Hessians of the functions involved in the problem are Lipschitz continuous.
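For orientation, the classical first-order multiplier iteration for an inequality-constrained problem min f(x) s.t. g_i(x) <= 0 (the Hestenes-Powell-Rockafellar update, which the paper's class of nonlinear Lagrangians generalizes) is

    \lambda_i^{k+1} = \max\bigl(0,\ \lambda_i^k + c\, g_i(x^k)\bigr), \qquad i = 1, \dots, m,

where x^k minimizes the (augmented) Lagrangian for the current multipliers \lambda^k and c > 0 is a penalty parameter. A second-order iteration additionally corrects the multipliers using Hessian information of the Lagrangian, which is what makes convergence of order at least two attainable.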

