The Space Decomposition Theory for a Class of Semi-Infinite Maximum Eigenvalue Optimizations

2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Ming Huang ◽  
Li-Ping Pang ◽  
Xi-Jun Liang ◽  
Zun-Quan Xia

We study optimization problems involving eigenvalues of symmetric matrices. We present a nonsmooth optimization technique for a class of nonsmooth functions which are semi-infinite maxima of eigenvalue functions. Our strategy uses generalized gradients and 𝒰𝒱-space decomposition techniques suited to the norm and other nonsmooth performance criteria. For the class of max-functions, which possesses the so-called primal-dual gradient structure, we compute smooth trajectories along which certain second-order expansions can be obtained. We also give the first- and second-order derivatives of the primal-dual function in the space of decision variables ℝ^m under some assumptions.
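As a point of reference for the eigenvalue functions discussed above, the short sketch below (an illustration, not the paper's 𝒰𝒱-decomposition method) computes the maximum eigenvalue of a symmetric matrix and one element of its subdifferential with NumPy; the function name is hypothetical.

```python
import numpy as np

def max_eig_and_subgradient(X):
    """Return lambda_max(X) and one subgradient of the maximum eigenvalue
    function at a symmetric matrix X.

    If the largest eigenvalue is simple, the function is differentiable and
    the gradient is v v^T for a unit eigenvector v; with multiplicity > 1
    this outer product is only one element of the subdifferential.
    """
    X = 0.5 * (X + X.T)             # symmetrize to guard against round-off
    w, V = np.linalg.eigh(X)        # eigenvalues in ascending order
    v = V[:, -1]                    # unit eigenvector for lambda_max
    return w[-1], np.outer(v, v)    # (lambda_max, subgradient element v v^T)

# Tiny usage example on a random symmetric matrix.
A = np.random.randn(4, 4)
lam, G = max_eig_and_subgradient(A + A.T)
```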

2015 ◽  
Vol 2015 ◽  
pp. 1-10
Author(s):  
Jerico B. Bacani ◽  
Julius Fergy T. Rabago

The exterior Bernoulli free boundary problem was studied via a shape optimization technique. The problem was reformulated as the minimization of the so-called Kohn-Vogelius objective functional, where the two state variables involved satisfy two separate boundary value problems. The paper focuses on computing the second-order shape derivative of the objective functional using the velocity method with nonautonomous velocity fields. This work confirms the classical results of Delfour and Zolésio relating shape derivatives of functionals obtained via the velocity method and via the perturbation of identity technique.
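For orientation, the display below recalls the typical form of the Kohn-Vogelius objective functional for the exterior Bernoulli problem; the boundary conditions shown are the standard ones for this formulation and are stated here as an assumption rather than quoted from the paper.

```latex
% Typical Kohn-Vogelius formulation on an annular domain \Omega with fixed
% boundary \Gamma and free boundary \Sigma (standard reference form):
\[
  J_{KV}(\Omega) \;=\; \frac{1}{2}\int_{\Omega} \lvert \nabla(u_D - u_N)\rvert^{2}\,dx,
\]
\[
  \begin{cases}
    \Delta u_D = 0 \ \text{in } \Omega, & u_D = 1 \ \text{on } \Gamma, \quad u_D = 0 \ \text{on } \Sigma,\\[2pt]
    \Delta u_N = 0 \ \text{in } \Omega, & u_N = 1 \ \text{on } \Gamma, \quad \partial_n u_N = \lambda \ \text{on } \Sigma.
  \end{cases}
\]
% The free boundary \Sigma is optimal exactly when J_{KV}(\Omega) = 0,
% i.e. when the two state variables coincide.
```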


2013 ◽  
Vol 61 (2) ◽  
pp. 135-140
Author(s):  
M Babul Hasan ◽  
Md Toha

The objective of this paper is to improve the subgradient optimization method used to solve non-differentiable optimization problems arising in the Lagrangian dual problem. One of the main drawbacks of the subgradient method is the tuning process needed to determine the sequence of step-lengths that update successive iterates. In this paper, we propose a modified subgradient optimization method with various step size rules to compute a tuning-free subgradient step-length that is geometrically motivated and algebraically deduced. It is well known that the dual function is a concave function over its domain (regardless of the structure of the cost and constraints of the primal problem), but not necessarily differentiable. We solve the dual problem whenever it is easier to solve than the primal problem and there is no duality gap. However, even if there is a duality gap, the solution of the dual problem provides a lower bound on the primal optimum that can be useful in combinatorial optimization. Numerical examples are given to illustrate the method. DOI: http://dx.doi.org/10.3329/dujs.v61i2.17059 Dhaka Univ. J. Sci. 61(2): 135-140, 2013 (July)
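To make the dual setting concrete, the following toy sketch runs projected subgradient ascent on the Lagrangian dual of a small box-constrained linear program. It uses a classic diminishing step size, not the tuning-free rule proposed in the paper, and all data are made up.

```python
import numpy as np

# Projected subgradient ascent on the Lagrangian dual of
#   min c^T x  s.t.  A x <= b,  x in [0,1]^n   (toy data).
rng = np.random.default_rng(0)
n, m = 5, 3
c = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = rng.standard_normal(m) + 1.0

lam = np.zeros(m)                           # dual multipliers, lam >= 0
for k in range(1, 201):
    reduced = c + A.T @ lam                 # coefficients of x in L(x, lam)
    x = (reduced < 0).astype(float)         # box-constrained inner minimizer
    g = A @ x - b                           # subgradient of the dual at lam
    step = 1.0 / k                          # classic diminishing step size
    lam = np.maximum(lam + step * g, 0.0)   # ascent step + projection onto lam >= 0

dual_value = c @ x + lam @ (A @ x - b)      # lower bound on the primal optimum
```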


2020 ◽  
Vol 37 (04) ◽  
pp. 2040011
Author(s):  
Qilin Wang ◽  
Xiaoyan Zhang

In this paper, we introduce second-order composed radial derivatives of set-valued maps and establish some of their properties. By applying this second-order derivative, we obtain second-order sensitivity results for parametric multi-objective optimization problems under Benson proper efficiency, without assumptions of cone-convexity or Lipschitz continuity. Some of our results improve and extend recent corresponding results in the literature.


Author(s):  
Phạm Lê Bạch Ngọc ◽  
Nguyen Thanh Tung ◽  
Nguyen Huynh Nghia

In the paper, we study generalized differentiability in set-valued optimization, namely the second-order composed radial derivative of a given set-valued mapping. Inspired by the adjacent cone and the higher-order radial cone in Anh NLH et al. (2011), we introduce the second-order composed radial derivative. Then, its basic properties are investigated, and relationships between the second-order composed radial derivative of a given set-valued mapping and that of its profile are obtained. Finally, applications of this derivative to sensitivity analysis are studied. In detail, we work on a parametrized set-valued optimization problem concerning Pareto solutions. Based on the above-mentioned results, we carry out sensitivity analysis for the Pareto solution mapping of the problem. More precisely, we establish the second-order composed radial derivative of the perturbation mapping (here, the perturbation mapping means the Pareto solution mapping with respect to some parameter). Some examples are given to illustrate our results. The obtained results are new and improve the existing ones in the literature.
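As a hedged recall of the construction behind such derivatives (standard radial-cone notation, not a quotation from the paper), the definitions typically read as follows.

```latex
% For a set S and a point s_0 \in S, the radial cone is
\[
  R(S, s_0) \;=\; \{\, h : \exists\, t_n > 0,\ \exists\, h_n \to h
      \ \text{such that}\ s_0 + t_n h_n \in S \ \text{for all } n \,\}.
\]
% The (first-order) radial derivative of F at (x_0, y_0) \in \operatorname{gr} F
% is the set-valued map whose graph is R(\operatorname{gr} F, (x_0, y_0)); a
% composed second-order derivative is then obtained by differentiating it again:
\[
  D^{2}_{R} F(x_0, y_0, u, v) \;=\; D_{R}\bigl(D_{R} F(x_0, y_0)\bigr)(u, v).
\]
```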


Author(s):  
Jaya Pratha Sebastiyar ◽  
Martin Sahayaraj Joseph

Distributed joint congestion control and routing optimization has received a significant amount of attention recently. To date, however, most existing schemes follow a key idea called the back-pressure algorithm. Despite having many salient features, the first-order subgradient nature of back-pressure-based schemes results in slow convergence and poor delay performance. To overcome these limitations, the present study makes a first attempt at developing a second-order joint congestion control and routing optimization framework that offers utility-optimality, queue-stability, fast convergence, and low delay. The contributions of this work are three-fold: we propose a new second-order joint congestion control and routing framework based on a primal-dual interior-point approach, establish utility-optimality and queue-stability of the proposed second-order method, and show how to implement the proposed second-order method in a distributed fashion.
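To illustrate the first-order versus second-order contrast in a self-contained way, the sketch below applies damped Newton steps to a tiny log-barrier network-utility toy problem (two flows sharing one link). It is a generic barrier/Newton illustration, not the distributed protocol developed in the study, and all names and values are made up.

```python
import numpy as np

# Two flows share one link of capacity c; maximize log-utility of the rates
# plus a log-barrier term that keeps the capacity constraint strictly feasible.
c, mu = 1.0, 1e-2

def grad_hess(x):
    slack = c - x.sum()
    g = 1.0 / x - mu / slack                                       # gradient
    H = -np.diag(1.0 / x**2) - (mu / slack**2) * np.ones((2, 2))   # Hessian
    return g, H

x = np.array([0.1, 0.1])                    # strictly feasible starting rates
for _ in range(20):
    g, H = grad_hess(x)
    dx = np.linalg.solve(H, -g)             # Newton (second-order) direction
    t = 1.0
    while (x + t * dx).min() <= 0.0 or (x + t * dx).sum() >= c:
        t *= 0.5                            # backtrack to stay strictly feasible
    x = x + t * dx
# A handful of Newton steps brings x close to the barrier optimum; a plain
# first-order gradient scheme typically needs far more iterations here.
```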


Author(s):  
Lu Chen ◽  
Handing Wang ◽  
Wenping Ma

Real-world optimization applications in complex systems always contain multiple factors to be optimized, which can be formulated as multi-objective optimization problems. These problems have been solved by many evolutionary algorithms such as MOEA/D, NSGA-III, and KnEA. However, when the numbers of decision variables and objectives increase, the computation costs of those algorithms become unaffordable. To reduce this high computation cost on large-scale many-objective optimization problems, we propose a two-stage framework. The first stage of the proposed algorithm combines a multi-tasking optimization strategy with a bi-directional search strategy, where the original problem is reformulated as a multi-tasking optimization problem in the decision space to enhance convergence. To improve diversity, in the second stage, the proposed algorithm applies multi-tasking optimization to a number of sub-problems based on reference points in the objective space. To show the effectiveness of the proposed algorithm, we test it on the DTLZ and LSMOP problems and compare it with existing algorithms; it outperforms the compared algorithms in most cases and shows advantages in both convergence and diversity.


2021 ◽  
Vol 11 (8) ◽  
pp. 3430
Author(s):  
Erik Cuevas ◽  
Héctor Becerra ◽  
Héctor Escobar ◽  
Alberto Luque-Chang ◽  
Marco Pérez ◽  
...  

Recently, several new metaheuristic schemes have been introduced in the literature. Although these approaches consider very different phenomena as metaphors, the search patterns used to explore the search space are very similar. On the other hand, second-order systems are models that present different temporal behaviors depending on the values of their parameters. Such temporal behaviors can be conceived as search patterns with multiple behaviors and simple configurations. In this paper, a set of new search patterns is introduced to explore the search space efficiently. They emulate the response of a second-order system. The proposed set of search patterns has been integrated into a complete search strategy, called the Second-Order Algorithm (SOA), to obtain the global solution of complex optimization problems. To analyze the performance of the proposed scheme, it has been compared on a set of representative optimization problems, including multimodal, unimodal, and hybrid benchmark formulations. Numerical results demonstrate that the proposed SOA method exhibits remarkable performance in terms of accuracy and high convergence rates.
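For readers unfamiliar with the underlying model, the snippet below evaluates the standard underdamped step response of a second-order system, the kind of parameter-controlled trajectory the abstract describes reusing as search patterns; the function and parameter names are illustrative and not part of the SOA implementation.

```python
import numpy as np

def second_order_step_response(t, zeta, wn=1.0):
    """Unit step response of the standard second-order system
    G(s) = wn^2 / (s^2 + 2*zeta*wn*s + wn^2), for 0 < zeta < 1 (underdamped)."""
    wd = wn * np.sqrt(1.0 - zeta**2)          # damped natural frequency
    phi = np.arccos(zeta)                     # phase offset
    return 1.0 - np.exp(-zeta * wn * t) * np.sin(wd * t + phi) / np.sqrt(1.0 - zeta**2)

# Different damping ratios give oscillatory vs. nearly monotone trajectories
# toward the target value 1.0 -- two qualitatively different search behaviors
# controlled by a single parameter.
t = np.linspace(0.0, 10.0, 200)
exploratory = second_order_step_response(t, zeta=0.2)   # strong overshoot / oscillation
exploitative = second_order_step_response(t, zeta=0.9)  # smooth, damped approach
```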

