An Efficient Solution to Structured Optimization Problems using Recursive Matrices

2019 ◽ Vol 38 (8) ◽ pp. 33-39 ◽ Author(s): D. Rückert, M. Stamminger
1988 ◽ Vol 42 (1-3) ◽ pp. 471-487 ◽ Author(s): Carl Brezovec, Gérard Cornuéjols, Fred Glover

Author(s): Megha Vora, T. T. Mirnalinee

In the past two decades, Swarm Intelligence (SI)-based optimization techniques have drawn the attention of many researchers seeking efficient solutions to optimization problems. Swarm intelligence techniques are characterized by a decentralized way of working that mimics the behavior of a colony of ants, a swarm of bees, a flock of birds, or a school of fish. Their algorithmic simplicity and effectiveness have made them powerful tools for solving global optimization problems. Simulation studies of the graceful, but unpredictable, choreography of bird flocks led to the design of the particle swarm optimization algorithm, while studies of the foraging behavior of ants resulted in the development of the ant colony optimization algorithm. This chapter provides insight into swarm intelligence techniques, specifically particle swarm optimization and its variants. The objective of the chapter is twofold: first, it describes how swarm intelligence techniques are employed to solve various optimization problems; second, it describes how they are applied efficiently to clustering by posing clustering as an optimization problem.
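Below is a minimal, illustrative sketch of the basic global-best particle swarm update the chapter alludes to (inertia, cognitive, and social terms). The parameter values, the sphere objective, and the function name pso are assumptions for illustration, not taken from the chapter.

import numpy as np

def pso(objective, dim=2, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """Basic (global-best) particle swarm optimization sketch."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))       # particle positions
    vel = np.zeros((n_particles, dim))                  # particle velocities
    pbest = pos.copy()                                  # personal best positions
    pbest_val = np.array([objective(p) for p in pos])   # personal best values
    g = pbest[np.argmin(pbest_val)].copy()              # global best position

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # inertia + cognitive (toward pbest) + social (toward gbest) terms
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# Illustrative use: minimize the sphere function sum(x_i^2).
best_x, best_f = pso(lambda x: float(np.sum(x ** 2)))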


2007 ◽ Vol 2007 ◽ pp. 1-11 ◽ Author(s): Valeriano A. De Oliveira, Marko A. Rojas-Medar

We introduce some concepts of generalized invexity for continuous-time multiobjective programming problems, namely the concepts of Karush-Kuhn-Tucker invexity and Karush-Kuhn-Tucker pseudoinvexity. Using the concept of Karush-Kuhn-Tucker invexity, we study the relationship between the multiobjective problems and some related scalar problems. Further, we show that Karush-Kuhn-Tucker pseudoinvexity is a necessary and sufficient condition for a vector Karush-Kuhn-Tucker solution to be a weakly efficient solution.
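For orientation, here is a hedged finite-dimensional sketch of the objects named in the abstract; the paper itself treats continuous-time (integral) multiobjective problems, so this notation is assumed purely for illustration. Consider minimizing f(x) = (f_1(x), ..., f_p(x)) subject to g(x) <= 0:

% Weak efficiency and vector KKT conditions (finite-dimensional illustration only;
% the paper's continuous-time formulation uses integral functionals over a time interval)
\begin{align*}
  &x^{\ast} \text{ is weakly efficient} \iff
    \nexists\, x \text{ feasible with } f_i(x) < f_i(x^{\ast}) \text{ for all } i, \\
  &x^{\ast} \text{ is a vector KKT point} \iff
    \exists\, \lambda \ge 0,\ \lambda \ne 0,\ \mu \ge 0 \text{ such that } \\
  &\qquad \sum_{i=1}^{p} \lambda_i \nabla f_i(x^{\ast})
        + \sum_{j=1}^{m} \mu_j \nabla g_j(x^{\ast}) = 0,
    \qquad \mu_j\, g_j(x^{\ast}) = 0 \ \ \text{for all } j .
\end{align*}

Read this way, the paper's main result states that Karush-Kuhn-Tucker pseudoinvexity holds exactly when every vector Karush-Kuhn-Tucker point is weakly efficient.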


Author(s): Le Thanh Tung

The main aim of this paper is to study second-order sensitivity analysis in parametric vector optimization problems. We prove that the proper perturbation maps and the proper efficient solution maps of parametric vector optimization problems are second-order composed proto-differentiable under some appropriate qualification conditions. Some examples are provided to illustrate our results.
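For readers unfamiliar with the setting, a brief hedged sketch of the standard perturbation-map setup that this kind of sensitivity analysis builds on; the notation is assumed for illustration, and the paper's definitions of proper efficiency and second-order composed proto-differentiability are more refined.

% Set-valued setting (illustrative): F : U \rightrightarrows Y maps a parameter u to the
% attainable objective values, C \subset Y is the ordering cone, and the perturbation map
% collects the efficient points of F(u):
\[
  \mathcal{W}(u) \;:=\; \operatorname{Min}_{C} F(u)
  \;=\; \bigl\{\, y \in F(u) \;:\; F(u) \cap \bigl(y - (C \setminus \{0\})\bigr) = \emptyset \,\bigr\}.
\]

Sensitivity analysis then asks how \mathcal{W}(u) varies with the parameter u, which is where generalized derivatives of set-valued maps, here second-order composed proto-derivatives, enter.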


Algorithms ◽ 2020 ◽ Vol 13 (4) ◽ pp. 91 ◽ Author(s): Chunming Tang, Yanni Li, Xiaoxia Dong, Bo He

In this paper, we consider a class of structured optimization problems whose objective function is the sum of two convex functions, f and h, neither of which is necessarily differentiable. We focus in particular on the case where f is general and its exact first-order information (function value and subgradient) may be difficult to obtain, while h is relatively simple. We propose a generalized alternating linearization bundle method for solving this class of problems, which can handle inexact first-order information of on-demand accuracy. The inexact information can be very general, covering various oracles such as inexact, partially inexact, and asymptotically exact oracles. At each iteration, the algorithm solves two interrelated subproblems: one finds the proximal point of the polyhedral model of f plus the linearization of h; the other finds the proximal point of the linearization of f plus h. We establish global convergence of the algorithm under different types of inexactness. Finally, preliminary numerical results on a set of two-stage stochastic linear programming problems are encouraging.
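To make the two subproblems concrete, here is a hedged sketch in generic bundle-method notation; the symbols below are assumptions for illustration, not the paper's own notation. Here \check f_k denotes the cutting-plane (polyhedral) model of f built from the possibly inexact subgradient information, \bar f_k and \bar h_k denote affine linearizations of f and h at the current iterate x_k, and t_k > 0 is the proximal parameter.

% One iteration's two interrelated proximal subproblems (illustrative notation)
\begin{align*}
  y_{k+1} &\in \operatorname*{arg\,min}_{y}\;
      \check f_k(y) + \bar h_k(y) + \tfrac{1}{2 t_k}\,\lVert y - x_k \rVert^2
      && \text{(polyhedral model of } f \text{ plus linearization of } h\text{)}, \\
  z_{k+1} &\in \operatorname*{arg\,min}_{z}\;
      \bar f_k(z) + h(z) + \tfrac{1}{2 t_k}\,\lVert z - x_k \rVert^2
      && \text{(linearization of } f \text{ plus } h\text{)}.
\end{align*}

Because h is relatively simple, the second subproblem can often be solved in closed form or by a cheap specialized routine, while the first is a standard bundle-type quadratic programming subproblem.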

