Studying interconnections between two classes of two-stage fuzzy optimization problems

2012 ◽ Vol 17 (4) ◽ pp. 569-578
Author(s): Yankui Liu ◽ Xuejie Bai

Author(s): Lu Chen ◽ Handing Wang ◽ Wenping Ma

Abstract Real-world optimization applications in complex systems typically involve multiple factors to be optimized simultaneously, which can be formulated as multi-objective optimization problems. Such problems have been solved by many evolutionary algorithms, such as MOEA/D, NSGA-III, and KnEA. However, as the numbers of decision variables and objectives grow, the computational cost of these algorithms becomes unaffordable. To reduce this high computational cost on large-scale many-objective optimization problems, we propose a two-stage framework. The first stage combines a multi-tasking optimization strategy with a bi-directional search strategy, reformulating the original problem as a multi-tasking optimization problem in the decision space to enhance convergence. To improve diversity, the second stage applies multi-tasking optimization to a number of sub-problems defined by reference points in the objective space. To show the effectiveness of the proposed algorithm, we test it on the DTLZ and LSMOP problems and compare it with existing algorithms; it outperforms the compared algorithms in most cases, showing advantages in both convergence and diversity.
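As a rough illustration of how an objective space can be split into reference-point-based sub-problems, as in the second stage described above, the following sketch generates Das-Dennis reference points and associates each solution with the closest reference direction by perpendicular distance. The construction, function names, and association rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: splitting a many-objective population into sub-problems
# via reference points in the objective space (illustrative, not the authors' code).
from itertools import combinations_with_replacement
import numpy as np

def das_dennis_points(n_obj: int, divisions: int) -> np.ndarray:
    """Uniform reference points on the unit simplex (Das-Dennis construction)."""
    points = []
    for combo in combinations_with_replacement(range(n_obj), divisions):
        counts = np.bincount(combo, minlength=n_obj)
        points.append(counts / divisions)
    return np.array(points)

def associate(objectives: np.ndarray, ref_points: np.ndarray) -> np.ndarray:
    """Assign each (normalized) objective vector to the reference direction
    with the smallest perpendicular distance, defining one sub-problem each."""
    directions = ref_points / np.linalg.norm(ref_points, axis=1, keepdims=True)
    proj = objectives @ directions.T                      # scalar projections, shape (N, R)
    residual = objectives[:, None, :] - proj[:, :, None] * directions[None, :, :]
    return np.linalg.norm(residual, axis=2).argmin(axis=1)

# Example: 3 objectives with 12 divisions gives 91 sub-problems.
refs = das_dennis_points(3, 12)
population_objs = np.random.rand(200, 3)
sub_problem_index = associate(population_objs, refs)
```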



2020 ◽ Vol 77 (2) ◽ pp. 539-569
Author(s): Nicolas Kämmerling ◽ Jannis Kurtz

Abstract In this work we study binary two-stage robust optimization problems with objective uncertainty. We present an algorithm that efficiently computes lower bounds for the binary two-stage robust problem by alternately solving the underlying deterministic problem and an adversarial problem. For the deterministic problem, any oracle that returns an optimal solution for a given scenario can be used. We show that this lower bound can be embedded in a branch-and-bound procedure in which branching is performed only over the first-stage decision variables. All results also hold for non-linear objective functions that are concave in the uncertain parameters. As an alternative solution method, we apply a column-and-constraint generation algorithm to the binary two-stage robust problem with objective uncertainty. We test both algorithms on benchmark instances of the uncapacitated single-allocation hub-location problem and of the capital budgeting problem. Our results show that the branch-and-bound procedure outperforms the column-and-constraint generation algorithm.
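A minimal sketch of the alternating lower-bounding idea described above, under the simplifying assumption of a finite scenario set; the oracle `solve_deterministic`, the `cost` function, and the enumeration-based adversarial step are illustrative placeholders rather than the authors' implementation.

```python
# Sketch of an alternating lower bound for a binary two-stage robust problem
# with objective uncertainty, assuming a finite scenario set (illustrative only).
from typing import Callable, Iterable, Sequence

Scenario = Sequence[float]          # one realization of the uncertain objective
Solution = Sequence[int]            # a binary decision vector

def alternating_lower_bound(
    scenarios: Iterable[Scenario],
    solve_deterministic: Callable[[Scenario], Solution],   # the oracle
    cost: Callable[[Solution, Scenario], float],
    max_iter: int = 50,
    tol: float = 1e-6,
) -> float:
    """Alternate between the deterministic oracle and an adversarial step.

    The optimal value of the deterministic problem for any fixed scenario in
    the uncertainty set is a valid lower bound on the robust (min-max) optimum,
    so the loop searches for the scenario whose deterministic optimum is largest.
    """
    scenarios = list(scenarios)
    scenario = scenarios[0]
    solutions, best_lb = [], float("-inf")
    for _ in range(max_iter):
        x = solve_deterministic(scenario)                 # optimal for this fixed scenario
        solutions.append(x)
        best_lb = max(best_lb, cost(x, scenario))
        # Adversarial step: the scenario that is worst for the solutions seen so far
        # (plain enumeration here, since the scenario set is assumed finite).
        scenario = max(scenarios, key=lambda s: min(cost(sol, s) for sol in solutions))
        adv_value = min(cost(sol, scenario) for sol in solutions)
        if adv_value <= best_lb + tol:                    # no scenario can improve the bound
            break
    return best_lb
```

For a general (non-finite) uncertainty set, the adversarial step would itself be an optimization problem over that set rather than an enumeration.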



Author(s): Amir Ardestani-Jaafari ◽ Erick Delage

In this article, we discuss an alternative method for deriving conservative approximation models for two-stage robust optimization problems. The method relies mainly on a linearization scheme employed in bilinear programming; we therefore say that it gives rise to linearized robust counterpart models. We identify a close relation between this linearized robust counterpart model and the popular affinely adjustable robust counterpart model. We also describe how both types of models can be modified to make the approximations less conservative; these modifications are heavily inspired by the use of valid linear and conic inequalities in the linearization of bilinear models. Finally, we demonstrate how to employ this new scheme in location-transportation and multi-item newsvendor problems to improve the numerical efficiency and performance guarantees of robust optimization.
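As a concrete illustration of the kind of linearization used in bilinear programming that the scheme above refers to, the sketch below builds the classical McCormick envelope for a single bilinear product w = x·y over a box. The function name and coefficient format are assumptions for illustration, not the authors' model.

```python
# Classical McCormick envelope linearizing the bilinear equation w = x*y
# over a box; each product in a bilinear model gets four such inequalities.
from typing import List, Tuple

def mccormick_envelope(xl: float, xu: float, yl: float, yu: float) -> List[Tuple[float, float, float, float]]:
    """Coefficients (a, b, c, rhs) of the four linear inequalities
    a*x + b*y + c*w <= rhs that relax w = x*y over xl <= x <= xu, yl <= y <= yu."""
    return [
        ( yl,  xl, -1.0,  xl * yl),   # w >= yl*x + xl*y - xl*yl
        ( yu,  xu, -1.0,  xu * yu),   # w >= yu*x + xu*y - xu*yu
        (-yl, -xu,  1.0, -xu * yl),   # w <= yl*x + xu*y - xu*yl
        (-yu, -xl,  1.0, -xl * yu),   # w <= yu*x + xl*y - xl*yu
    ]

# Quick sanity check: every corner of the box [0,3] x [-1,2] satisfies all four cuts.
for x, y in [(0.0, -1.0), (0.0, 2.0), (3.0, -1.0), (3.0, 2.0)]:
    w = x * y
    assert all(a * x + b * y + c * w <= rhs + 1e-9
               for a, b, c, rhs in mccormick_envelope(0.0, 3.0, -1.0, 2.0))
```

Tightening such envelopes with additional valid linear and conic inequalities is what makes the resulting linearized counterparts less conservative.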



Author(s): Weilin Nie ◽ Cheng Wang

Abstract Online learning is a classical class of algorithms for optimization problems. Due to its low computational cost, it is widely used in many areas of machine learning and statistical learning, and its convergence performance depends heavily on the step size. In this paper, a two-stage step size is proposed for the unregularized online learning algorithm based on reproducing kernels. Theoretically, we prove that such an algorithm can achieve a nearly minimax convergence rate, up to a logarithmic term, without any capacity condition.
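The following sketch only illustrates the generic structure of unregularized, kernel-based online learning with a step size that switches between two decay regimes. The Gaussian kernel, the switch point, and both decay exponents are illustrative assumptions; the schedule that actually attains the near-minimax rate is derived in the paper.

```python
# Illustrative sketch of unregularized online kernel learning with a
# two-stage step size (schedule and kernel are hypothetical choices).
import numpy as np

def gaussian_kernel(x, y, sigma=0.5):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def step_size(t: int, switch: int) -> float:
    # Stage 1: slower decay to move quickly toward the target function.
    # Stage 2: faster decay to suppress the variance of later updates.
    return 0.5 * t ** -0.25 if t <= switch else 0.5 * t ** -0.75

def online_kernel_regression(stream, switch=200, sigma=0.5):
    """stream yields (x_t, y_t); returns centers and coefficients of the
    learned function f(x) = sum_i alpha_i * K(x_i, x)."""
    centers, alphas = [], []
    for t, (x, y) in enumerate(stream, start=1):
        # Current prediction at the new point.
        pred = sum(a * gaussian_kernel(c, x, sigma) for c, a in zip(centers, alphas))
        eta = step_size(t, switch)
        # Gradient step for the squared loss: f_{t+1} = f_t - eta * (f_t(x) - y) * K(x, .)
        centers.append(x)
        alphas.append(-eta * (pred - y))
    return centers, alphas

# Toy usage: learn y = sin(2*pi*x) from a noisy stream.
rng = np.random.default_rng(0)
data = ((x, np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal())
        for x in rng.random(500))
centers, alphas = online_kernel_regression(data)
```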



2021
Author(s): Mingxuan Zhao ◽ Yulin Han ◽ Jian Zhou

Abstract The operational law put forward by Zhou et al. for strictly monotone functions of regular LR fuzzy numbers has given a valuable push to the development of fuzzy set theory. However, its applicability is confined to strictly monotone functions and regular LR fuzzy numbers, which restricts its use in practice to a certain degree. In this paper, we propose an extensive operational law that generalizes the one proposed by Zhou et al. to monotone (but not necessarily strictly monotone) functions of regular LR fuzzy intervals (LR-FIs), of which regular fuzzy numbers are particular cases. By means of the extensive operational law, the inverse credibility distributions (ICDs) of monotone functions of regular LR-FIs can be computed efficiently and effectively. Moreover, the extensive operational law has a wider range of applications and can handle situations that are difficult for the original operational law. Subsequently, based on the extensive operational law, computational formulae for the expected values (EVs) of LR-FIs and of monotone functions of regular LR-FIs are presented. Furthermore, the proposed operational law is applied to fuzzy optimization problems with regular LR-FIs, for which a solution strategy is provided: the fuzzy program is first converted into a deterministic equivalent and then solved by a newly devised algorithm. Finally, the proposed solution strategy is applied to a purchasing planning problem, and its performance is evaluated against a traditional fuzzy simulation-based genetic algorithm. Experimental results indicate that our method is much more efficient, yielding high-quality solutions within a short time.
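As a small numerical illustration, the sketch below assumes the simplest regular LR fuzzy interval, a trapezoidal one with linear reference functions. It computes the inverse credibility distribution, obtains the expected value as the integral of the ICD over (0, 1), and composes the ICD with a strictly increasing function, which is the strictly monotone special case of the operational law. All names are illustrative; the general LR shape functions and the non-strictly-monotone extension are treated in the paper.

```python
# Minimal sketch for a trapezoidal fuzzy interval (a, b, c, d) with linear
# reference functions -- the simplest regular LR fuzzy interval.
import numpy as np

def icd_trapezoidal(alpha, a, b, c, d):
    """Inverse credibility distribution Phi^{-1}(alpha) for alpha in (0, 1).
    (At alpha = 0.5 the distribution is flat on [b, c]; this sketch returns c.)"""
    alpha = np.asarray(alpha, dtype=float)
    return np.where(alpha < 0.5,
                    a + 2.0 * alpha * (b - a),
                    d - 2.0 * (1.0 - alpha) * (d - c))

def expected_value(icd, n=10_000):
    """EV as the integral of the inverse credibility distribution over (0, 1),
    approximated by the midpoint rule."""
    alphas = (np.arange(n) + 0.5) / n
    return icd(alphas).mean()

# Trapezoidal interval (1, 2, 3, 6): closed-form EV is (a+b+c+d)/4 = 3.
xi = lambda alpha: icd_trapezoidal(alpha, 1.0, 2.0, 3.0, 6.0)
print(expected_value(xi))                              # ~3.0

# Strictly increasing f: the operational law gives the ICD of f(xi) as f(Phi^{-1}(.)),
# so E[f(xi)] is the integral of f(Phi^{-1}(alpha)) over (0, 1).
f = np.exp
print(expected_value(lambda alpha: f(xi(alpha))))
```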



Author(s): Said Rahal ◽ Zukui Li ◽ Dimitri J. Papageorgiou

