Reduction of Solution Space in the Automotive Safety Integrity Levels Allocation Problem

Author(s):  
Youcef Gheraibia ◽  
Khaoula Djafri ◽  
Habiba Krimou


2016 ◽  
Vol 12 (1) ◽  
pp. 103-113 ◽  
Author(s):  
Mohammed Ibrahim ◽  
Haider AlSabbagh

Considerable work has been conducted to cope with orthogonal frequency division multiple access (OFDMA) resource allocation using different algorithms and methods. However, most of the available studies optimize the system for only one or two parameters under simple practical conditions/constraints. This paper presents analyses and simulations of dynamic OFDMA resource allocation implemented with a Modified Multi-Dimension Genetic Algorithm (MDGA), an extension of the standard genetic algorithm. MDGA models the resource allocation problem to find the optimal or near-optimal solution for both subcarrier and power allocation in OFDMA. It takes into account the power and subcarrier constraints, channel and noise distributions, the distance between user equipment (UE) and base stations (BS), and user priority weights, so as to approximate the parameters that most affect OFDMA systems. At the same time, the multi-dimension genetic algorithm allows the solution space of the resource allocation problem to be explored effectively with its evolutionary operators: multi-dimension crossover and multi-dimension mutation. Four important cases are addressed and analyzed for resource allocation of an OFDMA system under specific operation scenarios to meet the standard specifications of different advanced communication systems. The obtained results demonstrate that MDGA is effective in finding the optimal or near-optimal solution for both subcarrier and power allocation in OFDMA.
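
As an illustration of the genetic-algorithm machinery behind such approaches, the sketch below evolves a subcarrier-to-user assignment that maximizes an assumed Shannon sum rate under equal per-subcarrier power. It is a plain single-objective GA with illustrative channel gains and parameters, not the paper's MDGA or its multi-dimension operators.

```python
# Minimal GA sketch for OFDMA subcarrier assignment (illustrative only).
# Assumptions: equal power per subcarrier, random channel power gains,
# Shannon sum rate as fitness; not the paper's MDGA operators.
import numpy as np

rng = np.random.default_rng(0)

N_USERS, N_SUB = 4, 32
GAINS = rng.exponential(1.0, size=(N_USERS, N_SUB))  # assumed |h|^2 per user/subcarrier
P_SUB, NOISE = 1.0, 0.1                              # power per subcarrier, noise power

def sum_rate(assign):
    """Shannon sum rate (bit/s/Hz) of a subcarrier-to-user assignment vector."""
    snr = P_SUB * GAINS[assign, np.arange(N_SUB)] / NOISE
    return float(np.log2(1.0 + snr).sum())

def tournament(pop, scores, k=2):
    """Pick k random individuals and return a copy of the fittest."""
    idx = rng.integers(0, len(pop), size=k)
    return pop[idx[np.argmax(scores[idx])]].copy()

def evolve(pop_size=40, generations=200, p_mut=0.05):
    pop = rng.integers(0, N_USERS, size=(pop_size, N_SUB))
    for _ in range(generations):
        scores = np.array([sum_rate(ind) for ind in pop])
        children = [pop[int(np.argmax(scores))].copy()]   # elitism: keep the best
        while len(children) < pop_size:
            a, b = tournament(pop, scores), tournament(pop, scores)
            cut = rng.integers(1, N_SUB)                   # single-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(N_SUB) < p_mut               # mutation: reassign subcarriers
            child[flip] = rng.integers(0, N_USERS, flip.sum())
            children.append(child)
        pop = np.array(children)
    scores = np.array([sum_rate(ind) for ind in pop])
    return pop[int(np.argmax(scores))], float(scores.max())

best_assign, best_rate = evolve()
print(f"best sum rate: {best_rate:.2f} bit/s/Hz")
```

Power allocation, user priorities, and distance-dependent path loss would enter through the fitness function and additional chromosome dimensions; they are omitted here to keep the sketch short.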


2019 ◽  
Vol 18 (04) ◽  
pp. 527-548
Author(s):  
Arash Zaretalab ◽  
Vahid Hajipour

One of the most practical optimization problems in the reliability field is the redundancy allocation problem (RAP). This problem optimizes the reliability of a system by adding redundant components to subsystems under some constraints. In recent years, various meta-heuristic algorithms have been applied to find a local or global optimum for RAP formulations in which redundancy strategies are chosen. Among these algorithms, simulated annealing (SA) is a capable one: it uses a mathematical analogue of the physical annealing process to find the global optimum. In this paper, we present a new simulated annealing algorithm, knowledge-based simulated annealing (KBSA), to solve RAP for the series-parallel system when the redundancy strategy can be chosen for individual subsystems. In KBSA, the SA part searches the solution space for good solutions, while the knowledge model saves the knowledge of good solutions and feeds it back to the algorithm. This approach achieves the optimal result for some instances in the literature. To evaluate the performance of the proposed algorithm, it is compared with well-known algorithms from the literature on different test problems. The results illustrate that the proposed algorithm performs well in obtaining the desired results.
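
For context, the sketch below applies plain simulated annealing to a small series-parallel RAP with a single active-redundancy strategy and illustrative reliability, cost, and budget figures; the paper's KBSA additionally maintains a knowledge model that feeds information about good solutions back into the search.

```python
# Plain simulated-annealing sketch for a series-parallel redundancy
# allocation problem (RAP). All data are illustrative; the paper's KBSA
# adds a knowledge model on top of this basic loop.
import math, random

random.seed(0)

R = [0.80, 0.85, 0.90, 0.75]     # assumed component reliabilities per subsystem
C = [3.0, 4.0, 5.0, 2.0]         # assumed component costs
BUDGET = 40.0                    # assumed total cost constraint
MAX_RED = 5                      # at most 5 components per subsystem

def reliability(n):
    """Series system of parallel subsystems, n[i] identical components each."""
    rel = 1.0
    for ri, ni in zip(R, n):
        rel *= 1.0 - (1.0 - ri) ** ni
    return rel

def feasible(n):
    return sum(ci * ni for ci, ni in zip(C, n)) <= BUDGET

def neighbour(n):
    """Add or remove one redundant component in a random subsystem."""
    m = n[:]
    i = random.randrange(len(m))
    m[i] = min(MAX_RED, max(1, m[i] + random.choice((-1, 1))))
    return m

def anneal(t0=1.0, cooling=0.995, steps=20_000):
    cur = [1] * len(R)
    best, t = cur[:], t0
    for _ in range(steps):
        cand = neighbour(cur)
        if feasible(cand):
            delta = reliability(cand) - reliability(cur)
            # Metropolis criterion: always accept improvements, sometimes accept worse moves.
            if delta >= 0 or random.random() < math.exp(delta / t):
                cur = cand
                if reliability(cur) > reliability(best):
                    best = cur[:]
        t *= cooling
    return best, reliability(best)

allocation, system_rel = anneal()
print(f"redundancy levels: {allocation}, system reliability: {system_rel:.4f}")
```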


2016 ◽  
pp. 760-782 ◽  
Author(s):  
Dilip Kumar ◽  
Bibhudatta Sahoo ◽  
Tarni Mandal

Energy consumption in the cloud is proportional to resource utilization, and data centers are among the world's largest consumers of electricity. The complexity of the resource allocation problem increases with the size of the cloud infrastructure and becomes difficult to solve effectively. The exponential solution space of the resource allocation problem can be searched using heuristic techniques to obtain a sub-optimal solution in acceptable time. This chapter presents the resource allocation problem in cloud computing as a linear programming problem with the objective of minimizing the energy consumed in computation. The problem is treated using heuristic approaches; in particular, we use the two-phase selection algorithms 'FcfsRand', 'FcfsRr', 'FcfsMin', 'FcfsMax', 'MinMin', 'MedianMin', 'MaxMin', 'MinMax', 'MedianMax', and 'MaxMax'. The simulation results favor MaxMax.
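
To make the two-phase idea concrete, the sketch below implements a generic greedy allocator in the spirit of the chapter's naming scheme: phase one selects a task (here, the largest unassigned one), phase two selects a machine for it. The task lengths, machine speeds and power figures, the energy model (active power times execution time), and the exact phase-two rule are illustrative assumptions, not the chapter's definitions.

```python
# Minimal sketch of a two-phase greedy task-to-machine allocation heuristic.
# Phase 1: pick the largest unassigned task ("Max" task rule, assumed).
# Phase 2: pick the machine with the lowest energy for that task (assumed rule).

# Task lengths in millions of instructions (MI) -- illustrative values.
tasks = [12000, 4000, 9000, 1500, 7000, 3000]

# Machines: (speed in MIPS, active power in watts) -- illustrative values.
machines = [(2000, 120.0), (3000, 200.0), (1500, 80.0)]

def energy(length, speed, power):
    """Energy (J) to run a task of `length` MI on a machine: power * execution time."""
    return power * (length / speed)

def two_phase_allocate(tasks, machines):
    finish = [0.0] * len(machines)          # accumulated busy time per machine
    total_energy, plan = 0.0, []
    # Phase 1: consider tasks in decreasing order of length.
    for length in sorted(tasks, reverse=True):
        # Phase 2: cheapest machine in energy, ties broken by earliest finish time.
        best = min(range(len(machines)),
                   key=lambda m: (energy(length, *machines[m]), finish[m]))
        speed, power = machines[best]
        finish[best] += length / speed
        total_energy += energy(length, speed, power)
        plan.append((length, best))
    return plan, total_energy, max(finish)

plan, joules, makespan = two_phase_allocate(tasks, machines)
print(f"energy = {joules:.0f} J, makespan = {makespan:.1f} s")
```

Swapping the phase-one ordering (Fcfs, Min, Median, Max) or the phase-two machine rule (random, round-robin, min, max) reproduces the family of heuristics the chapter compares, under the same assumed energy model.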


Author(s):  
Mihalis M. Golias ◽  
Maria Boilé ◽  
Sotirios Theofanis ◽  
Heidi A. Taboada

Berth scheduling can be described as the problem of allocating berth space to vessels in a container terminal. When defining the allocation of berths to vessels, container terminal operators set several objectives which ideally need to be optimized simultaneously. These objectives are often non-commensurable, and improving one often degrades performance on the others. In this paper, the authors present the application of a multi-objective decision and analysis approach to the berth scheduling problem, a resource allocation problem at container terminals. The proposed approach allows the port operator to efficiently select a subset of solutions over the entire solution space of berth schedules when multiple, conflicting objectives are involved. Results from extensive computational examples using real-world data show that the proposed approach constructs and selects efficient berth schedules, is consistent, and can be used with confidence.
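
As a small illustration of selecting a subset of schedules under conflicting objectives, the sketch below filters candidate berth schedules down to their non-dominated (Pareto) set for two minimized objectives. The candidate scores are invented, and the authors' evaluation-and-selection procedure is more involved than this filter.

```python
# Minimal sketch: keep only non-dominated (Pareto) berth schedules under
# two minimized objectives. Candidate scores are illustrative.
from typing import List, Tuple

# Each candidate scored as (name, total service time, total delay), both objectives minimized.
candidates: List[Tuple[str, float, float]] = [
    ("schedule_A", 410.0, 12.0),
    ("schedule_B", 395.0, 20.0),
    ("schedule_C", 430.0, 8.0),
    ("schedule_D", 405.0, 25.0),   # dominated by schedule_B
]

def dominates(a, b):
    """True if a is at least as good as b on both objectives and strictly better on one."""
    return a[1] <= b[1] and a[2] <= b[2] and (a[1] < b[1] or a[2] < b[2])

pareto = [s for s in candidates
          if not any(dominates(other, s) for other in candidates if other is not s)]
print([name for name, *_ in pareto])   # ['schedule_A', 'schedule_B', 'schedule_C']
```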

