Benchmark Study of Five Optimization Algorithms for Weather Routing

Author(s):  
Helong Wang ◽  
Wengang Mao ◽  
Leif Eriksson

Safety and energy efficiency are two of the key issues in the maritime transport community. A sail plan system, which combines the concepts of weather routing and voyage optimization, is recognized by the shipping industry as an efficient measure to ensure a ship’s safety, gain more economic benefit, and reduce negative effects on the environment. In such a system, the key component is a proper optimization algorithm that generates potential ship routes between a ship’s departure point and destination. In the weather routing market, four routing optimization algorithms are commonly used: the so-called modified Isochrone and Isopone methods, dynamic programming, three-dimensional dynamic programming, and Dijkstra’s algorithm. Each optimization algorithm has its own advantages and disadvantages for estimating a ship route with the shortest sailing time and/or minimum fuel consumption. This paper presents a benchmark study that compares these algorithms for routing optimization aiming at minimum fuel consumption. A merchant ship sailing in the North Atlantic, with full-scale performance measurements, is employed as the case study vessel for the comparison. The ship’s speed/power performance is based on the ISO2015 methods combined with the measurement data. The study is expected to demonstrate the pros and cons of the different algorithms for the ship’s sail planning.
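
For illustration, a minimal sketch of how one of the compared algorithms, Dijkstra’s algorithm, applies to this setting: waypoints form a graph and each edge carries an estimated fuel consumption for that leg under the forecast weather. The graph structure, waypoint names, and fuel values below are hypothetical and not taken from the paper.

```python
# Minimal sketch: Dijkstra's algorithm over a discretized route graph where each
# edge weight is an estimated fuel consumption (tonnes) for that leg under the
# forecast weather. Graph, waypoints, and weights here are hypothetical.
import heapq

def dijkstra_min_fuel(graph, departure, destination):
    """graph: dict mapping waypoint -> list of (neighbour, fuel_cost) tuples."""
    best = {departure: 0.0}
    previous = {}
    queue = [(0.0, departure)]
    while queue:
        fuel, node = heapq.heappop(queue)
        if node == destination:
            break
        if fuel > best.get(node, float("inf")):
            continue  # stale queue entry
        for neighbour, cost in graph.get(node, []):
            candidate = fuel + cost
            if candidate < best.get(neighbour, float("inf")):
                best[neighbour] = candidate
                previous[neighbour] = node
                heapq.heappush(queue, (candidate, neighbour))
    # Reconstruct the route from destination back to departure.
    route, node = [destination], destination
    while node != departure:
        node = previous[node]
        route.append(node)
    return list(reversed(route)), best[destination]

# Hypothetical 4-waypoint graph between departure "A" and destination "D".
graph = {
    "A": [("B", 12.5), ("C", 14.0)],
    "B": [("D", 11.0)],
    "C": [("D", 8.5)],
}
route, fuel = dijkstra_min_fuel(graph, "A", "D")
print(route, fuel)  # ['A', 'C', 'D'] 22.5
```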

2021 ◽  
Vol 11 (10) ◽  
pp. 4382
Author(s):  
Ali Sadeghi ◽  
Sajjad Amiri Doumari ◽  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Pavel Trojovský ◽  
...  

Optimization is the science of finding the best solution among the available solutions, subject to an optimization problem’s limitations. Optimization algorithms have been introduced as efficient tools for solving optimization problems. These algorithms are designed based on various natural phenomena, the behavior and lifestyles of living beings, physical laws, rules of games, etc. In this paper, a new optimization algorithm called the good and bad groups-based optimizer (GBGBO) is introduced to solve various optimization problems. In GBGBO, population members update under the influence of two groups, named the good group and the bad group. The good group consists of a certain number of population members with better fitness function values than the other members, and the bad group consists of a number of population members with worse fitness function values than the other members of the population. GBGBO is mathematically modeled, and its performance in solving optimization problems is tested on a set of twenty-three different objective functions. In addition, for further analysis, the results obtained from the proposed algorithm were compared with eight optimization algorithms: the genetic algorithm (GA), particle swarm optimization (PSO), the gravitational search algorithm (GSA), teaching–learning-based optimization (TLBO), the grey wolf optimizer (GWO), the whale optimization algorithm (WOA), the tunicate swarm algorithm (TSA), and the marine predators algorithm (MPA). The results show that the proposed GBGBO algorithm has a good ability to solve various optimization problems and is more competitive than other similar algorithms.
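
A schematic sketch of the group-based update idea described above: each member is attracted toward the good group and repelled from the bad group. The exact GBGBO update equations are given in the paper; the update rule, coefficients, and function names below are illustrative assumptions only.

```python
# Schematic, hypothetical sketch of a good/bad-group update step, not the
# authors' exact GBGBO equations.
import numpy as np

def gbgbo_like_step(population, fitness, group_size=5, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    order = np.argsort(fitness)                               # minimization: lower is better
    good_mean = population[order[:group_size]].mean(axis=0)   # centre of the good group
    bad_mean = population[order[-group_size:]].mean(axis=0)   # centre of the bad group
    r1 = rng.random(population.shape)
    r2 = rng.random(population.shape)
    # Pull each member toward the good group and push it away from the bad group.
    return population + r1 * (good_mean - population) - r2 * (bad_mean - population)

# Example usage on a random population evaluated with the sphere function.
pop = np.random.default_rng(1).uniform(-10, 10, size=(20, 5))
fit = (pop ** 2).sum(axis=1)
new_pop = gbgbo_like_step(pop, fit)
```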


Mathematics ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 1190
Author(s):  
Mohammad Dehghani ◽  
Zeinab Montazeri ◽  
Štěpán Hubálovský

There are many optimization problems in different disciplines of science that must be solved using an appropriate method. Population-based optimization algorithms are one of the most efficient ways to solve various optimization problems; they are able to provide appropriate solutions based on a random search of the problem-solving space, without the need for gradient or derivative information. In this paper, a new optimization algorithm called the Group Mean-Based Optimizer (GMBO) is presented; it can be applied to solve optimization problems in various fields of science. The main idea in designing the GMBO is to use the information of the different members of the algorithm population more effectively, based on two selected groups named the good group and the bad group. Two new composite members are obtained by averaging each of these groups, and they are used to update the population members. The various stages of the GMBO are described and mathematically modeled with the aim of being used to solve optimization problems. The performance of the GMBO in providing a suitable quasi-optimal solution is evaluated on a set of 23 standard objective functions of different types: unimodal, high-dimensional multimodal, and fixed-dimensional multimodal. In addition, the optimization results obtained from the proposed GMBO were compared with eight other widely used optimization algorithms, including the Marine Predators Algorithm (MPA), the Tunicate Swarm Algorithm (TSA), the Whale Optimization Algorithm (WOA), the Grey Wolf Optimizer (GWO), Teaching–Learning-Based Optimization (TLBO), the Gravitational Search Algorithm (GSA), Particle Swarm Optimization (PSO), and the Genetic Algorithm (GA). The optimization results indicated the acceptable performance of the proposed GMBO, and, based on the analysis and comparison of the results, the GMBO was found to be superior to and much more competitive than the other eight algorithms.
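
A minimal sketch of the group-mean idea: the means of the good and bad groups act as two composite members that guide the update of each population member, demonstrated here on the sphere function (a standard unimodal benchmark). The update rule, coefficients, and greedy acceptance are illustrative assumptions, not the exact GMBO equations from the paper.

```python
# Illustrative sketch of a group-mean-guided population optimizer; the concrete
# update formula is an assumption, not the published GMBO method.
import numpy as np

def sphere(x):
    """Unimodal test function from the standard 23-function benchmark set."""
    return float(np.sum(x ** 2))

def gmbo_like(objective, dim=10, pop_size=30, group_size=5, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-100, 100, size=(pop_size, dim))
    fit = np.array([objective(x) for x in pop])
    for _ in range(iters):
        order = np.argsort(fit)
        good_mean = pop[order[:group_size]].mean(axis=0)   # composite "good" member
        bad_mean = pop[order[-group_size:]].mean(axis=0)   # composite "bad" member
        for i in range(pop_size):
            r1, r2 = rng.random(dim), rng.random(dim)
            trial = pop[i] + r1 * (good_mean - pop[i]) - r2 * (bad_mean - pop[i])
            trial_fit = objective(trial)
            if trial_fit < fit[i]:                         # greedy acceptance
                pop[i], fit[i] = trial, trial_fit
    best = int(np.argmin(fit))
    return pop[best], fit[best]

x_best, f_best = gmbo_like(sphere)
print(f_best)  # should approach 0 for the sphere function
```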


2013 ◽  
Vol 341-342 ◽  
pp. 896-900
Author(s):  
Bao Jiang Sun ◽  
Yue Xu

This paper briefly describes the heading measurement principles of the ultrasonic positioning system (UPS) and the digital magnetic compass (DMC) and analyzes the advantages and disadvantages of each option. To improve the accuracy of the heading measurement, a UPS/DMC integrated navigation system is designed with the adaptive Kalman filter as its theoretical basis. A simulation experiment based on real measurement data from both sensors confirms the feasibility of the navigation system.
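
A minimal sketch of the fusion idea, assuming both UPS and DMC provide direct heading measurements: a scalar Kalman filter with a random-walk heading model combines the two measurement sequences. The paper's adaptive Kalman filter tunes the noise statistics online; the constant noise values and function names below are illustrative assumptions.

```python
# Minimal sketch of UPS/DMC heading fusion with a scalar Kalman filter.
# The constant noise values and the random-walk heading model are assumptions,
# not the authors' adaptive design.
import numpy as np

def fuse_heading(ups_heading, dmc_heading, q=0.01, r_ups=4.0, r_dmc=1.0):
    """Fuse two heading sequences (degrees) into one filtered estimate."""
    x = dmc_heading[0]          # initial heading estimate
    p = 10.0                    # initial estimate variance
    fused = []
    for z_ups, z_dmc in zip(ups_heading, dmc_heading):
        # Predict: random-walk heading model (variance grows by q each step).
        p += q
        # Update sequentially with each sensor; both measure heading directly.
        for z, r in ((z_ups, r_ups), (z_dmc, r_dmc)):
            k = p / (p + r)     # Kalman gain
            x += k * (z - x)
            p *= (1.0 - k)
        fused.append(x)
    return np.array(fused)
```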


This paper discusses various optimization algorithm design techniques: the greedy method, dynamic programming, and branch and bound. Problems that come under optimization require finding either a maximum or a minimum. In all these techniques we have multiple inputs and some constraints, and we have to find a feasible solution using these inputs and constraints. In the greedy method we follow a predefined selection rule and, using that rule, we reach the solution. In contrast, in dynamic programming we take a decision at every step, and in the end we reach the solution. In branch and bound we create a state space tree and explore all possibilities of each live node; based on some constraint, we start killing some live nodes. Here, firstly I will discuss all the design techniques, and then the types of problems that can be solved with each design technique and their time complexities.
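
As an illustration of the contrast between the greedy method and dynamic programming, the hypothetical 0/1 knapsack instance below shows a case where the greedy value/weight rule misses the optimum that the dynamic-programming table finds.

```python
# Hypothetical 0/1 knapsack instance contrasting greedy and dynamic programming.

def knapsack_greedy(values, weights, capacity):
    # Greedy: take items in decreasing value/weight ratio while they still fit.
    order = sorted(range(len(values)), key=lambda i: values[i] / weights[i], reverse=True)
    total_value, remaining = 0, capacity
    for i in order:
        if weights[i] <= remaining:
            total_value += values[i]
            remaining -= weights[i]
    return total_value

def knapsack_dp(values, weights, capacity):
    # Dynamic programming: best[w] = best value achievable with capacity w.
    best = [0] * (capacity + 1)
    for v, wt in zip(values, weights):
        for w in range(capacity, wt - 1, -1):   # iterate backwards for 0/1 items
            best[w] = max(best[w], best[w - wt] + v)
    return best[capacity]

values, weights, capacity = [60, 100, 120], [10, 20, 30], 50
print(knapsack_greedy(values, weights, capacity))  # 160 (greedy is suboptimal here)
print(knapsack_dp(values, weights, capacity))      # 220 (optimal)
```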

