A Hybrid Metaheuristic for Multiple Runways Aircraft Landing Problem Based on Bat Algorithm

2013, Vol 2013, pp. 1-8
Author(s): Jian Xie, Yongquan Zhou, Hongqing Zheng

The aircraft landing problem (ALP) is an NP-hard problem whose aim is to minimize the total cost of deviation of landing times from predefined target times while ensuring safe landings. In this paper, the multiple-runway case of the static ALP is considered and a hybrid metaheuristic based on the bat algorithm is presented to solve it. Moreover, four types of landing time assignment strategies are applied to allocate the scheduling times, and a constructive initialization is used to speed up the convergence rate. The computational results show that the proposed algorithm obtains high-quality, comparable solutions for instances with up to 500 aircraft, and that it is capable of finding optimal solutions for many instances in a short time.
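
The abstract does not spell out the hybrid itself; as a hedged point of reference, the sketch below implements the standard continuous bat algorithm (Yang, 2010) that such hybrids build on, not the paper's ALP-specific variant. All parameter values and the sphere test function are illustrative assumptions.

```python
import numpy as np

def bat_algorithm(obj, dim, n_bats=30, n_iter=200, f_min=0.0, f_max=2.0,
                  alpha=0.9, gamma=0.9, lower=-10.0, upper=10.0, seed=0):
    """Minimal continuous bat algorithm -- a baseline sketch, not the
    paper's hybrid. Minimizes obj over [lower, upper]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, (n_bats, dim))   # bat positions
    v = np.zeros((n_bats, dim))                    # bat velocities
    loud = np.ones(n_bats)                         # loudness A_i
    rate = np.full(n_bats, 0.5)                    # pulse emission rate r_i
    fit = np.array([obj(xi) for xi in x])
    best_i = fit.argmin()
    best, best_fit = x[best_i].copy(), fit[best_i]

    for t in range(1, n_iter + 1):
        for i in range(n_bats):
            # global move toward the current best at a random frequency
            freq = f_min + (f_max - f_min) * rng.random()
            v[i] += (x[i] - best) * freq
            cand = np.clip(x[i] + v[i], lower, upper)
            # occasional local random walk around the best solution
            if rng.random() > rate[i]:
                cand = np.clip(best + 0.01 * loud.mean() * rng.standard_normal(dim),
                               lower, upper)
            cand_fit = obj(cand)
            # accept improving moves, modulated by loudness; adapt A_i and r_i
            if cand_fit <= fit[i] and rng.random() < loud[i]:
                x[i], fit[i] = cand, cand_fit
                loud[i] *= alpha
                rate[i] = 0.5 * (1.0 - np.exp(-gamma * t))
            if cand_fit < best_fit:
                best, best_fit = cand.copy(), cand_fit
    return best, best_fit

# toy usage: minimize the sphere function
print(bat_algorithm(lambda z: float((z ** 2).sum()), dim=5))
```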

2013, Vol 645, pp. 290-294
Author(s): Hua Ping Wu, Min Huang, Xing Wei Wang

This paper investigates a single-machine scheduling problem with unequal release times and deteriorating jobs, where the objective is to minimize the makespan. A mixed-integer programming model is developed for this problem, which is NP-hard. The model is tested on examples and compared with a heuristic algorithm introduced by Lee et al. The branch-and-bound algorithm proposed by Lee et al. also obtains the same optimal solutions as CPLEX, but its CPU time for even 28 jobs exceeds 2 hours, so the two are not compared further. The computational tests show that the proposed model is effective, obtaining optimal solutions within a short time; it is therefore useful and valuable for decision makers who require optimal solutions.
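
The abstract does not fix the deterioration model; a common form in this literature makes a job's actual processing time grow linearly with its start time, p_j(t) = p_j + b_j·t, and that linear form is an assumption in the hedged sketch below, which simply evaluates the makespan of a fixed sequence under release times.

```python
def makespan(sequence, release, base, rate):
    """Makespan of a job sequence on a single machine with release times and
    linearly deteriorating processing times p_j(t) = base[j] + rate[j] * t,
    where t is the job's start time (the linear form is an assumption)."""
    t = 0.0
    for j in sequence:
        start = max(t, release[j])             # cannot start before release
        t = start + base[j] + rate[j] * start  # actual processing grows with start
    return t

# toy instance: 3 jobs, sequenced in release order
print(makespan([0, 1, 2], release=[0, 2, 4], base=[3, 2, 1], rate=[0.1, 0.2, 0.05]))
```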


Author(s): Ruiyang Song, Kuang Xu

We propose and analyze a temporal concatenation heuristic for solving large-scale finite-horizon Markov decision processes (MDPs), which divides the MDP into smaller sub-problems along the time horizon and generates an overall solution by simply concatenating the optimal solutions from these sub-problems. As a “black box” architecture, temporal concatenation works with a wide range of existing MDP algorithms. Our main results characterize the regret of temporal concatenation compared to the optimal solution. We provide upper bounds for general MDP instances, as well as a family of MDP instances in which the upper bounds are shown to be tight. Together, our results demonstrate temporal concatenation's potential for substantial speed-up at the expense of some performance degradation.
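
As a hedged sketch of the “black box” idea, the code below solves a finite-horizon tabular MDP by backward induction, splits the horizon into k equal segments (H divisible by k, and a zero terminal value for each sub-problem, are assumptions the abstract leaves open), and concatenates the per-segment policies.

```python
import numpy as np

def backward_induction(P, R, H):
    """Optimal policy for a finite-horizon tabular MDP via backward induction.
    P: (A, S, S) transition tensor, R: (A, S) rewards, H: horizon length."""
    A, S, _ = P.shape
    V = np.zeros(S)                    # zero terminal value (an assumption)
    policy = np.zeros((H, S), dtype=int)
    for h in reversed(range(H)):
        Q = R + P @ V                  # Q[a, s] = R[a, s] + E[V(next state)]
        policy[h] = Q.argmax(axis=0)
        V = Q.max(axis=0)
    return policy

def temporal_concatenation(P, R, H, k=2):
    """Split the horizon into k equal segments (H divisible by k assumed),
    solve each sub-MDP independently, and concatenate the segment policies."""
    seg = H // k
    return np.vstack([backward_induction(P, R, seg) for _ in range(k)])

# toy 2-action, 3-state MDP with a 10-step horizon split into 2 sub-problems
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(3), size=(2, 3))   # random row-stochastic transitions
R = rng.random((2, 3))
print(temporal_concatenation(P, R, H=10, k=2))
```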


2018, Vol 9 (4), pp. 22-36
Author(s): Mohammed Mahseur, Abdelmadjid Boukra, Yassine Meraihi

Multicast routing is the problem of finding a spanning tree over a set of destinations, whose root is the source node and whose leaves are the destination nodes, optimizing a set of quality-of-service parameters while satisfying a set of transmission constraints. This article proposes a new hybrid multicast algorithm, the Hybrid Multi-objective Multicast Algorithm (HMMA), based on the Strength Pareto Evolutionary Algorithm (SPEA), which evaluates the population and partitions it into dominated and non-dominated solutions. Dominated solutions are evolved by the Bat Algorithm, and non-dominated solutions by the Firefly Algorithm; old and weak solutions are replaced with new random solutions through a mutation process. The simulation results demonstrate that the proposed algorithm finds good Pareto-optimal solutions compared with other algorithms.
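
As a hedged illustration of the SPEA-style classification step HMMA relies on, the sketch below partitions a population of QoS objective vectors into dominated and non-dominated sets; treating all objectives (e.g., delay, cost) as minimized is a convention assumed for the example.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better
    in at least one (all objectives minimized -- an illustrative convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def split_population(pop):
    """Partition solutions (tuples of objective values) into the
    non-dominated front and the dominated remainder."""
    non_dom, dom = [], []
    for i, a in enumerate(pop):
        if any(dominates(b, a) for j, b in enumerate(pop) if j != i):
            dom.append(a)
        else:
            non_dom.append(a)
    return non_dom, dom

# toy objective vectors: (delay, cost); (4, 4) is dominated by (3, 4)
pop = [(2, 5), (3, 4), (4, 4), (1, 7)]
print(split_population(pop))
```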


2019, pp. 427-434
Author(s):

The modern development of printing technologies, which makes it possible to reproduce various properties of visible images quickly and with high quality, has contributed to the improvement of existing methods of document forgery. High-quality technical examination of documents therefore requires continuous analysis of information about the forgery methods that have been detected. Analysis of the results of the Forensic Document Center of the State Border Guard Service of Ukraine shows an increase in cases of the use of counterfeit passport documents (passports of Israel, Turkey, France, and Sweden) in which a photograph of the bearer has been applied onto the laminate tape covering the data page. The density of the dye layer used to apply the second photograph made it impossible to detect the original photograph under infrared light. The second photograph was applied by overprinting the laminate tape with an inkjet printer, without disassembling the passport booklet. Analysis of the technical characteristics of modern contactless printing equipment shows that so-called UV printing technology was used to forge the passports of the countries listed above. Its distinctive feature is that the inks polymerize under the action of ultraviolet rays. This technology makes it possible to apply virtually any image to almost any material, including polymer tapes, and also to create high-quality relief images. Given these capabilities, UV printing can be used for high-quality imitation of intaglio printing and relief embossing. The use of this technology for document forgery can make forgeries harder to detect during border control, especially at peak hours at border crossing points, and it also makes forging various kinds of documents more economically attractive to offenders. As document security features improve, forgery methods will continue to be refined as well, and there is a shortage of information about such forgeries. It is therefore necessary to update existing informational materials, and to develop new ones, on the technological features of producing various kinds of documents with the printing methods that determine the properties of the visible images in a document. Key words: passport documents, printing technologies, forgery.


2015, Vol 3, pp. 348-355
Author(s): Jaroslava Kniežová

In modern times, competitiveness in the market depends on having a good information system. The companies developing and supplying information systems are in competition too, and an effective delivery process is critical for winning lucrative contracts. Software development companies therefore continuously try to improve their development processes to deliver the product quickly and with high quality. The agile approach potentially shortens delivery time and is very often used, having almost replaced the traditional process; more and more companies are adopting it to stay competitive in the software development market and to speed up product delivery. The traditional and agile approaches differ in certain respects, so the question arises as to whether the agile approach is the best choice for a software development company in every case. This article contains a comparison of the two approaches, as well as a case study of the agile approach in a real software development company that had previously used the traditional approach. The article also describes situations in which replacing the traditional approach with the agile one would improve results.


2022, Vol 40 (4), pp. 1-45
Author(s): Weiren Yu, Julie McCann, Chengyuan Zhang, Hakan Ferhatosmanoglu

SimRank is an attractive link-based similarity measure widely used in Web search and sociometry. However, the existing deterministic method by Kusumoto et al. [24] for retrieving SimRank does not always produce high-quality similarity results, as it fails to accurately obtain the diagonal correction matrix D. Moreover, SimRank has a “connectivity trait” problem: increasing the number of paths between a pair of nodes can decrease its similarity score. The best-known remedy, SimRank++ [1], cannot completely fix this problem, since its score is still zero if two nodes have no common in-neighbors. In this article, we study fast, high-quality link-based similarity search on billion-scale graphs. (1) We first devise a “varied-D” method to accurately compute SimRank in linear memory. We also aggregate duplicate computations, which reduces the time of [24] from quadratic to linear in the number of iterations. (2) We propose a novel “cosine-based” SimRank model to circumvent the “connectivity trait” problem. (3) To substantially speed up partial-pairs “cosine-based” SimRank search on large graphs, we devise an efficient dimensionality reduction algorithm, PSR#, with guaranteed accuracy. (4) We give mathematical insights into the semantic difference between SimRank and its variant, and correct an argument in [24] that “if D is replaced by a scaled identity matrix (1 − γ)I, their top-K rankings will not be affected much”. (5) We propose a novel method that can accurately convert from Li et al.'s SimRank S̃ to Jeh and Widom's SimRank S. (6) We propose GSR#, a generalisation of our “cosine-based” SimRank model, to quantify pairwise similarities across two distinct graphs, unlike SimRank, which would assess nodes across two graphs as completely dissimilar. Extensive experiments on various datasets demonstrate the superiority of our proposed approaches in terms of search quality, computational efficiency, accuracy, and scalability on billion-edge graphs.
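
As a point of reference for the measure being improved here, a hedged sketch of the classic Jeh–Widom SimRank iteration follows: the naive quadratic-space baseline, not the paper's linear-memory “varied-D” or “cosine-based” methods. The decay factor C = 0.6 and the dense-matrix representation are illustrative assumptions.

```python
import numpy as np

def simrank(adj, C=0.6, iters=10):
    """Naive Jeh-Widom SimRank on a small directed graph -- a baseline
    sketch only. adj[i, j] = 1 if there is an edge i -> j."""
    n = adj.shape[0]
    in_nbrs = [np.nonzero(adj[:, v])[0] for v in range(n)]  # in-neighbor lists
    S = np.eye(n)
    for _ in range(iters):
        S_next = np.eye(n)
        for a in range(n):
            for b in range(n):
                if a != b and len(in_nbrs[a]) and len(in_nbrs[b]):
                    # average pairwise similarity of in-neighbors, damped by C
                    S_next[a, b] = C * S[np.ix_(in_nbrs[a], in_nbrs[b])].mean()
        S = S_next
    return S

# toy 4-node graph: 0 -> 2, 1 -> 2, 0 -> 3, 1 -> 3
adj = np.zeros((4, 4), dtype=int)
adj[[0, 1, 0, 1], [2, 2, 3, 3]] = 1
print(simrank(adj).round(3))    # nodes 2 and 3 share in-neighbors {0, 1}
```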


2018, Vol 44 (5), pp. E3
Author(s): Spencer Twitchell, Hussam Abou-Al-Shaar, Jared Reese, Michael Karsy, Ilyas M. Eli, ...

OBJECTIVE With the continuous rise of health care costs, hospitals and health care providers must find ways to reduce costs while maintaining high-quality care. Comparing surgical and endovascular treatment of intracranial aneurysms may offer direction in reducing health care costs. The Value-Driven Outcomes (VDO) database at the University of Utah identifies cost drivers and tracks changes over time. In this study, the authors evaluate specific cost drivers for surgical clipping and endovascular management (i.e., coil embolization and flow diversion) of both ruptured and unruptured intracranial aneurysms using the VDO system.

METHODS The authors retrospectively reviewed surgical and endovascular treatment of ruptured and unruptured intracranial aneurysms from July 2011 to January 2017. Total cost (as a percentage of each patient's cost to the system), subcategory costs, and potential cost drivers were evaluated and analyzed.

RESULTS A total of 514 aneurysms in 469 patients were treated: 273 aneurysms were surgically clipped, 102 were repaired with coiling, and 139 were addressed with flow diverter placements. Middle cerebral artery aneurysms accounted for the largest portion of cases in the clipping group (29.7%), whereas anterior communicating artery aneurysms were most frequent in the coiling group (30.4%) and internal carotid artery aneurysms were the majority in the flow diverter group (63.3%). Coiling (mean total cost 0.25% ± 0.20%) had a higher cost than flow diversion (mean 0.20% ± 0.16%) and clipping (mean 0.17% ± 0.14%; p = 0.0001, 1-way ANOVA). Coiling cases cost 1.5 times as much as clipping, and flow diversion costs 1.2 times as much as clipping. Facility costs were the most significant contributor to intracranial clipping costs (60.2%), followed by supplies (18.3%). Supplies were the greatest contributor to coiling costs (43.2%), followed by facility costs (40.0%); similarly, supplies were the greatest portion of flow diversion costs (57.5%), followed by facility costs (28.5%). Cost differences by aneurysm location, rupture status, American Society of Anesthesiologists (ASA) grade, and discharge disposition could be identified, with variability depending on surgical procedure. A multivariate analysis showed that rupture status, surgical procedure type, ASA status, discharge disposition, and year of surgery all significantly affected cost (p < 0.0001).

CONCLUSIONS Facility utilization and supplies constitute the majority of total costs in aneurysm treatment strategies, but significant variation exists depending on surgical approach, rupture status, and patient discharge disposition. Developing and implementing approaches and protocols to improve resource utilization are important in reducing costs while maintaining high-quality patient care.
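
For readers wanting to reproduce the style of group comparison reported above, a hedged sketch follows using SciPy's one-way ANOVA on synthetic cost fractions drawn to roughly match the reported group means and standard deviations; the generated data are illustrative only, not the study's patient-level data.

```python
import numpy as np
from scipy import stats

# Illustrative only: synthetic per-patient cost fractions with roughly the
# reported means (clipping 0.17%, coiling 0.25%, flow diversion 0.20%).
rng = np.random.default_rng(1)
clipping  = rng.normal(0.17, 0.14, 273).clip(min=0.01)
coiling   = rng.normal(0.25, 0.20, 102).clip(min=0.01)
diversion = rng.normal(0.20, 0.16, 139).clip(min=0.01)

# one-way ANOVA across the three treatment groups, as in the abstract
f_stat, p_value = stats.f_oneway(clipping, coiling, diversion)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```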


Author(s): Sambit Kumar Mishra, Bibhudatta Sahoo, Kshira Sagar Sahoo, Sanjay Kumar Jena

The service (task) allocation problem in distributed computing is a form of the multidimensional knapsack problem, one of the best-known examples of a combinatorial optimization problem. Nature-inspired techniques are powerful mechanisms for addressing a large number of combinatorial optimization problems, since computing an optimal solution for many industrial and scientific problems is usually intractable; the service request allocation problem in distributed computing belongs to the class of NP-hard problems. The major portion of this chapter is a survey of mechanisms for the service allocation problem across different cloud computing architectures, with a brief discussion of the implementation issues of various metaheuristic techniques, such as Particle Swarm Optimization (PSO), Genetic Algorithms (GA), Ant Colony Optimization (ACO), and the Bat Algorithm, in various environments for service allocation in the cloud.
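
As a hedged illustration of the multidimensional-knapsack view of service allocation, the sketch below is a simple greedy baseline that ranks service requests by value per unit of normalized resource demand; it is not any specific algorithm surveyed in the chapter, and the resource names are assumptions.

```python
def greedy_mkp(values, weights, capacities):
    """Greedy heuristic for the multidimensional knapsack view of service
    allocation: each service has a value and consumes several resources;
    the host has one capacity per resource. A simple illustrative baseline."""
    n, m = len(values), len(capacities)
    # rank services by value per unit of (capacity-normalized) total demand
    density = lambda j: values[j] / sum(weights[j][k] / capacities[k]
                                        for k in range(m))
    remaining = list(capacities)
    chosen = []
    for j in sorted(range(n), key=density, reverse=True):
        if all(weights[j][k] <= remaining[k] for k in range(m)):
            chosen.append(j)                    # accept the request
            for k in range(m):
                remaining[k] -= weights[j][k]   # consume its resources
    return chosen

# toy instance: 4 service requests, 2 resources (CPU cores, GB RAM)
print(greedy_mkp(values=[10, 7, 5, 9],
                 weights=[[4, 8], [2, 3], [1, 2], [5, 6]],
                 capacities=[8, 12]))
```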


2020, Vol 31 (2), pp. 473-490
Author(s): Abhijeet Ghoshal, Jing Hao, Syam Menon, Sumit Sarkar

Although retailers recognize the potential value of sharing transactional data with supply chain partners, many remain reluctant to share. However, there is evidence that the extent of sharing would be greater if information sensitive to retailers could be concealed before sharing. Extant research has only considered sensitive information at the organizational level, which is rarely the case in reality: the retail industry has adapted its offerings to region-wide differences in customer tastes for decades, and differences in customer characteristics across regions lead to region-specific sensitive information in addition to any at the organizational level. This is the first paper to propose an approach to solve this version of the problem. Region-level requirements substantially increase the size of an already difficult (NP-hard) problem, making adaptations of existing approaches impractical. We present an ensemble approach that draws intuition from Lagrangian relaxation to conceal sensitive patterns at the organizational and regional levels with minimal damage to the data set. Extensive computational experiments show that it identifies optimal or near-optimal solutions even when other approaches fail, doing so without any loss in recommendation effectiveness. This mitigates potential risks associated with sharing and should increase data sharing among partners in the supply chain.
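
The paper's Lagrangian-relaxation ensemble is not reproduced here; as a hedged sketch of the underlying sanitization idea, the code below pushes the support of one sensitive itemset below a disclosure threshold by deleting a single item from supporting transactions. The victim-item choice, the threshold, and the toy data are all illustrative assumptions.

```python
def support(transactions, pattern):
    """Number of transactions that contain every item in the pattern."""
    return sum(pattern <= set(t) for t in transactions)

def conceal_pattern(transactions, pattern, max_support):
    """Push a sensitive itemset's support below max_support by deleting one
    of its items from supporting transactions -- a minimal sanitization
    sketch, not the paper's Lagrangian-relaxation ensemble."""
    pattern = set(pattern)
    victim = sorted(pattern)[0]          # item to delete; simplest fixed choice
    for t in transactions:
        if support(transactions, pattern) < max_support:
            break                        # pattern is already concealed
        if pattern <= set(t):
            t.remove(victim)             # one deletion keeps damage minimal
    return transactions

# toy transaction database; hide the region-sensitive pattern {bread, milk}
db = [["bread", "milk"], ["bread", "milk", "eggs"],
      ["milk", "eggs"], ["bread", "milk", "eggs"]]
conceal_pattern(db, {"bread", "milk"}, max_support=2)
print(db, support(db, {"bread", "milk"}))
```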

