A Quantum Approximate Optimization Algorithm with Metalearning for MaxCut Problem and Its Simulation via TensorFlow Quantum

2021
Vol 2021
pp. 1-11
Author(s):
Haibin Wang
Jiaojiao Zhao
Bosi Wang
Lian Tong

A quantum approximate optimization algorithm (QAOA) is a polynomial-time approximate optimization algorithm used to solve combinatorial optimization problems. However, existing QAOA algorithms have poor generalization performance in finding an optimal solution from the feasible solution set of a combinatorial problem. To address this problem, a quantum approximate optimization algorithm with metalearning for the MaxCut problem (MetaQAOA) is proposed. Specifically, a quantum neural network (QNN) is constructed in the form of a parameterized quantum circuit to detect different topological phases of matter, and a classical long short-term memory (LSTM) neural network is used as a black-box optimizer that quickly assists the QNN in finding approximately optimal QAOA parameters. Experimental simulation via TensorFlow Quantum (TFQ) shows that MetaQAOA requires fewer iterations to reach the threshold of the loss function, and the loss value reached after training is smaller than that of the comparison methods. In addition, the algorithm learns parameter-update heuristics that generalize to larger system sizes and still outperform other initialization strategies at those scales.
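
As a rough illustration of the quantum side of this setup, the sketch below builds a p-layer QAOA ansatz for MaxCut in Cirq with symbolic parameters, as one would before handing the circuit and parameters to a TensorFlow Quantum training loop. The helper name, edge-list format, and layer structure are assumptions for illustration, not the authors' implementation, and the LSTM meta-optimizer is omitted.

```python
import cirq
import sympy

def maxcut_qaoa_circuit(qubits, edges, p):
    """Illustrative p-layer QAOA ansatz for MaxCut (not the authors' exact circuit)."""
    gammas = sympy.symbols(f"gamma0:{p}")
    betas = sympy.symbols(f"beta0:{p}")
    circuit = cirq.Circuit(cirq.H.on_each(qubits))               # uniform superposition
    for layer in range(p):
        for i, j in edges:                                       # cost layer: exp(-i*gamma*Z_i Z_j)
            circuit += cirq.ZZ(qubits[i], qubits[j]) ** gammas[layer]
        circuit += cirq.rx(2 * betas[layer]).on_each(qubits)     # mixer layer
    return circuit, list(gammas) + list(betas)

# example: triangle graph on three qubits, depth p = 2
qubits = cirq.LineQubit.range(3)
circ, params = maxcut_qaoa_circuit(qubits, edges=[(0, 1), (1, 2), (0, 2)], p=2)
print(circ)
```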

Author(s):  
Laurens Bliek
Sicco Verwer
Mathijs de Weerdt

When a black-box optimization objective can only be evaluated with costly or noisy measurements, most standard optimization algorithms are unsuited to finding the optimal solution. Specialized algorithms that deal with exactly this situation make use of surrogate models. These models are usually continuous and smooth, which is beneficial for continuous optimization problems but not necessarily for combinatorial problems. However, we show that by choosing the basis functions of the surrogate model in a certain way, the optimal solution of the surrogate model can be guaranteed to be integer. This approach outperforms random search, simulated annealing, and a Bayesian optimization algorithm on the problem of finding robust routes for a noise-perturbed traveling salesman benchmark problem, performs similarly to another Bayesian optimization algorithm, and outperforms all compared algorithms on a convex binary optimization problem with a large number of variables.
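
A toy one-dimensional sketch of the key property: if the surrogate is built from piecewise-linear (hinge) basis functions whose breakpoints lie on the integer grid, its minimum over a box is attained at an integer point, so only integer candidates need checking. The hinge locations and weights below are made up for illustration and are not fitted to any benchmark from the paper.

```python
import numpy as np

def surrogate(x, centers, weights, bias):
    # piecewise-linear surrogate: bias + sum_k w_k * max(0, x - c_k)
    return bias + np.sum(weights * np.maximum(0.0, x - centers))

centers = np.arange(0, 6)                                  # hinge locations on the integer grid
weights = np.array([-2.0, -1.0, 0.5, 1.5, 2.0, 3.0])       # illustrative (not fitted) weights
bias = 4.0

# because the surrogate is piecewise linear with integer breakpoints, evaluating
# the integer candidates suffices; the continuous minimum can be no better
candidates = np.arange(0, 7)
values = np.array([surrogate(x, centers, weights, bias) for x in candidates])
print("integer minimizer:", candidates[values.argmin()], "value:", values.min())
```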


2021
Vol 11 (1)
Author(s):
Z. Fahimi
M. R. Mahmoodi
H. Nili
Valentin Polishchuk
D. B. Strukov

The increasing utility of specialized circuits and growing applications of optimization call for the development of efficient hardware accelerators for solving optimization problems. The Hopfield neural network is a promising approach for solving combinatorial optimization problems owing to recent demonstrations of efficient mixed-signal implementations based on emerging non-volatile memory devices. Such mixed-signal accelerators also enable very efficient implementation of various annealing techniques, which are essential for finding optimal solutions. Here we propose a "weight annealing" approach, whose main idea is to ease convergence to the global minima by keeping the network close to its ground state. This is achieved by initially setting all synaptic weights to zero, thus ensuring a quick transition of the Hopfield network to its trivial global minimum state, and then gradually introducing the weights during the annealing process. Extensive numerical simulations show that our approach leads, on average, to better solutions for several representative combinatorial problems compared with prior Hopfield neural network solvers with chaotic or stochastic annealing. As a proof of concept, a 13-node graph partitioning problem and a 7-node maximum-weight independent set problem are solved experimentally using mixed-signal circuits based on, respectively, a 20 × 20 analog-grade TiO2 memristive crossbar and a 12 × 10 eFlash memory array.
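
A minimal software sketch of the weight-annealing idea, assuming a symmetric weight matrix W and binary ±1 neurons: the synaptic weights are scaled up from zero while the network is repeatedly relaxed with asynchronous updates. The schedule and update rule are illustrative placeholders, not the authors' mixed-signal implementation.

```python
import numpy as np

def weight_annealed_hopfield(W, n_steps=200, rng=None):
    """Illustrative weight annealing: scale synaptic weights from 0 to full strength
    while relaxing a binary Hopfield network with asynchronous updates."""
    rng = np.random.default_rng() if rng is None else rng
    n = W.shape[0]
    state = np.ones(n)                       # with all-zero weights, any state is a ground state
    for step in range(1, n_steps + 1):
        scale = step / n_steps               # gradually "switch on" the weights
        for i in rng.permutation(n):
            state[i] = 1.0 if (scale * W[i]) @ state >= 0 else -1.0
    return state, -0.5 * state @ W @ state   # final state and its Hopfield energy
```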


2020
Vol 10 (20)
pp. 7116
Author(s):
Jaeho Choi
Seunghyeok Oh
Joongheon Kim

This paper proposes an application algorithm based on the quantum approximate optimization algorithm (QAOA) for wireless scheduling problems. QAOA is one of the promising hybrid quantum-classical algorithms for combinatorial optimization problems and provides good approximate solutions to NP-hard problems. QAOA maps the given problem onto a Hilbert space and generates a Hamiltonian for the given objective and constraints. QAOA then finds proper parameters through a classical optimization loop that optimizes the expectation value of the generated Hamiltonian. Based on these parameters, the optimal solution to the given problem can be obtained from the optimum of the expectation value of the Hamiltonian. Inspired by QAOA, a quantum approximate optimization for scheduling (QAOS) algorithm is proposed. The proposed QAOS designs the Hamiltonian of the wireless scheduling problem, which is formulated as a maximum-weight independent set (MWIS) problem. The designed Hamiltonian is converted into a unitary operator and implemented as a quantum gate operation. After that, the iterative QAOS sequence solves the wireless scheduling problem. The novelty of QAOS is verified with simulation results implemented via Cirq and TensorFlow Quantum.
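
As a sketch of how an MWIS objective can be written as a cost Hamiltonian for a QAOA-style workflow in Cirq (the paper's exact construction and penalty handling may differ), the snippet below encodes vertex selection as x_i = (1 - Z_i)/2, rewards selected weights, and penalizes selecting both endpoints of an edge; constant offsets are dropped. The helper name and penalty value are assumptions.

```python
import cirq

def mwis_cost_hamiltonian(qubits, weights, edges, penalty=10.0):
    """Illustrative MWIS cost: minimize -sum_i w_i x_i + penalty * sum_(i,j) x_i x_j
    with x_i = (1 - Z_i)/2; constant offsets are omitted."""
    ham = cirq.PauliSum()
    for i, w in enumerate(weights):
        ham += (w / 2.0) * cirq.Z(qubits[i])                  # reward term -w_i * x_i
    for i, j in edges:
        ham += (penalty / 4.0) * (cirq.Z(qubits[i]) * cirq.Z(qubits[j])
                                  - cirq.Z(qubits[i]) - cirq.Z(qubits[j]))  # penalty * x_i x_j
    return ham

qubits = cirq.LineQubit.range(3)
print(mwis_cost_hamiltonian(qubits, weights=[1.0, 2.0, 1.5], edges=[(0, 1), (1, 2)]))
```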


2014
Vol 8 (1)
pp. 723-728
Author(s):
Chenhao Niu
Xiaomin Xu
Yan Lu
Mian Xing

Short-term load forecasting is essential for the daily planning and operation of an electric power system. It is an important basis for economic dispatch, scheduling, and safe operation. Neural networks, which have strong nonlinear fitting capability, are widely used in load forecasting and achieve good prediction accuracy on nonlinear chaotic time series. However, neural networks easily fall into local optima and may fail to find the globally optimal solution. This paper integrates traditional optimization algorithms and proposes a hybrid intelligent optimization algorithm based on the particle swarm optimization algorithm and the ant colony optimization algorithm (ACO-PSO) to improve the generalization ability of the neural network. In the empirical analysis, electricity consumption in a certain area is selected for validation. Compared with the traditional BP neural network and statistical methods, the experimental results demonstrate that the improved model achieves more precise results and stronger generalization ability.
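
To make the swarm side of such a hybrid concrete, here is a generic particle swarm loop over a candidate weight vector for the forecasting network. The loss function, bounds, and hyperparameters are placeholders, and the ant-colony component of the paper's ACO-PSO hybrid is not reproduced, since the abstract does not spell out the exact hybridization.

```python
import numpy as np

def pso_optimize(loss_fn, dim, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic PSO loop: particles are candidate network weight vectors moved toward
    their personal-best and the global-best positions."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1.0, 1.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([loss_fn(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([loss_fn(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# toy usage: minimize a quadratic standing in for the forecasting loss
best_w, best_loss = pso_optimize(lambda p: float(np.sum((p - 0.3) ** 2)), dim=10)
```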


2012
Vol 215-216
pp. 592-596
Author(s):
Li Gao
Rong Rong Wang

To deal with complex product design optimization problems involving both discrete and continuous variables, a mixed-variable collaborative design optimization algorithm is put forward based on collaborative optimization, which is an efficient way to solve mixed-variable design optimization problems. Following the "divide and conquer" principle, the algorithm decouples the problem into several relatively simple subsystems. The optimal solution is then obtained through a collaborative mechanism. Finally, a case study shows the feasibility and effectiveness of the new algorithm.


2016
Vol 38 (4)
pp. 307-317
Author(s):  
Pham Hoang Anh

In this paper, the optimal sizing of truss structures is solved using a novel evolutionary-based optimization algorithm. The efficiency of the proposed method lies in the combination of global search and local search, in which the global move is applied to a set of random solutions whereas the local move is performed on the other solutions in the search population. Three truss sizing benchmark problems with discrete variables are used to examine the performance of the proposed algorithm. The objective function of each optimization problem is the minimum weight of the whole truss structure, and the constraints are the stresses in members and the displacements at nodes. Here, the constraints and objective function are treated separately so that both function and constraint evaluations can be saved. The results show that the new algorithm can find optimal solutions effectively and is competitive with some recent metaheuristic algorithms in terms of the number of structural analyses required.
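
A rough sketch of one generation of such a mixed global/local search over discrete section indices: a random subset of designs is resampled globally while the rest receive a one-member local perturbation. The function names (`penalized_weight`), the fraction of global moves, and the move sizes are hypothetical placeholders, not the paper's algorithm or its constraint-handling scheme.

```python
import numpy as np

def evolve_one_generation(pop, n_sections, global_frac=0.3, rng=None):
    """Illustrative generation: designs are integer indices into a discrete list of
    available cross-sections; some take a global (re-sampling) move, the rest a
    local one-member perturbation."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape
    n_global = max(1, int(global_frac * n))
    idx = rng.permutation(n)
    global_idx, local_idx = idx[:n_global], idx[n_global:]
    pop = pop.copy()
    pop[global_idx] = rng.integers(0, n_sections, (n_global, d))        # global move
    for i in local_idx:                                                 # local move
        j = rng.integers(d)
        pop[i, j] = np.clip(pop[i, j] + rng.choice([-1, 1]), 0, n_sections - 1)
    return pop
```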


Author(s):  
Zuo Dai
Jianzhong Cha

Artificial neural networks, particularly the Hopfield-Tank network, have been effectively applied to a variety of tasks formulated as large-scale combinatorial optimization problems, such as the Travelling Salesman Problem and the N-Queens Problem [1]. The problem of optimally packing a set of geometries into a space with finite dimensions arises frequently in many applications and is far more difficult than the general NP-complete problems listed in [2]. Until now, within accepted time limits, it could only be solved with heuristic methods for very simple cases (e.g., 2D layout). In this paper we propose a heuristic-based Hopfield neural network designed to solve the two-dimensional rectangular packing problem, which is itself NP-complete [3]. By comparing the adequacy and efficiency of the results with those obtained by several other exact and heuristic approaches, we conclude that the proposed method has great potential for solving 2D packing problems.
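
As a sketch of how a packing task can be cast in Hopfield-Tank energy terms (the paper's heuristic network and exact penalty terms are not reproduced here), binary placement neurons v[i, x, y] mark the lower-left corner of rectangle i on a discrete grid, and placement-uniqueness and overlap constraints enter as quadratic penalties; the encoding, grid, and penalty weights are assumptions for illustration.

```python
import numpy as np

def packing_energy(v, sizes, grid, A=10.0, B=10.0):
    """Illustrative Hopfield-style energy for 2-D rectangular packing:
    v[i, x, y] = 1 means rectangle i is anchored at cell (x, y)."""
    W, H = grid
    # each rectangle must be placed exactly once
    placement = ((v.sum(axis=(1, 2)) - 1.0) ** 2).sum()
    # overlap: count how many rectangles cover each grid cell
    cover = np.zeros((W, H))
    for i, (w, h) in enumerate(sizes):
        for x in range(W):
            for y in range(H):
                if v[i, x, y] > 0.5 and x + w <= W and y + h <= H:
                    cover[x:x + w, y:y + h] += 1
    overlap = (np.maximum(cover - 1.0, 0.0) ** 2).sum()
    return A * placement + B * overlap

# toy usage: two 2x1 rectangles placed without overlap on a 3x3 grid -> energy 0
v = np.zeros((2, 3, 3))
v[0, 0, 0] = 1.0
v[1, 0, 2] = 1.0
print(packing_energy(v, sizes=[(2, 1), (2, 1)], grid=(3, 3)))
```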


Author(s):  
Rizk M. Rizk-Allah
Aboul Ella Hassanien

This chapter presents a hybrid optimization algorithm, namely FOA-FA, for solving single- and multi-objective optimization problems. The proposed algorithm integrates the benefits of the fruit fly optimization algorithm (FOA) and the firefly algorithm (FA) to avoid entrapment in local optima and premature convergence of the population. FOA drives the search toward the optimum solution, while the firefly algorithm (FA) is used to accelerate the optimum-seeking process and speed up convergence to the global solution. Further, the multi-objective optimization problem is scalarized into a single-objective problem by the weighting method, and the proposed algorithm is applied to derive the non-inferior solutions rather than a single optimal solution. Finally, the proposed FOA-FA algorithm is tested on various single- and multi-objective benchmark problems and two engineering applications. The numerical comparisons reveal the robustness and effectiveness of the proposed algorithm.
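
The weighting method mentioned above is simple enough to show directly: the objective vector is collapsed into one scalar objective F(x) = sum_k w_k f_k(x), which any single-objective solver (here, FOA-FA) can then minimize. The two objective functions and weights below are hypothetical and only illustrate the scalarization, not the chapter's benchmarks.

```python
import numpy as np

def weighted_scalarization(objectives, weights):
    """Weighting method: collapse a list of objective functions into one scalar objective."""
    def scalar_objective(x):
        return sum(w * f(x) for w, f in zip(weights, objectives))
    return scalar_objective

# illustrative two-objective example (hypothetical functions, not from the chapter)
f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 2.0) ** 2
F = weighted_scalarization([f1, f2], weights=[0.6, 0.4])
print(F(np.array([0.5, 1.0])))
```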


Author(s):  
Arslan Ali Syed
Irina Gaponova
Klaus Bogenberger

The majority of transportation problems include optimizing some sort of cost function. These optimization problems are often NP-hard, and their computation time grows exponentially with model size. The problem of matching vehicles to passenger requests in ride hailing (RH) contexts typically falls into this category. Metaheuristics are often utilized for such problems with the aim of finding a globally optimal solution. However, such algorithms usually include many parameters that need to be tuned to obtain good performance. Typically, multiple simulations are run on diverse small-size problems, and the parameter values that perform best on average are chosen for subsequent larger simulations. In contrast to this approach, we propose training a neural network to predict the parameter values that work best for an instance of the given problem. We show that various features, based on the problem instance and shareability-graph statistics, can be used to predict the solution quality of a matching problem in RH services. Consequently, the values corresponding to the best predicted solution can be selected for the actual problem. We study the effectiveness of the above-described approach for the static assignment of vehicles to passengers in RH services. We utilized DriveNow data from Bavarian Motor Works (BMW) to generate passenger requests inside Munich, and for the metaheuristic we used a large neighborhood search (LNS) algorithm combined with a shareability graph.
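
A minimal sketch of the prediction step, assuming a regressor maps instance/shareability-graph features plus candidate LNS parameter values to predicted matching quality; the parameter set with the best prediction is then chosen for the new instance. The feature layout, dimensions, and training data below are synthetic placeholders, not the paper's features or results.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# synthetic history: 5 instance features + 3 LNS parameter values -> observed quality
X_train = rng.random((500, 8))
y_train = rng.random(500)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
model.fit(X_train, y_train)

instance_features = rng.random(5)                 # features of the new problem instance
param_candidates = rng.random((20, 3))            # candidate LNS parameter sets to rank
X_test = np.hstack([np.tile(instance_features, (20, 1)), param_candidates])
best_params = param_candidates[int(np.argmax(model.predict(X_test)))]
print("parameters with best predicted quality:", best_params)
```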


2012
Vol 433-440
pp. 2808-2816
Author(s):
Jian Jin Zheng
You Shen Xia

This paper presents a new interactive neural network for solving constrained multi-objective optimization problems. The constrained multi-objective optimization problem is reformulated into two constrained single-objective optimization problems, and two neural networks are designed to obtain the optimal weight and the optimal solution of the two optimization problems, respectively. The proposed algorithm has low computational complexity and is easy to implement. Moreover, the proposed algorithm is successfully applied to the design of digital filters. Computed results illustrate the good performance of the proposed algorithm.
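
For readers unfamiliar with neural-dynamics optimizers of this kind, the sketch below shows a generic projection-type neural network for a constrained single-objective problem, integrated with Euler steps; it illustrates the general approach only and is not the paper's two-network scheme. The example problem, step size, and projection operator are assumptions.

```python
import numpy as np

def projection_network(grad_f, project, x0, step=0.01, n_steps=5000):
    """Illustrative projection neural network: Euler integration of
    dx/dt = P_Omega(x - grad f(x)) - x, whose equilibria satisfy the KKT conditions
    for minimizing f over the convex feasible set Omega."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x + step * (project(x - grad_f(x)) - x)
    return x

# example: minimize ||x - c||^2 over the box [0, 1]^2 (illustrative problem)
c = np.array([1.5, -0.3])
grad_f = lambda x: 2.0 * (x - c)
project = lambda x: np.clip(x, 0.0, 1.0)
print(projection_network(grad_f, project, x0=np.zeros(2)))   # converges to [1, 0]
```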

