Neural network operations and Suzuki–Trotter evolution of neural network states

2018 ◽  
Vol 16 (08) ◽  
pp. 1840008 ◽  
Author(s):  
Nahuel Freitas ◽  
Giovanna Morigi ◽  
Vedran Dunjko

It was recently proposed to leverage the representational power of artificial neural networks, in particular Restricted Boltzmann Machines, to model complex quantum states of many-body systems [G. Carleo and M. Troyer, Science 355(6325) (2017) 602]. States represented in this way, called Neural Network States (NNSs), were shown to display interesting properties, such as the ability to efficiently capture long-range quantum correlations. However, identifying an optimal neural network representation of a given state can be challenging, and so far this problem has been addressed with stochastic optimization techniques. In this work, we explore a different direction. We study how the action of elementary quantum operations modifies NNSs. We parametrize a family of many-body quantum operations that can be applied directly to states represented by Unrestricted Boltzmann Machines, simply by adding hidden nodes and updating the network parameters. We show that this parametrization contains a set of universal quantum gates, from which it follows that the state prepared by any quantum circuit can be expressed as a Neural Network State with a number of hidden nodes that grows linearly with the number of elementary operations in the circuit. This is a powerful representation theorem (recently obtained with different methods), but it is not directly useful, since there is no general and efficient way to extract information from this unrestricted description of quantum states. To circumvent this problem, we propose a stepwise procedure based on the projection of Unrestricted quantum states onto Restricted quantum states. Two approximate methods to perform this projection are discussed. In this way, we show that it is in principle possible to approximately optimize or evolve Neural Network States without relying on stochastic methods such as Variational Monte Carlo, which are computationally expensive.
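The Boltzmann-machine representation referenced above can be made concrete with a short sketch. The following is a minimal illustration (not the authors' code) of the standard RBM wave-function ansatz: after tracing out the hidden units, the unnormalized amplitude of a spin configuration factorizes into a visible-bias term times a product of hyperbolic cosines, one per hidden node; the parameter shapes and values here are arbitrary placeholders.

```python
import numpy as np

# Minimal sketch of an RBM quantum-state amplitude for N visible spins
# and M hidden nodes. With visible biases a, hidden biases b, and
# weights W, the (unnormalized) amplitude is
#   psi(s) = exp(a . s) * prod_j 2*cosh(b_j + sum_i W_ij s_i).
# In general a, b, W are complex; real values are used here for brevity.
def rbm_amplitude(s, a, b, W):
    """s: array of +/-1 spins; a: visible biases; b: hidden biases; W: weights."""
    theta = b + W.T @ s                     # effective field on each hidden node
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

rng = np.random.default_rng(0)
N, M = 4, 8                                 # 4 spins, 8 hidden nodes (arbitrary)
a = 0.1 * rng.standard_normal(N)
b = 0.1 * rng.standard_normal(M)
W = 0.1 * rng.standard_normal((N, M))
s = np.array([1, -1, 1, -1])
print(rbm_amplitude(s, a, b, W))            # one basis-state amplitude
```

In the picture described in the abstract, applying an elementary gate to such a state amounts to appending one or more hidden nodes and updating `a`, `b`, and `W`, which is why the hidden-node count grows linearly with the circuit depth.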

Author(s):  
Eric Zou ◽  
Erik Long ◽  
Erhai Zhao

Abstract Neural network quantum states provide a novel representation of the many-body states of interacting quantum systems and open up a promising route to solving frustrated quantum spin models that evade other numerical approaches. Yet their capacity to describe complex magnetic orders with large unit cells has not been demonstrated, and their performance in a rugged energy landscape has been questioned. Here we apply restricted Boltzmann machines and stochastic gradient descent to seek the ground states of a compass spin model on the honeycomb lattice, which unifies the Kitaev model, the Ising model, and the quantum 120-degree model with a single tuning parameter. We report calculation results on the variational energy, order parameters, and correlation functions. The phase diagram obtained is in good agreement with the predictions of the tensor network ansatz, demonstrating the capacity of restricted Boltzmann machines to learn the ground states of frustrated quantum spin Hamiltonians. The limitations of the calculation are discussed, and a few strategies are outlined to address some of the challenges in machine learning frustrated quantum magnets.
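The quantity being minimized in such calculations is the variational energy of the RBM state. As a toy illustration (assumed setup, not the paper's compass model or its stochastic sampler), the sketch below evaluates the variational energy of a small RBM state for a classical Ising chain, H = -Σ s_i s_{i+1}, by exact enumeration; exact enumeration is feasible only for a handful of spins, which is why the paper instead uses stochastic gradient descent with sampled configurations.

```python
import numpy as np
from itertools import product

# RBM amplitude (real parameters for simplicity).
def rbm_amp(s, a, b, W):
    return np.exp(a @ s) * np.prod(2 * np.cosh(b + W.T @ s))

# Diagonal toy Hamiltonian: nearest-neighbor Ising chain.
def ising_energy(s):
    return -sum(s[i] * s[i + 1] for i in range(len(s) - 1))

# <psi|H|psi> / <psi|psi> by summing over all 2^n configurations.
def variational_energy(a, b, W, n):
    num = den = 0.0
    for bits in product([-1, 1], repeat=n):
        s = np.array(bits)
        p = rbm_amp(s, a, b, W) ** 2        # |psi(s)|^2
        num += p * ising_energy(s)
        den += p
    return num / den

rng = np.random.default_rng(1)
n, m = 4, 6                                 # 4 spins, 6 hidden nodes (arbitrary)
E = variational_energy(0.1 * rng.standard_normal(n),
                       0.1 * rng.standard_normal(m),
                       0.1 * rng.standard_normal((n, m)), n)
print(E)
```

For this 4-spin chain the energy of any configuration lies in [-3, 3], so the variational energy must as well; gradient descent on the RBM parameters would push it toward the ground-state value -3.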


2021 ◽  
Author(s):  
Ying Yang ◽  
Huaixin Cao

Abstract With the rapid development of machine learning, artificial neural networks provide a powerful tool to represent or approximate many-body quantum states. It was proved that every graph state can be generated by a neural network. In this paper, we introduce digraph states and explore their neural network representations (NNRs). Based on a discussion of digraph states and neural network quantum states (NNQSs), we explicitly construct the NNR for any digraph state, implying that every digraph state is an NNQS. These results provide a theoretical foundation for solving the quantum many-body problem with machine learning methods when the wave function is an unknown digraph state or can be approximated by digraph states.
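For context on the kind of wave function involved, the sketch below computes the standard computational-basis amplitudes of an (undirected) graph state, ψ(x) = 2^{-n/2} ∏_{(a,b)∈E} (-1)^{x_a x_b}; this is the construction the cited earlier result represents with a neural network. The digraph generalization introduced in the paper is not reproduced here.

```python
import numpy as np
from itertools import product

# Amplitude of a graph state in the computational basis:
#   psi(x) = 2^{-n/2} * prod over edges (a,b) of (-1)^{x_a * x_b}.
def graph_state_amplitude(x, edges, n):
    sign = 1
    for a, b in edges:
        sign *= (-1) ** (x[a] * x[b])
    return sign / np.sqrt(2 ** n)

n = 3
edges = [(0, 1), (1, 2)]                    # a 3-qubit path graph
amps = np.array([graph_state_amplitude(x, edges, n)
                 for x in product([0, 1], repeat=n)])
print(np.sum(amps ** 2))                    # normalization check
```

Every amplitude has equal magnitude 2^{-n/2} with a sign fixed by the edge set, which is exactly the kind of sign structure a Boltzmann-machine ansatz can encode in its weights.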


2018 ◽  
Vol 121 (16) ◽  
Author(s):  
Kenny Choo ◽  
Giuseppe Carleo ◽  
Nicolas Regnault ◽  
Titus Neupert

Author(s):  
Ying Yang ◽  
Chengyang Zhang ◽  
Huaixin Cao

The many-body problem in quantum physics originates from the difficulty of describing the non-trivial correlations encoded in the exponential complexity of the many-body wave function. Motivated by Giuseppe Carleo's work "Solving the quantum many-body problem with artificial neural networks" [Science, 2017, 355: 602], we focus on finding the NNQS approximation of the unknown ground state of a given Hamiltonian $H$ in terms of the best relative error, and we explore how the sum, tensor product, and local unitary transformations of Hamiltonians influence the best relative error. We illustrate our method with some examples.


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Johannes Borregaard ◽  
Matthias Christandl ◽  
Daniel Stilck França

Abstract We describe a resource-efficient approach to studying many-body quantum states on noisy, intermediate-scale quantum devices. We employ a sequential generation model that allows us to bound the range of correlations in the resulting many-body quantum states. From this, we characterize situations where the estimation of local observables does not require the preparation of the entire state. Instead, smaller patches of the state can be generated, from which the observables can be estimated. This can potentially reduce the circuit size and the number of qubits needed to compute physical properties of the states. Moreover, we show that the effect of noise decreases along the computation. Our results apply to a broad class of widely studied tensor network states and can be directly applied to near-term implementations of variational quantum algorithms.


2016 ◽  
Vol 7 (2) ◽  
pp. 105-112
Author(s):  
Adhi Kusnadi ◽  
Idul Putra

Every human being experiences stress, and the level of stress differs from one individual to another. Stress experienced by students will disturb their studies if it is not handled quickly and appropriately. We have therefore created an expert system using a backpropagation neural network to help counselors predict the stress level of students. The network used in the experiment consists of 26 input nodes, 5 hidden nodes, and 2 output nodes, with a learning rate of 0.1, a momentum of 0.1, and 5000 epochs, achieving a 100% accuracy rate. Index Terms - Stress on study, expert system, neural network, stress prediction
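The described architecture and hyperparameters can be sketched as follows. This is an illustrative reconstruction only: the students' questionnaire data is not available, so the inputs and labels below are random placeholders, and the training accuracy will not reproduce the reported 100%.

```python
import numpy as np

# Backpropagation with momentum for the described 26-5-2 network,
# learning rate 0.1, momentum 0.1, 5000 epochs. Data is synthetic.
rng = np.random.default_rng(42)
X = rng.random((40, 26))                        # 40 placeholder questionnaires
y = np.eye(2)[rng.integers(0, 2, 40)]           # one-hot stress / no-stress labels

W1 = 0.5 * rng.standard_normal((26, 5))
W2 = 0.5 * rng.standard_normal((5, 2))
V1 = np.zeros_like(W1); V2 = np.zeros_like(W2)  # momentum accumulators
lr, mom = 0.1, 0.1
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    H = sigmoid(X @ W1)                         # hidden activations
    O = sigmoid(H @ W2)                         # output activations
    dO = (O - y) * O * (1 - O)                  # output-layer delta
    dH = (dO @ W2.T) * H * (1 - H)              # hidden-layer delta
    V2 = mom * V2 - lr * (H.T @ dO) / len(X)    # momentum update, mean gradient
    V1 = mom * V1 - lr * (X.T @ dH) / len(X)
    W2 += V2; W1 += V1

acc = (O.argmax(1) == y.argmax(1)).mean()
print(f"training accuracy: {acc:.2f}")
```

Averaging the gradient over the batch (the `/ len(X)` terms) is one common convention; the paper does not specify batch handling, so that detail is an assumption.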


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Guanglei Xu ◽  
William S. Oates

Abstract Restricted Boltzmann Machines (RBMs) have been proposed for developing neural networks for a variety of unsupervised machine learning applications such as image recognition, drug discovery, and materials design. The Boltzmann probability distribution is used as a model to identify network parameters by optimizing the likelihood of predicting an output given hidden states trained on available data. Training such networks often requires sampling over a large probability space that must be approximated during gradient-based optimization. Quantum annealing has been proposed as a means to search this space more efficiently and has been experimentally investigated on D-Wave hardware. The D-Wave implementation requires the selection of an effective inverse temperature, or hyperparameter ($\beta$), within the Boltzmann distribution, which can strongly influence optimization. Here, we show how this parameter can be estimated as a hyperparameter applied to D-Wave hardware during neural network training by maximizing the likelihood or minimizing the Shannon entropy. We find that both methods improve the training of RBMs, based upon experimental validation on D-Wave hardware for an image recognition problem. Neural network image reconstruction errors are evaluated using Bayesian uncertainty analysis, which shows a more than order-of-magnitude lower image reconstruction error using maximum likelihood than using manual optimization of the hyperparameter. The maximum likelihood method is also shown to outperform minimizing the Shannon entropy for image reconstruction.
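The core of the maximum-likelihood approach to estimating $\beta$ can be illustrated on a toy problem (this is not the D-Wave pipeline): draw samples from a Boltzmann distribution $p(s) \propto e^{-\beta E(s)}$ at a hidden "hardware" $\beta$, then recover $\beta$ by maximizing the sample log-likelihood, here via a simple 1D grid search over a small, exactly enumerable state space.

```python
import numpy as np
from itertools import product

# Toy energy function: Ising chain over all 2^n spin configurations.
def all_energies(n):
    states = np.array(list(product([-1, 1], repeat=n)))
    return -np.sum(states[:, :-1] * states[:, 1:], axis=1)

# Log-likelihood of observed sample energies under inverse temperature beta.
def log_likelihood(beta, E_samples, E_all):
    logZ = np.log(np.sum(np.exp(-beta * E_all)))   # exact partition function
    return np.sum(-beta * E_samples - logZ)

rng = np.random.default_rng(7)
E_all = all_energies(6)
beta_true = 1.5                                    # hidden "hardware" beta
p = np.exp(-beta_true * E_all); p /= p.sum()
idx = rng.choice(len(E_all), size=2000, p=p)       # samples at the hidden beta
E_samples = E_all[idx]

grid = np.linspace(0.1, 3.0, 300)
beta_hat = grid[np.argmax([log_likelihood(b, E_samples, E_all) for b in grid])]
print(beta_hat)                                    # estimate of the hidden beta
```

On real annealing hardware the partition function cannot be enumerated, so the paper's method works from samples alone; the grid search here simply makes the maximum-likelihood principle concrete.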

