Boltzmann Machines
Recently Published Documents

TOTAL DOCUMENTS: 608 (FIVE YEARS: 158)
H-INDEX: 39 (FIVE YEARS: 8)

Author(s):  
Eric Zou ◽  
Erik Long ◽  
Erhai Zhao

Abstract Neural network quantum states provide a novel representation of the many-body states of interacting quantum systems and open up a promising route to solve frustrated quantum spin models that evade other numerical approaches. Yet their capacity to describe complex magnetic orders with large unit cells has not been demonstrated, and their performance in a rugged energy landscape has been questioned. Here we apply restricted Boltzmann machines and stochastic gradient descent to seek the ground states of a compass spin model on the honeycomb lattice, which unifies the Kitaev model, the Ising model and the quantum 120-degree model with a single tuning parameter. We report calculation results on the variational energy, order parameters and correlation functions. The phase diagram obtained is in good agreement with the predictions of the tensor network ansatz, demonstrating the capacity of restricted Boltzmann machines in learning the ground states of frustrated quantum spin Hamiltonians. The limitations of the calculation are discussed. A few strategies are outlined to address some of the challenges in machine learning frustrated quantum magnets.
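For readers unfamiliar with the ansatz used in studies like this one, the sketch below shows the standard restricted Boltzmann machine log-amplitude employed in variational Monte Carlo; the layer sizes, parameter names and initialization are illustrative assumptions, not the authors' actual setup, and the Metropolis sampling and stochastic-gradient loop are omitted.

```python
# Minimal sketch (not the authors' code): RBM wavefunction amplitude
# psi(s) = exp(sum_i a_i s_i) * prod_j 2*cosh(b_j + sum_i W_ij s_i)
# for spins s_i = +/-1. Parameter names (a, b, W) and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 18, 36                            # e.g. a small honeycomb cluster
a = 0.01 * rng.standard_normal(n_visible)               # visible biases
b = 0.01 * rng.standard_normal(n_hidden)                # hidden biases
W = 0.01 * rng.standard_normal((n_visible, n_hidden))   # visible-hidden couplings

def log_psi(s):
    """Log-amplitude of the RBM ansatz for a spin configuration s in {-1,+1}^n."""
    theta = b + s @ W
    return a @ s + np.sum(np.log(2.0 * np.cosh(theta)))

# The variational energy and its gradient would be estimated by Metropolis
# sampling of |psi(s)|^2; that loop is not shown here.
s = rng.choice([-1.0, 1.0], size=n_visible)
print(log_psi(s))
```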


Author(s):  
Vivek Saraswat ◽  
Udayan Ganguly

Abstract Emerging non-volatile memories have been proposed for a wide range of applications, from easing the von Neumann bottleneck to neuromorphic computing. Specifically, scalable RRAMs based on Pr1-xCaxMnO3 (PCMO), which exhibit analog switching, have been demonstrated as an integrating neuron, an analog synapse, and a voltage-controlled oscillator. More recently, the inherent stochasticity of memristors has been proposed for efficient hardware implementations of Boltzmann machines. However, as the problem size scales, the number of neurons increases, and tight control of the stochastic distribution over many iterations becomes necessary. This requires parametric control over the stochasticity. Here, we characterize the stochastic Set in PCMO RRAMs. We identify that the Set time distribution depends on the internal state of the device (i.e., its resistance) in addition to the external input (i.e., the voltage pulse). This requires the confluence of seemingly contradictory properties, stochastic switching and deterministic state control, in the same device. Unlike ‘stochastic-everywhere’ filamentary memristors, in PCMO RRAMs we leverage (i) the stochastic Set in negative polarity and (ii) the deterministic analog Reset in positive polarity to demonstrate a 100× reduction in Set time distribution drift. The impact on Boltzmann machine performance is analyzed: compared with “fixed external input stochasticity”, “state-monitored stochasticity” can solve problems 20× larger in size. State monitoring also tunes out the effect of device-to-device variability on the distributions, providing 10× better performance. In addition to the physical insights, this study establishes the reliable use of experimental stochasticity in PCMO RRAMs in stochastic recurrent neural networks over many iterations.
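The contrast between "fixed external input" and "state-monitored" stochasticity can be illustrated with a toy model in which the Set probability depends on both the applied voltage pulse and the device's resistance state; the functional form, constants and helper names below are assumptions made for illustration and do not represent the measured PCMO characteristics.

```python
# Toy illustration (not the device model from the paper): a stochastic Set whose
# probability depends on both the pulse amplitude and the current resistance.
import math
import random

def set_probability(voltage, resistance, v0=1.0, r_ref=1e5, k=2.0, theta=1.0):
    # Assumed logistic dependence: larger pulses and a more strongly Reset
    # (higher-resistance) state make a stochastic Set within one pulse more likely.
    x = k * ((voltage / v0) * math.log10(resistance / r_ref + 1.0) - theta)
    return 1.0 / (1.0 + math.exp(-x))

def state_monitored_pulse(resistance, target_p=0.5, v0=1.0, r_ref=1e5, k=2.0, theta=1.0):
    # "State-monitored" operation: read the resistance first, then choose the
    # pulse amplitude that keeps the Set probability at the target value.
    logit = math.log(target_p / (1.0 - target_p))
    return v0 * (logit / k + theta) / math.log10(resistance / r_ref + 1.0)

def stochastic_neuron(resistance, target_p=0.5):
    """Fire (Set) with probability target_p regardless of the device state."""
    v = state_monitored_pulse(resistance, target_p)
    return 1 if random.random() < set_probability(v, resistance) else 0

print(stochastic_neuron(2e5))   # one state-compensated stochastic trial
```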


Author(s):  
Mohammadreza Noormandipour ◽  
Youran Sun ◽  
Babak Haghighat

Abstract In this work, the capability of restricted Boltzmann machines (RBMs) to find solutions for the Kitaev honeycomb model with periodic boundary conditions is investigated. The measured ground-state (GS) energy of the system is compared with the analytically derived value and, for small lattice sizes (e.g. 3×3 with 18 spinors), shown to agree up to a deviation of 0.09%. Moreover, the wave-functions we find have a 99.89% overlap with the exact ground-state wave-functions. Furthermore, the possibility of realizing anyons in the RBM is discussed, and an algorithm is given to build these anyonic excitations and braid them for possible future applications in quantum computation. Using the correspondence between topological field theories in (2+1)d and 2d CFTs, we propose an identification of our RBM states with the Moore-Read state and conformal blocks of the 2d Ising model.
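A brief note on how such an overlap can be evaluated on small clusters: with 18 spinors the full Hilbert space (2^18 amplitudes) can be enumerated, so the fidelity reduces to a normalized inner product of state vectors. The snippet below is a generic sketch under that assumption, not the authors' implementation.

```python
# Generic sketch: fidelity |<psi_var|psi_exact>| between a variational state and
# an exactly known ground state on a cluster small enough for full enumeration.
import numpy as np

def fidelity(psi_var, psi_exact):
    """Overlap magnitude of two (possibly unnormalized) state vectors."""
    psi_var = psi_var / np.linalg.norm(psi_var)
    psi_exact = psi_exact / np.linalg.norm(psi_exact)
    return np.abs(np.vdot(psi_var, psi_exact))

# Toy demo with random vectors; in practice psi_var would hold the RBM amplitude
# evaluated on every basis state and psi_exact the exact-diagonalization result.
rng = np.random.default_rng(4)
d = 2 ** 10
print(fidelity(rng.standard_normal(d), rng.standard_normal(d)))
```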


2021 ◽  
Vol 104 (20) ◽  
Author(s):  
Douglas Hendry ◽  
Hongwei Chen ◽  
Phillip Weinberg ◽  
Adrian E. Feiguin

2021 ◽  
Vol 2021 ◽  
pp. 1-7
Author(s):  
Rong Dai

Special-domain text has many distinctive features, such as technical vocabulary, abbreviations, large datasets, diverse themes, and an uneven distribution of labels. Existing text data mining classification methods rely on simple machine learning models and therefore perform poorly on such text. To address this drawback, a text data mining algorithm based on a convolutional neural network (CNN) model and a deep Boltzmann machine (DBM) model is proposed in this paper. The method combines the strong feature extraction of the CNN and DBM models to realize double feature extraction. It constructs a tag tree and designs an effective hierarchical network to perform classification, while also suppressing the effect of input noise on classification. Experimental results show that the improved algorithm achieves good classification results in special text data mining.
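Since the abstract does not give architectural details, the following is only an illustrative sketch of "double feature extraction": CNN-style and DBM-style feature vectors are computed separately and concatenated before classification. The extractor stubs, shapes and weights are hypothetical placeholders, not the paper's model.

```python
# Illustrative sketch only: fuse CNN-derived and DBM-derived features
# by concatenation before a downstream classifier.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cnn_features(token_matrix):
    # Stand-in for convolutional features, e.g. max-pooled n-gram filter responses.
    return token_matrix.max(axis=0)

def dbm_features(doc_vector, W1, W2):
    # Stand-in for a two-hidden-layer Boltzmann feature map (mean-field activations).
    h1 = sigmoid(doc_vector @ W1)
    return sigmoid(h1 @ W2)

rng = np.random.default_rng(1)
X = rng.random((20, 64))                       # toy embedded document: 20 tokens x 64 dims
W1 = 0.1 * rng.standard_normal((64, 32))
W2 = 0.1 * rng.standard_normal((32, 16))
fused = np.concatenate([cnn_features(X), dbm_features(X.mean(axis=0), W1, W2)])
print(fused.shape)                             # (80,) fused "double" feature vector
```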


Author(s):  
Elena Agliari ◽  
Linda Albanese ◽  
Francesco Alemanno ◽  
Alberto Fachechi

Abstract We consider a multi-layer Sherrington-Kirkpatrick spin-glass as a model for deep restricted Boltzmann machines with quenched random weights and solve for its free energy in the thermodynamic limit by means of Guerra's interpolating techniques under the RS and 1RSB ansätze. In particular, we recover the expression already known for the replica-symmetric case. Further, we drop the restriction constraint by introducing intra-layer connections among spins, and we show that the resulting system can be mapped into a modular Hopfield network, which is also addressed via the same techniques up to the first step of replica symmetry breaking.
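For concreteness, a standard way to write the Hamiltonian of such a multi-layer spin glass, with quenched Gaussian couplings only between adjacent layers (the "restricted" structure), is sketched below; the notation and normalization are assumptions and may differ from the paper's conventions.

```latex
% Assumed notation: L layers of Ising spins \sigma^{(\ell)}, i.i.d. Gaussian
% couplings J^{(\ell)} between adjacent layers only, N the total number of spins.
H_N(\sigma) \;=\; -\frac{1}{\sqrt{N}}
  \sum_{\ell=1}^{L-1} \sum_{i=1}^{N_\ell} \sum_{j=1}^{N_{\ell+1}}
  J^{(\ell)}_{ij}\,\sigma^{(\ell)}_i \sigma^{(\ell+1)}_j,
\qquad J^{(\ell)}_{ij} \sim \mathcal{N}(0,1).
```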


2021 ◽  
Vol 2122 (1) ◽  
pp. 012005
Author(s):  
M.A. Novotný ◽  
Yaroslav Koshka ◽  
G. Inkoonv ◽  
Vivek Dixit

Abstract The design and examples of a sixty-four-bit quantum dragon data-set are presented. A quantum dragon is a tight-binding model for a strongly disordered nanodevice that, when connected to appropriate semi-infinite leads, has complete electron transmission for a finite interval of energies. The labeled data-set contains records that are quantum dragons, records that are not, and records that are indeterminate. The data-set is designed to make it difficult for trained humans and for machines to label a nanodevice with regard to its quantum dragon property. The 64-bit record length allows the data-set to be used with restricted Boltzmann machines that fit well onto the D-Wave 2000Q quantum annealer architecture.
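As a small illustration of why the 64-bit record length is convenient, each record can be unpacked bit by bit into binary visible units of an RBM; the record value and the bit ordering below are hypothetical, since the abstract does not specify the encoding.

```python
# Assumed encoding for illustration: one visible RBM unit per bit of a record.
import numpy as np

record = 0x0123456789ABCDEF                    # hypothetical 64-bit record value
visible = np.array([(record >> k) & 1 for k in range(64)], dtype=np.int8)
assert visible.size == 64                      # one binary visible unit per bit
```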


2021 ◽  
Vol 2122 (1) ◽  
pp. 012007
Author(s):  
Vivek Dixit ◽  
Yaroslav Koshka ◽  
Tamer Aldwairi ◽  
M.A. Novotny

Abstract Classification and data reconstruction using a restricted Boltzmann machine (RBM) are presented. An RBM is an energy-based model that assigns low energy values to the configurations of interest. It is a generative model: once trained, it can be used to produce samples from the target distribution. The D-Wave 2000Q is a quantum computer whose quantum effects have been exploited for machine learning. Bars-and-stripes (BAS) and cybersecurity (ISCX) datasets were used to train RBMs, and the weights and biases of the trained RBMs were mapped onto the D-Wave. Both classification and image reconstruction were performed. The classification accuracy on both datasets indicates comparable performance between D-Wave's adiabatic annealing and classical Gibbs sampling.
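As a point of reference for the classical baseline mentioned above, one block-Gibbs reconstruction step in a binary RBM looks roughly as follows; the sizes, weights and biases are placeholders, not the trained BAS or ISCX models.

```python
# Minimal sketch (not the paper's code): one classical block-Gibbs step in a
# binary RBM, of the kind used for sampling-based reconstruction.
import numpy as np

rng = np.random.default_rng(2)
n_v, n_h = 16, 8                               # toy sizes, not the trained models
W = 0.1 * rng.standard_normal((n_v, n_h))      # placeholder weights
b_v, b_h = np.zeros(n_v), np.zeros(n_h)        # placeholder biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """Sample hidden units given the visibles, then reconstruct the visibles."""
    p_h = sigmoid(v @ W + b_h)
    h = (rng.random(n_h) < p_h).astype(float)
    p_v = sigmoid(h @ W.T + b_v)
    v_new = (rng.random(n_v) < p_v).astype(float)
    return v_new, p_v

v0 = rng.integers(0, 2, n_v).astype(float)     # e.g. one flattened 4x4 BAS pattern
v1, p_v = gibbs_step(v0)                       # one reconstruction step
```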


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Anna Paola Muntoni ◽  
Andrea Pagnani ◽  
Martin Weigt ◽  
Francesco Zamponi

Abstract Background: Boltzmann machines are energy-based models that have been shown to provide an accurate statistical description of domains of evolutionarily related protein and RNA families. They are parametrized in terms of local biases accounting for residue conservation and pairwise terms modelling epistatic coevolution between residues. From the model parameters it is possible to extract an accurate prediction of the three-dimensional contact map of the target domain. More recently, the accuracy of these models has also been assessed in terms of their ability to predict mutational effects and to generate in silico functional sequences.
Results: Our adaptive implementation of Boltzmann machine learning, adabmDCA, can be applied to both protein and RNA families and supports several learning set-ups, depending on the complexity of the input data and on the user requirements. The code is fully available at https://github.com/anna-pa-m/adabmDCA. As an example, we have trained three Boltzmann machines modelling the Kunitz and Beta-lactamase2 protein domains and the TPP-riboswitch RNA domain.
Conclusions: The models learned by adabmDCA are comparable to those obtained by state-of-the-art techniques for this task, in terms of the quality of the inferred contact map as well as of the synthetically generated sequences. In addition, the code implements both equilibrium and out-of-equilibrium learning, which allows for accurate and lossless training when equilibrium learning is computationally prohibitive, and it allows irrelevant parameters to be pruned using an information-based criterion.
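A common way to turn learned pairwise couplings into a contact-map prediction in the DCA literature is to score each residue pair by the Frobenius norm of its coupling block and apply the average-product correction; the sketch below follows that standard recipe and is not necessarily the exact procedure implemented in adabmDCA.

```python
# Sketch of standard DCA contact scoring: Frobenius norm of the coupling block
# for each residue pair, followed by the average-product correction (APC).
import numpy as np

def contact_scores(J):
    """J has shape (L, L, q, q): couplings between L positions over q states."""
    F = np.sqrt(np.sum(J ** 2, axis=(2, 3)))          # Frobenius norm per pair
    np.fill_diagonal(F, 0.0)
    row_mean = F.mean(axis=1, keepdims=True)
    apc = row_mean @ row_mean.T / F.mean()            # average-product correction
    return F - apc                                    # high score = predicted contact

L, q = 50, 21                                         # toy sizes: 50 residues, 21 states
J = 0.01 * np.random.default_rng(3).standard_normal((L, L, q, q))
scores = contact_scores(J)
```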

