Inverse problems for structured datasets using parallel TAP equations and restricted Boltzmann machines

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Aurelien Decelle ◽  
Sungmin Hwang ◽  
Jacopo Rocchi ◽  
Daniele Tantari

Abstract We propose an efficient algorithm to solve inverse problems in the presence of binary clustered datasets. We consider the paradigmatic Hopfield model in a teacher-student scenario, where this situation is found in the retrieval phase. This problem has been widely analyzed through various methods, such as mean-field approaches or pseudo-likelihood optimization. Our approach is based on estimating the posterior using the Thouless–Anderson–Palmer (TAP) equations in a parallel updating scheme. Unlike other methods, it allows one to retrieve the original patterns of the teacher dataset, and, thanks to the parallel update, it can be applied to large system sizes. We tackle the same problem using a restricted Boltzmann machine (RBM) and discuss analogies and differences between our algorithm and RBM learning.
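As an illustration of the kind of update the abstract refers to, here is a minimal sketch of a parallel fixed-point iteration of the TAP equations for an Ising system with couplings J and fields h. It is not the authors' full inverse algorithm; the damping, initialisation, and toy Hebbian couplings below are assumptions made for the example.

```python
# Minimal sketch (not the paper's inverse algorithm): parallel TAP iteration.
# All magnetizations are updated at once, keeping each sweep at O(N^2).
import numpy as np

def tap_parallel(J, h, beta, n_iter=200, damping=0.5, tol=1e-8):
    """Iterate m_i = tanh(beta*(h_i + sum_j J_ij m_j
    - beta*m_i*sum_j J_ij^2 (1 - m_j^2))) with damping."""
    N = J.shape[0]
    m = np.zeros(N)  # assumed initialisation; the paper's choice may differ
    for _ in range(n_iter):
        onsager = beta * m * (J**2 @ (1.0 - m**2))   # Onsager reaction term
        m_new = np.tanh(beta * (h + J @ m - onsager))
        m_next = damping * m + (1.0 - damping) * m_new
        if np.max(np.abs(m_next - m)) < tol:
            return m_next
        m = m_next
    return m

# Toy usage: Hopfield couplings built from one random pattern (Hebb rule).
rng = np.random.default_rng(0)
xi = rng.choice([-1.0, 1.0], size=50)
J = np.outer(xi, xi) / 50.0
np.fill_diagonal(J, 0.0)
m = tap_parallel(J, h=0.1 * xi, beta=1.5)
print("overlap with pattern:", float(xi @ m) / 50.0)
```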

Author(s):  
Harald Hruschka

Abstract We introduce the conditional restricted Boltzmann machine as a method to analyze brand-level market basket data of individual households. The conditional restricted Boltzmann machine includes marketing variables and household attributes as independent variables. To our knowledge, this is the first study comparing the conditional restricted Boltzmann machine to homogeneous and heterogeneous multivariate logit models for brand-level market basket data across several product categories. We explain how to estimate the conditional restricted Boltzmann machine starting from a restricted Boltzmann machine without independent variables. The conditional restricted Boltzmann machine turns out to outperform all the other investigated models in terms of log pseudo-likelihood for holdout data. We interpret the selected conditional restricted Boltzmann machine based on coefficients linking purchases to hidden variables, interdependences between brand pairs, as well as own and cross effects of marketing variables. The conditional restricted Boltzmann machine indicates pairwise relationships between brands that are more varied than those of the multivariate logit model. Based on the pairwise interdependences inferred from the restricted Boltzmann machine, we determine the competitive structure of brands by means of cluster analysis. Using counterfactual simulations, we investigate what three different models (independent logit, heterogeneous multivariate logit, conditional restricted Boltzmann machine) imply with respect to the retailer's revenue if each brand is put on display. Finally, we mention possibilities for further research, such as applying the conditional restricted Boltzmann machine to other areas in marketing or retailing.
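For orientation, the sketch below shows a standard way to set up a conditional RBM with a contrastive-divergence (CD-1) update, assuming (as in typical conditional-RBM formulations, not necessarily the exact specification estimated in the paper) that the conditioning inputs x (marketing variables, household attributes) shift the visible and hidden biases linearly.

```python
# Minimal CD-1 sketch of a conditional RBM over binary brand purchases v,
# with conditioning inputs x shifting both bias vectors.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ConditionalRBM:
    def __init__(self, n_vis, n_hid, n_cond, seed=0):
        rng = np.random.default_rng(seed)
        self.rng = rng
        self.W = 0.01 * rng.standard_normal((n_vis, n_hid))
        self.A = 0.01 * rng.standard_normal((n_vis, n_cond))  # x -> visible biases
        self.B = 0.01 * rng.standard_normal((n_hid, n_cond))  # x -> hidden biases
        self.b = np.zeros(n_vis)
        self.c = np.zeros(n_hid)

    def cd1_step(self, v0, x, lr=0.05):
        """One contrastive-divergence (CD-1) update on a batch (v0, x)."""
        bv = self.b + x @ self.A.T                 # conditional visible biases
        bh = self.c + x @ self.B.T                 # conditional hidden biases
        ph0 = sigmoid(v0 @ self.W + bh)
        h0 = (self.rng.random(ph0.shape) < ph0).astype(float)
        pv1 = sigmoid(h0 @ self.W.T + bv)          # one Gibbs step back to visibles
        ph1 = sigmoid(pv1 @ self.W + bh)
        n = v0.shape[0]
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b += lr * (v0 - pv1).mean(axis=0)
        self.c += lr * (ph0 - ph1).mean(axis=0)
        self.A += lr * (v0 - pv1).T @ x / n
        self.B += lr * (ph0 - ph1).T @ x / n
```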


2018 ◽  
Vol 18 (1&2) ◽  
pp. 51-74 ◽  
Author(s):  
Daniel Crawford ◽  
Anna Levit ◽  
Navid Ghadermarzy ◽  
Jaspreet S. Oberoi ◽  
Pooya Ronagh

We investigate whether quantum annealers with select chip layouts can outperform classical computers in reinforcement learning tasks. We associate a transverse-field Ising spin Hamiltonian with a layout of qubits similar to that of a deep Boltzmann machine (DBM) and use simulated quantum annealing (SQA) to numerically simulate quantum sampling from this system. We design a reinforcement learning algorithm in which the sets of visible nodes representing the states and actions of an optimal policy form the first and last layers of the deep network. In the absence of a transverse field, our simulations show that DBMs are trained more effectively than restricted Boltzmann machines (RBMs) with the same number of nodes. We then develop a framework for training the network as a quantum Boltzmann machine (QBM) in the presence of a significant transverse field for reinforcement learning. This method also outperforms the reinforcement learning method that uses RBMs.
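For context, the classical baseline this line of work builds on uses the negative free energy of a clamped RBM over the joint state-action vector as the Q-function, Q(s,a) ~ -F(s,a). The sketch below shows only that closed-form baseline; the DBM/QBM variants in the paper require sampling (e.g. via SQA) instead, and the names below are illustrative.

```python
# Minimal sketch of the free-energy-based value function (classical RBM case).
import numpy as np

def softplus(z):
    return np.logaddexp(0.0, z)

def rbm_free_energy(v, W, b_vis, b_hid):
    """Exact free energy of an RBM with binary hidden units, visibles clamped to v."""
    return -(b_vis @ v) - softplus(b_hid + v @ W).sum()

def q_value(state, action, W, b_vis, b_hid):
    """Q(s, a) taken as minus the clamped free energy; visible layer = state + action bits."""
    v = np.concatenate([state, action])
    return -rbm_free_energy(v, W, b_vis, b_hid)
```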


Entropy ◽  
2020 ◽  
Vol 23 (1) ◽  
pp. 34
Author(s):  
Chiara Marullo ◽  
Elena Agliari

The Hopfield model and the Boltzmann machine are among the most popular examples of neural networks. The latter, widely used for classification and feature detection, is able to efficiently learn a generative model from observed data and constitutes a benchmark for statistical learning. The former, designed to mimic the retrieval phase of an artificial associative memory, lies between two paradigmatic statistical-mechanics models, namely the Curie–Weiss and the Sherrington–Kirkpatrick models, which are recovered as the limiting cases of one and many stored memories, respectively. Interestingly, the Boltzmann machine and the Hopfield network, if considered as two cognitive processes (learning and information retrieval), are nothing more than two sides of the same coin. In fact, it is possible to map one exactly onto the other. We inspect this equivalence, retracing the most representative steps of the research in this field.
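The core of the mapping recalled in the abstract can be checked numerically: integrating out the Gaussian hidden units of an RBM leaves a Hopfield model whose couplings are J = W Wᵀ, i.e. the RBM weight columns play the role of the stored patterns. The snippet below is a sketch under that convention; prefactors vary across references.

```python
# Numeric check of the RBM (Gaussian hiddens) <-> Hopfield correspondence.
import numpy as np

rng = np.random.default_rng(0)
N, P = 8, 3                       # visible units, Gaussian hidden units
W = rng.choice([-1.0, 1.0], size=(N, P)) / np.sqrt(N)

def log_p_rbm(v, W):
    """Log marginal RBM weight for binary v: integrating
    exp(sum_mu h_mu (W^T v)_mu - h_mu^2/2) dh gives
    exp(0.5 * ||W^T v||^2) up to a v-independent constant."""
    return 0.5 * np.sum((W.T @ v) ** 2)

def log_p_hopfield(v, W):
    """Log weight of the Hopfield model with Hebbian couplings J = W W^T."""
    J = W @ W.T
    return 0.5 * v @ J @ v

v = rng.choice([-1.0, 1.0], size=N)
print(np.isclose(log_p_rbm(v, W), log_p_hopfield(v, W)))   # True
```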


2018 ◽  
Vol 9 (1) ◽  
pp. 96 ◽  
Author(s):  
Soojeong Lee ◽  
Joon-Hyuk Chang

We propose a technique using Dempster–Shafer fusion based on a deep Boltzmann machine to classify and estimate systolic and diastolic blood pressure categories using oscillometric blood pressure measurements. The deep Boltzmann machine is a state-of-the-art technology in which multiple restricted Boltzmann machines are stacked. Unlike in deep belief networks, each unit in the middle layers of the deep Boltzmann machine receives information from both the layer above and the layer below, which reduces uncertainty at the inference step. Dempster–Shafer fusion can be incorporated to combine independent estimates of the observations, and a confidence increase for a given deep Boltzmann machine estimate can be clearly observed. Our work provides an accurate blood pressure estimate, a blood pressure category with upper and lower bounds, and a solution that can reduce estimation uncertainty. This study is one of the first to use deep Boltzmann machine-based Dempster–Shafer fusion to classify and estimate blood pressure.
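The fusion step named in the abstract is Dempster's rule of combination. The sketch below shows that rule on a toy frame of blood-pressure categories; the mass functions would in practice come from the deep Boltzmann machine estimators, which are not reproduced here, and the category names are placeholders.

```python
# Minimal sketch of Dempster's rule of combination for two mass functions.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset_of_categories: mass}."""
    combined, conflict = {}, 0.0
    for (A, mA), (B, mB) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mA * mB
        else:
            conflict += mA * mB            # mass assigned to the empty set
    norm = 1.0 - conflict                  # renormalise by the non-conflicting mass
    return {A: m / norm for A, m in combined.items()}

# Toy example with hypothetical categories {normo, pre, hyper}.
m1 = {frozenset({"normo"}): 0.6, frozenset({"normo", "pre"}): 0.4}
m2 = {frozenset({"pre"}): 0.3, frozenset({"normo", "pre"}): 0.7}
print(dempster_combine(m1, m2))
```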


Author(s):  
Mohammadreza Noormandipour ◽  
Youran Sun ◽  
Babak Haghighat

Abstract In this work, the capability of restricted Boltzmann machines (RBMs) to find solutions for the Kitaev honeycomb model with periodic boundary conditions is investigated. The measured ground-state (GS) energy of the system is compared and, for small lattice sizes (e.g. 3×3 with 18 spinors), shown to agree with the analytically derived value of the energy up to a deviation of 0.09%. Moreover, the wave-functions we find have 99.89% overlap with the exact ground-state wave-functions. Furthermore, the possibility of realizing anyons in the RBM is discussed, and an algorithm is given to build these anyonic excitations and braid them for possible future applications in quantum computation. Using the correspondence between topological field theories in (2+1)d and 2d CFTs, we propose an identification between our RBM states with the Moore–Read state and conformal blocks of the 2d Ising model.
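For readers unfamiliar with RBM wave-functions, the sketch below evaluates the standard Carleo-Troyer-style ansatz that underlies this kind of study: the amplitude of a spin configuration s in {-1,+1}^N is psi(s) = exp(sum_i a_i s_i) * prod_j 2 cosh(b_j + sum_i W_ij s_i). Real-valued parameters are used for brevity (generic wave-functions need complex ones), and the variational Monte Carlo optimisation on the Kitaev Hamiltonian is not reproduced.

```python
# Minimal sketch of an RBM wave-function amplitude (neural-network quantum state).
import numpy as np

def rbm_amplitude(s, a, b, W):
    """Unnormalised wave-function amplitude for spin configuration s in {-1,+1}^N."""
    theta = b + W.T @ s
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(theta))

rng = np.random.default_rng(0)
N, M = 18, 36                      # e.g. 18 spinors, hidden-unit density 2 (assumed)
a = 0.01 * rng.standard_normal(N)
b = 0.01 * rng.standard_normal(M)
W = 0.01 * rng.standard_normal((N, M))
s = rng.choice([-1.0, 1.0], size=N)
print(rbm_amplitude(s, a, b, W))
```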


Author(s):  
Da Teng ◽  
Zhang Li ◽  
Guanghong Gong ◽  
Liang Han

The original restricted Boltzmann machines (RBMs) are extended by replacing the binary visible and hidden variables with clusters of binary units, and a new learning algorithm for training deep Boltzmann machines of this new variant is proposed. The sum of binary units in each cluster is approximated by a Gaussian distribution. Experiments demonstrate that the proposed Boltzmann machines can achieve good performance on the MNIST handwritten digit recognition task.
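The approximation named in the abstract can be verified with a quick numeric check: the sum of K binary (Bernoulli) units with a common activation probability p is close to a Gaussian with mean K·p and variance K·p·(1-p). The paper's clusters may use unit-specific probabilities; the common p below is an assumption of the sketch.

```python
# Quick check: sum of K Bernoulli(p) units vs. the matching Gaussian moments.
import numpy as np

rng = np.random.default_rng(0)
K, p, n_samples = 20, 0.3, 100_000
sums = rng.binomial(1, p, size=(n_samples, K)).sum(axis=1)
print("empirical mean/var :", sums.mean(), sums.var())
print("Gaussian  mean/var :", K * p, K * p * (1 - p))
```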


2021 ◽  
Vol 2122 (1) ◽  
pp. 012007
Author(s):  
Vivek Dixit ◽  
Yaroslav Koshka ◽  
Tamer Aldwairi ◽  
M.A. Novotny

Abstract Classification and data reconstruction using a restricted Boltzmann machine (RBM) is presented. The RBM is an energy-based model which assigns low energy values to the configurations of interest. It is a generative model; once trained, it can be used to produce samples from the target distribution. The D-Wave 2000Q is a quantum annealer whose quantum effects have been exploited for machine learning. Bars-and-stripes (BAS) and cybersecurity (ISCX) datasets were used to train RBMs. The weights and biases of the trained RBMs were then mapped onto the D-Wave. Classification as well as image reconstruction were performed. Classification accuracy on both datasets indicates comparable performance between D-Wave's adiabatic annealing and classical Gibbs sampling.
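Mapping a trained RBM onto an annealer requires changing variables from 0/1 units to ±1 spins. The sketch below shows that standard change of variables (constant energy offsets dropped); the paper's actual embedding of the resulting fields and couplings onto the D-Wave hardware graph is not reproduced here.

```python
# Minimal sketch: RBM with 0/1 units (biases a, c, weights W) rewritten in
# Ising form with +/-1 spins via v_i = (s_i + 1)/2, h_j = (t_j + 1)/2.
import numpy as np

def rbm_to_ising(a, c, W):
    """Return (fields on visible spins, fields on hidden spins, couplings)."""
    J = W / 4.0                               # spin-spin couplings
    h_vis = a / 2.0 + W.sum(axis=1) / 4.0     # local fields on visible spins
    h_hid = c / 2.0 + W.sum(axis=0) / 4.0     # local fields on hidden spins
    return h_vis, h_hid, J
```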

