A New Learning Algorithm for Mean Field Boltzmann Machines

Author(s):  
Max Welling ◽  
Geoffrey E. Hinton

1994 ◽  
Vol 6 (6) ◽  
pp. 1174-1184 ◽  
Author(s):  
Lawrence Saul ◽  
Michael I. Jordan

We introduce a large family of Boltzmann machines that can be trained by standard gradient descent. The networks can have one or more layers of hidden units, with tree-like connectivity. We show how to implement the supervised learning algorithm for these Boltzmann machines exactly, without resort to simulated or mean-field annealing. The stochastic averages that yield the gradients in weight space are computed by the technique of decimation. We present results on the problems of N-bit parity and the detection of hidden symmetries.
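The decimation step the abstract refers to can be illustrated on a tiny chain-structured Boltzmann machine: summing out an end spin shifts its neighbour's field and contributes a multiplicative constant to the partition function. A minimal sketch with made-up couplings (not the paper's networks), checked against brute-force enumeration:

```python
import numpy as np
from itertools import product

# Chain-structured Boltzmann machine (Ising chain) with made-up parameters:
# p(s) ∝ exp(sum_i J_i s_i s_{i+1} + sum_i h_i s_i), s_i in {-1, +1}.
J = [0.7, -0.4, 0.9]           # couplings on a 4-spin chain
h = [0.1, 0.0, -0.2, 0.3]      # local fields

# Decimation: summing out the end spin (coupling J_end, field h_end)
# shifts the neighbour's field by atanh(tanh(J_end) * tanh(h_end)) and
# multiplies Z by 2 * sqrt(cosh(J_end + h_end) * cosh(J_end - h_end)).
logZ, Jc, hc = 0.0, list(J), list(h)
while Jc:
    J_end, h_end = Jc.pop(), hc.pop()
    hc[-1] += np.arctanh(np.tanh(J_end) * np.tanh(h_end))
    logZ += np.log(2.0) + 0.5 * (np.log(np.cosh(J_end + h_end))
                                 + np.log(np.cosh(J_end - h_end)))
logZ += np.log(2.0 * np.cosh(hc[0]))   # the last surviving spin

# Check against brute-force enumeration of all 2^4 states.
Z = sum(np.exp(sum(Jk * s[k] * s[k + 1] for k, Jk in enumerate(J))
               + sum(hk * s[k] for k, hk in enumerate(h)))
        for s in product([-1, 1], repeat=4))
```

On a chain (or any tree), repeating this elimination gives the exact partition function, and hence exact gradients, with cost linear in the number of spins.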


1998 ◽  
Vol 10 (5) ◽  
pp. 1137-1156 ◽  
Author(s):  
H. J. Kappen ◽  
F. B. Rodríguez

The learning process in Boltzmann machines is computationally very expensive. The computational complexity of the exact algorithm is exponential in the number of neurons. We present a new approximate learning algorithm for Boltzmann machines, based on mean-field theory and the linear response theorem. The computational complexity of the algorithm is cubic in the number of neurons. In the absence of hidden units, we show how the weights can be computed directly from the fixed-point equation of the learning rules. Thus, in this case no gradient descent procedure is needed for learning. We show that the solutions of this method are close to the optimal solutions and yield a significant improvement when correlations play an important role. Finally, we apply the method to a pattern completion task and show good performance for networks of up to 100 neurons.
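The direct weight computation in the hidden-unit-free case can be sketched as follows: the naive mean-field linear-response relation (C⁻¹)ᵢⱼ = δᵢⱼ/(1 − mᵢ²) − wᵢⱼ lets the couplings be read off from the data statistics with no gradient descent. A minimal sketch on hypothetical data (variable names are illustrative, not the paper's notation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 5000 samples of four +/-1 spins (illustrative only).
S = rng.choice([-1.0, 1.0], size=(5000, 4))

# Empirical statistics of the data distribution.
m = S.mean(axis=0)                       # magnetizations <s_i>
C = S.T @ S / len(S) - np.outer(m, m)    # connected correlations <s_i s_j> - <s_i><s_j>

# Naive mean-field linear response: (C^-1)_ij = delta_ij / (1 - m_i^2) - w_ij,
# so the couplings follow directly from the data, with no gradient descent.
w = np.diag(1.0 / (1.0 - m**2)) - np.linalg.inv(C)
np.fill_diagonal(w, 0.0)                 # no self-couplings

# Thresholds from the mean-field fixed point m_i = tanh(theta_i + sum_j w_ij m_j).
theta = np.arctanh(m) - w @ m
```

The only costly operation is the matrix inversion, which accounts for the cubic complexity quoted in the abstract.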


2009 ◽  
Vol 21 (11) ◽  
pp. 3130-3178 ◽  
Author(s):  
Muneki Yasuda ◽  
Kazuyuki Tanaka

Boltzmann machines can be regarded as Markov random fields. In the binary case, they are equivalent to the Ising spin model of statistical mechanics. Exact learning in Boltzmann machines is an NP-hard problem, so in general we have to use approximate methods to construct practical learning algorithms in this context. In this letter, we propose new, practical learning algorithms for Boltzmann machines that use the belief propagation algorithm and the linear response approximation, which are often referred to as advanced mean-field methods. Finally, we demonstrate the validity of our algorithm in numerical experiments.
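Belief propagation, the first ingredient of the proposed algorithm, is exact on tree-structured models, which makes it easy to sanity-check. A minimal sum-product sketch for a small pairwise binary MRF with made-up parameters, compared against exact enumeration:

```python
import numpy as np
from itertools import product

# Small tree-structured binary MRF (Ising star) with made-up parameters:
# p(s) ∝ exp(sum_{(i,j)} J_ij s_i s_j + sum_i h_i s_i), s_i in {-1, +1}.
edges = [(0, 1), (0, 2), (0, 3)]
J = {e: 0.5 for e in edges}
h = np.array([0.2, -0.1, 0.3, 0.0])

nbrs = {i: [] for i in range(4)}
for i, j in edges:
    nbrs[i].append(j)
    nbrs[j].append(i)

# Sum-product messages in cavity-field form:
# u_{i->j} = atanh( tanh(J_ij) * tanh(h_i + sum_{k in N(i)\{j}} u_{k->i}) ).
msgs = {(i, j): 0.0 for e in edges for (i, j) in (e, e[::-1])}
for _ in range(20):  # iterate to convergence (exact on a tree)
    for (i, j) in list(msgs):
        cavity = h[i] + sum(msgs[(k, i)] for k in nbrs[i] if k != j)
        Jij = J[(i, j)] if (i, j) in J else J[(j, i)]
        msgs[(i, j)] = np.arctanh(np.tanh(Jij) * np.tanh(cavity))

# Node beliefs (magnetizations) from the incoming messages.
m_bp = np.tanh(h + np.array([sum(msgs[(k, i)] for k in nbrs[i])
                             for i in range(4)]))

# Exact magnetizations by brute-force enumeration for comparison.
Z, m_exact = 0.0, np.zeros(4)
for s in product([-1.0, 1.0], repeat=4):
    s = np.array(s)
    weight = np.exp(sum(J[e] * s[e[0]] * s[e[1]] for e in edges) + h @ s)
    Z += weight
    m_exact += weight * s
m_exact /= Z
```

On loopy graphs the same message updates give only approximate marginals, which is where the linear response correction described in the abstract comes in.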


Author(s):  
Da Teng ◽  
Zhang Li ◽  
Guanghong Gong ◽  
Liang Han

The original restricted Boltzmann machines (RBMs) are extended by replacing the binary visible and hidden variables with clusters of binary units, and a new learning algorithm for training deep Boltzmann machines built from this variant is proposed. The sum of the binary units in each cluster is approximated by a Gaussian distribution. Experiments demonstrate that the proposed Boltzmann machines achieve good performance on the MNIST handwritten digit recognition task.
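The Gaussian approximation used for the cluster sums is the central-limit one: a sum of K independent binary units with activation probability p is approximated by N(Kp, Kp(1 − p)). A quick numerical sketch with illustrative values of K and p:

```python
import numpy as np

rng = np.random.default_rng(1)

# A "cluster" of K binary units, each active with probability p (illustrative values).
K, p = 50, 0.3
cluster_sums = rng.binomial(1, p, size=(100_000, K)).sum(axis=1)

# Central-limit approximation that makes the cluster sums tractable:
# sum of K independent Bernoulli(p) units ~= N(K*p, K*p*(1-p)).
mu = K * p             # 15.0
var = K * p * (1 - p)  # 10.5
```

For moderately large clusters the empirical mean and variance of the sums match the Gaussian parameters closely, which is what makes the replacement tractable.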


2008 ◽  
Vol 19 (1) ◽  
pp. 91-113 ◽  
Author(s):  
Vivek S. Borkar ◽  
Jervis Pinto ◽  
Tarun Prabhu

2009 ◽  
Vol 72 (16-18) ◽  
pp. 3771-3781 ◽  
Author(s):  
R. Savitha ◽  
S. Suresh ◽  
N. Sundararajan ◽  
P. Saratchandran

2002 ◽  
Vol 14 (7) ◽  
pp. 1561-1573 ◽  
Author(s):  
Marc M. Van Hulle

We introduce a new learning algorithm for kernel-based topographic map formation. The algorithm generates a Gaussian mixture density model by individually adapting the Gaussian kernels' centers and radii to the assumed Gaussian local input densities.
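The idea of individually adapting each kernel's center and radius can be sketched with a simplified competitive-learning update on made-up 2D inputs (this is an illustrative stand-in, not the paper's exact learning rule):

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up 2D inputs drawn from two local Gaussian blobs.
X = np.vstack([rng.normal(-2.0, 0.5, size=(500, 2)),
               rng.normal(+2.0, 0.5, size=(500, 2))])
rng.shuffle(X)

# Two Gaussian kernels whose centers and radii are adapted individually.
centers = np.array([[-1.0, -1.0], [1.0, 1.0]])
radii = np.ones(2)
eta = 0.05  # learning rate (illustrative)

for x in X:
    k = np.argmin(np.linalg.norm(centers - x, axis=1))   # winning kernel
    centers[k] += eta * (x - centers[k])                 # center tracks local mean
    spread = np.sqrt(np.sum((x - centers[k]) ** 2) / 2)  # per-dimension distance
    radii[k] += eta * (spread - radii[k])                # radius tracks local std
```

Each kernel's center drifts toward the mean of the inputs it wins, and its radius settles near the local spread, yielding a mixture whose components match the assumed local Gaussian input densities.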


2003 ◽  
Vol 10 (4) ◽  
pp. 399-409 ◽  
Author(s):  
Peter Otto ◽  
Yevgeniy Bodyanskiy ◽  
Vitaliy Kolodyazhniy
