Deep Boltzmann Machine
Recently Published Documents


TOTAL DOCUMENTS: 64 (five years: 26)

H-INDEX: 10 (five years: 3)

2021 ◽  
Vol 17 (10) ◽  
pp. e1009458
Author(s):  
Carolin Scholl ◽  
Michael E. Rule ◽  
Matthias H. Hennig

During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as neural pruning. Why networks should initially be over-populated, and the processes that determine which synapses and neurons are ultimately pruned, remains unclear. We study the mechanisms and significance of neural pruning in model neural networks. In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally-relevant connections, (2) pruning by synaptic weight alone does not optimize network size and (3) pruning based on a locally-available measure of importance based on Fisher information allows the network to identify structurally important vs. unimportant connections and neurons. This locally-available measure of importance has a biological interpretation in terms of the correlations between presynaptic and postsynaptic neurons, and implies an efficient activity-driven pruning rule. Overall, we show how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture. We relate these findings to biology as follows: (I) Synaptic over-production is necessary for activity-dependent connectivity optimization. (II) In networks that have more neurons than needed, cells compete for activity, and only the most important and selective neurons are retained. (III) Cells may also be pruned due to a loss of synapses on their axons. This occurs when the information they convey is not relevant to the target population.
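The locally-available importance measure described above can be sketched in code (a minimal illustration under stated assumptions: binary pre/postsynaptic activity samples, a variance-of-coincidence proxy for per-weight Fisher information, and a simple quantile threshold; the function names and estimator details are placeholders, not the paper's implementation):

```python
import numpy as np

def fisher_importance(pre, post):
    """Per-synapse importance proxy: variance across samples of the
    presynaptic-postsynaptic coincidence, a locally-available quantity
    related to the Fisher information of each weight (assumed estimator)."""
    # pre: (n_samples, n_pre), post: (n_samples, n_post), binary activities
    coact = pre[:, :, None] * post[:, None, :]
    return coact.var(axis=0)  # shape (n_pre, n_post)

def prune_weights(weights, importance, keep_fraction=0.5):
    """Zero out the least-important synapses, keeping roughly the top
    `keep_fraction` of connections by the importance score."""
    threshold = np.quantile(importance, 1.0 - keep_fraction)
    return np.where(importance >= threshold, weights, 0.0)
```

Pruning by such an activity-dependent score, rather than by weight magnitude alone, is what lets the network separate structurally important from unimportant connections.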


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Haik Manukian ◽  
Massimiliano Di Ventra

Abstract: The deep extension of the restricted Boltzmann machine (RBM), known as the deep Boltzmann machine (DBM), is an expressive family of machine learning models which can serve as compact representations of complex probability distributions. However, jointly training DBMs in the unsupervised setting has proven to be a formidable task. A recent technique we have proposed, called mode-assisted training, has shown great success in improving the unsupervised training of RBMs. Here, we show that the performance gains of mode-assisted training are even more dramatic for DBMs. In fact, DBMs jointly trained with the mode-assisted algorithm can represent the same data set with orders of magnitude fewer total parameters than state-of-the-art training procedures, and even fewer than RBMs, provided a fan-in network topology is also introduced. This substantial saving in the number of parameters also makes this training method very appealing for hardware implementations.
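The core idea of mode-assisted training can be sketched for a tiny RBM (an illustrative reconstruction under stated assumptions: the mode is found here by exhaustive search over visible states, whereas the authors use a dedicated optimization solver, and the mixing probability `p_mode` is a placeholder schedule, not the paper's):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def free_energy(v, W, b, c):
    # RBM free energy F(v) = -b.v - sum_j log(1 + exp(c_j + (v W)_j))
    return -(v @ b) - np.log1p(np.exp(c + v @ W)).sum()

def find_mode(W, b, c):
    # Exhaustive mode search over visible states: feasible only for tiny
    # models; stands in for the solver-based mode search of the method.
    n = W.shape[0]
    states = [np.array(s, dtype=float)
              for s in itertools.product((0.0, 1.0), repeat=n)]
    return min(states, key=lambda v: free_energy(v, W, b, c))

def mode_assisted_step(v_data, W, b, c, lr=0.05, p_mode=0.2):
    # Positive phase: hidden probabilities given the data vector.
    h_data = 1.0 / (1.0 + np.exp(-(c + v_data @ W)))
    # Negative phase: with probability p_mode use the model mode,
    # otherwise a one-step Gibbs sample (plain CD-1).
    if rng.random() < p_mode:
        v_neg = find_mode(W, b, c)
    else:
        h_s = (rng.random(c.size) < h_data).astype(float)
        p_v = 1.0 / (1.0 + np.exp(-(b + h_s @ W.T)))
        v_neg = (rng.random(b.size) < p_v).astype(float)
    h_neg = 1.0 / (1.0 + np.exp(-(c + v_neg @ W)))
    W += lr * (np.outer(v_data, h_data) - np.outer(v_neg, h_neg))
    b += lr * (v_data - v_neg)
    c += lr * (h_data - h_neg)
    return W, b, c
```

Substituting the mode for a noisy Gibbs sample in a fraction of updates is what stabilizes the otherwise high-variance negative phase.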


Author(s):  
Diego Alberici ◽  
Francesco Camilli ◽  
Pierluigi Contucci ◽  
Emanuele Mingione

Abstract: The deep Boltzmann machine on the Nishimori line with a finite number of layers is solved exactly by a theorem that expresses its pressure through a finite-dimensional variational problem of min–max type. In the absence of magnetic fields, the order parameter is shown to exhibit a phase transition whose dependence on the geometry of the system is investigated.
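To fix ideas, a generic two-layer instance of this model family can be written down (a sketch with assumed notation and normalizations, not the paper's exact setup): spins interact only across adjacent layers, and the pressure is the disorder-averaged log-partition function per spin.

```latex
% Sketch (assumed notation): two-layer Boltzmann machine with Gaussian
% couplings between adjacent layers only, sigma_i, tau_j in {-1,+1}.
H_N(\sigma,\tau) \;=\; -\frac{1}{\sqrt{N}} \sum_{i=1}^{N_1} \sum_{j=1}^{N_2} J_{ij}\,\sigma_i \tau_j,
\qquad J_{ij} \sim \mathcal{N}(\mu,\,1).
% Pressure (log-partition function per spin, averaged over disorder):
p_N(\beta) \;=\; \frac{1}{N}\,\mathbb{E}\,\log \sum_{\sigma,\tau} e^{-\beta H_N(\sigma,\tau)}.
% Nishimori line (assumed convention for unit-variance Gaussian disorder):
% the inverse temperature is locked to the coupling mean, beta = mu.
```

On this line the model enjoys special gauge symmetries, which is what makes an exact solution of the pressure tractable.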


Author(s):  
Fuyun He ◽  
Xiaoming Huang ◽  
Xun Wang ◽  
Senhui Qiu ◽  
F. Jiang ◽  
...  

Author(s):  
Khalid Raza ◽  
Nripendra Kumar Singh

Background: Interpretation of medical images for the diagnosis and treatment of complex diseases from high-dimensional and heterogeneous data remains a key challenge in transforming healthcare. In the last few years, both supervised and unsupervised deep learning have achieved promising results in the area of medical image analysis. Several reviews of supervised deep learning have been published, but hardly any rigorous review of unsupervised deep learning for medical image analysis is available. Objectives: The objective of this review is to systematically present various unsupervised deep learning models, tools, and benchmark datasets applied to medical image analysis. The discussed models include autoencoders and their variants, restricted Boltzmann machines (RBM), deep belief networks (DBN), deep Boltzmann machines (DBM), and generative adversarial networks (GAN). Further, future research opportunities and challenges of unsupervised deep learning techniques for medical image analysis are also discussed. Conclusion: Currently, interpretation of medical images for diagnostic purposes is usually performed by human experts, who may be replaced by computer-aided diagnosis due to advances in machine learning techniques, including deep learning, and the availability of cheap computing infrastructure through cloud computing. Both supervised and unsupervised machine learning approaches are widely applied in medical image analysis, each with its own pros and cons. Since human supervision is not always available, and may be inadequate or biased, unsupervised learning algorithms hold great promise for biomedical image analysis.
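To give a flavor of the unsupervised models surveyed, a minimal autoencoder can be sketched in plain NumPy (an illustrative toy under stated assumptions: one dense hidden layer, tanh encoder, linear decoder, and gradient descent on reconstruction MSE; it is not a medical-imaging pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyAutoencoder:
    """Minimal dense autoencoder: learns a compressed representation by
    reconstructing its input, with no labels required (unsupervised)."""

    def __init__(self, n_in, n_hidden, lr=0.05):
        self.We = rng.normal(0, 0.1, (n_in, n_hidden))  # encoder weights
        self.Wd = rng.normal(0, 0.1, (n_hidden, n_in))  # decoder weights
        self.lr = lr

    def step(self, X):
        H = np.tanh(X @ self.We)          # encode to hidden code
        R = H @ self.Wd                   # linear reconstruction
        err = R - X                       # reconstruction error
        # Gradient descent on mean squared error (constant factors
        # absorbed into the learning rate).
        grad_Wd = H.T @ err / len(X)
        grad_We = X.T @ ((err @ self.Wd.T) * (1.0 - H**2)) / len(X)
        self.Wd -= self.lr * grad_Wd
        self.We -= self.lr * grad_We
        return float((err**2).mean())
```

The same reconstruct-the-input principle, scaled up with convolutional layers and regularizers, underlies the autoencoder variants discussed in the review.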


Author(s):  
J. Nageswara Rao ◽  
R. Satya Prasad

Heart disease has become an increasingly serious problem for human health. Machine learning and deep learning play a major role in building automated diagnostic systems. Predicting heart disease is a difficult task because many algorithms perform only limited operations. The aim of this paper is to improve prediction accuracy. Various heart disease datasets are available for research, and deep learning (DL) algorithms play a major role in heart disease prediction; predicting in the early stages can reduce the risk of death. In this paper, an Ensemble Deep Dynamic Algorithm (EDDA) is introduced to increase prediction accuracy. The EDDA follows these steps to predict heart disease: linear regression and a deep Boltzmann machine (DBM) are applied to the selected dataset, and performance is calculated in terms of sensitivity, specificity, and accuracy, which are reported alongside comparative results.
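The evaluation step of the pipeline above uses standard confusion-matrix definitions, which can be sketched directly (`binary_metrics` is an illustrative helper, not the paper's code; it assumes a binary label with 1 = disease and that both classes occur in the data):

```python
def binary_metrics(y_true, y_pred):
    """Sensitivity, specificity, and accuracy for binary predictions
    (1 = disease present, 0 = absent)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "sensitivity": tp / (tp + fn),      # true-positive rate
        "specificity": tn / (tn + fp),      # true-negative rate
        "accuracy": (tp + tn) / len(y_true),
    }
```

Sensitivity matters most here: in early-stage disease screening, a missed positive (false negative) is far more costly than a false alarm.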


2020 ◽  
Author(s):  
Carolin Scholl ◽  
Michael E. Rule ◽  
Matthias H. Hennig

Abstract: During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as neural pruning. Why networks should initially be over-populated, and the processes that determine which synapses and neurons are ultimately pruned, remains unclear. We study the mechanisms and significance of neural pruning in model neural networks. In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally-relevant connections, (2) pruning by synaptic weight alone does not optimize network size and (3) pruning based on a locally-available proxy for “sloppiness” based on Fisher information allows the network to identify structurally important vs. unimportant connections and neurons. This locally-available measure of importance has a biological interpretation in terms of the correlations between presynaptic and postsynaptic neurons, and implies an efficient activity-driven pruning rule. Overall, we show how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture. We relate these findings to biology as follows: (I) Synaptic over-production is necessary for activity-dependent connectivity optimization. (II) In networks that have more neurons than needed, cells compete for activity, and only the most important and selective neurons are retained. (III) Cells may also be pruned due to a loss of synapses on their axons. This occurs when the information they convey is not relevant to the target population.

