The nature of unsupervised learning in deep neural networks: A new understanding and novel approach

2016 · Vol 25 (3) · pp. 127-141
Author(s): V. Golovko, A. Kroshchanka, D. Treadwell


2019
Author(s): David Beniaguev, Idan Segev, Michael London

Abstract: We introduce a novel approach to study neurons as sophisticated I/O information processing units by utilizing recent advances in the field of machine learning. We trained deep neural networks (DNNs) to mimic the I/O behavior of a detailed nonlinear model of a layer 5 cortical pyramidal cell, receiving rich spatio-temporal patterns of input synapse activations. A temporally convolutional DNN (TCN) with seven layers was required to accurately, and very efficiently, capture the I/O of this neuron at the millisecond resolution. This complexity primarily arises from local NMDA-based nonlinear dendritic conductances. The weight matrices of the DNN provide new insights into the I/O function of cortical pyramidal neurons, and the approach presented can provide a systematic characterization of the functional complexity of different neuron types. Our results demonstrate that cortical neurons can be conceptualized as multi-layered “deep” processing units, implying that the cortical networks they form have a non-classical architecture and are potentially more computationally powerful than previously assumed.
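The abstract gives only the high-level architecture: a seven-layer causal temporal convolutional network mapping spatio-temporal synaptic activation patterns to the neuron's output. Below is a minimal illustrative sketch of such a model, assuming PyTorch; the synapse count, channel width, and kernel size are hypothetical placeholders not taken from the paper, and the readout regresses a somatic voltage trace rather than reproducing the authors' exact training setup.

```python
# Minimal sketch (not the authors' code): a causal temporal convolutional
# network (TCN) that maps binary synaptic activation patterns to a predicted
# somatic voltage trace. All sizes below are hypothetical placeholders.
import torch
import torch.nn as nn

N_SYNAPSES = 1000   # hypothetical number of input synapses
N_LAYERS = 7        # depth reported in the abstract
CHANNELS = 128      # hypothetical channels per hidden layer
KERNEL = 35         # hypothetical temporal kernel size (ms)

class CausalTCN(nn.Module):
    def __init__(self):
        super().__init__()
        layers = []
        in_ch = N_SYNAPSES
        for _ in range(N_LAYERS):
            layers += [
                # left-pad so each output bin only sees past and present inputs
                nn.ConstantPad1d((KERNEL - 1, 0), 0.0),
                nn.Conv1d(in_ch, CHANNELS, kernel_size=KERNEL),
                nn.ReLU(),
            ]
            in_ch = CHANNELS
        self.backbone = nn.Sequential(*layers)
        self.readout = nn.Conv1d(CHANNELS, 1, kernel_size=1)  # one value per ms bin

    def forward(self, x):                       # x: (batch, N_SYNAPSES, time_ms)
        return self.readout(self.backbone(x))   # (batch, 1, time_ms)

# Fit against the detailed biophysical model's output, e.g.:
# model = CausalTCN()
# loss = nn.MSELoss()(model(synaptic_inputs), somatic_voltage)
```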


2020 · Vol 34 (04) · pp. 4272-4279
Author(s): Ayush Jaiswal, Daniel Moyer, Greg Ver Steeg, Wael AbdAlmageed, Premkumar Natarajan

We propose a novel approach to achieving invariance in deep neural networks by inducing amnesia to unwanted factors of the data through a new adversarial forgetting mechanism. We show that the forgetting mechanism serves as an information bottleneck, which adversarial training manipulates to learn invariance to unwanted factors. Empirical results show that the proposed framework achieves state-of-the-art performance at learning invariance in both nuisance and bias settings on a diverse collection of datasets and tasks.
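The abstract describes the approach only at a high level. The sketch below illustrates the general adversarial-invariance recipe it builds on, assuming PyTorch and substituting a standard gradient-reversal adversary for the paper's forgetting mechanism; all dimensions, the LAMBDA weight, and the training_step helper are hypothetical.

```python
# Minimal sketch (not the authors' implementation): an encoder produces a
# representation, a task head predicts the label, and a discriminator tries to
# recover the unwanted factor; gradient reversal pushes the encoder to discard it.
import torch
import torch.nn as nn

DIM_X, DIM_Z, N_CLASSES, N_NUISANCE = 784, 64, 10, 5   # hypothetical sizes
LAMBDA = 1.0  # assumed weight on the invariance penalty

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad):
        return -LAMBDA * grad  # flip gradients flowing back into the encoder

encoder = nn.Sequential(nn.Linear(DIM_X, 256), nn.ReLU(), nn.Linear(256, DIM_Z))
task_head = nn.Linear(DIM_Z, N_CLASSES)
discriminator = nn.Sequential(nn.Linear(DIM_Z, 64), nn.ReLU(), nn.Linear(64, N_NUISANCE))
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(task_head.parameters()) + list(discriminator.parameters()),
    lr=1e-3,
)
ce = nn.CrossEntropyLoss()

def training_step(x, y, s):   # y: task labels, s: unwanted (nuisance/bias) factor labels
    z = encoder(x)
    task_loss = ce(task_head(z), y)                         # keep task information in z
    adv_loss = ce(discriminator(GradReverse.apply(z)), s)   # encoder learns to "forget" s
    loss = task_loss + adv_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

The discriminator minimizes its prediction loss on the unwanted factor, while the reversed gradient drives the encoder to maximize it, so the learned representation carries task information but becomes uninformative about the factor to be forgotten.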

