Characterization of Deep Neural Networks Designed for Higher Spatial Resolution in Industrial Digital X-ray Images

2021, Vol 41 (2), pp. 87-93
Author(s): Seokwon Oh, Jinwoo Kim, Ho Kyung Kim
2020, Vol 121, pp. 103792
Author(s): Tulin Ozturk, Muhammed Talo, Eylul Azra Yildirim, Ulas Baran Baloglu, Ozal Yildirim, ...

2019
Author(s): David Beniaguev, Idan Segev, Michael London

Abstract: We introduce a novel approach to study neurons as sophisticated I/O information processing units by utilizing recent advances in the field of machine learning. We trained deep neural networks (DNNs) to mimic the I/O behavior of a detailed nonlinear model of a layer 5 cortical pyramidal cell, receiving rich spatio-temporal patterns of input synapse activations. A Temporally Convolutional DNN (TCN) with seven layers was required to accurately, and very efficiently, capture the I/O of this neuron at millisecond resolution. This complexity primarily arises from local NMDA-based nonlinear dendritic conductances. The weight matrices of the DNN provide new insights into the I/O function of cortical pyramidal neurons, and the approach presented can provide a systematic characterization of the functional complexity of different neuron types. Our results demonstrate that cortical neurons can be conceptualized as multi-layered “deep” processing units, implying that the cortical networks they form have a non-classical architecture and are potentially more computationally powerful than previously assumed.
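The abstract above describes mapping rich spatio-temporal synaptic input patterns to a neuron's output with a seven-layer temporally convolutional DNN. A minimal PyTorch-style sketch of that kind of model follows; the class names (NeuronTCN, CausalConv1d), the number of synapses, layer width, kernel size, and the two per-millisecond output heads (spike probability and subthreshold voltage) are illustrative assumptions, not the authors' published configuration.

import torch
import torch.nn as nn

class CausalConv1d(nn.Conv1d):
    """1-D convolution that only sees past time steps (causal padding)."""
    def __init__(self, in_ch, out_ch, kernel_size):
        super().__init__(in_ch, out_ch, kernel_size, padding=kernel_size - 1)

    def forward(self, x):
        out = super().forward(x)
        return out[..., : x.shape[-1]]  # trim the look-ahead introduced by padding

class NeuronTCN(nn.Module):
    """Seven causal conv layers mapping a synapse-by-time activation raster to
    per-millisecond outputs, loosely following the abstract's description.
    All sizes below are arbitrary placeholders."""
    def __init__(self, n_synapses=1280, hidden=128, n_layers=7, kernel=35):
        super().__init__()
        layers, in_ch = [], n_synapses
        for _ in range(n_layers):
            layers += [CausalConv1d(in_ch, hidden, kernel), nn.ReLU()]
            in_ch = hidden
        self.backbone = nn.Sequential(*layers)
        self.spike_head = nn.Conv1d(hidden, 1, 1)    # spike probability per ms
        self.voltage_head = nn.Conv1d(hidden, 1, 1)  # subthreshold voltage per ms

    def forward(self, synapse_raster):
        # synapse_raster: (batch, n_synapses, time_ms) binary activation matrix
        h = self.backbone(synapse_raster)
        return torch.sigmoid(self.spike_head(h)), self.voltage_head(h)

# Usage: two random input patterns over 1000 ms of simulated activity.
model = NeuronTCN()
x = (torch.rand(2, 1280, 1000) < 0.01).float()
spike_prob, voltage = model(x)
print(spike_prob.shape, voltage.shape)  # both (2, 1, 1000)

Such a network would be fit by supervised regression against the detailed biophysical model's simulated spike and voltage traces, which is the training setup the abstract implies.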


2021, pp. 1-60
Author(s): Khashayar Filom, Roozbeh Farhoodi, Konrad Paul Kording

Abstract: Neural networks are versatile tools for computation, having the ability to approximate a broad range of functions. An important problem in the theory of deep neural networks is expressivity; that is, we want to understand the functions that are computable by a given network. We study real, infinitely differentiable (smooth) hierarchical functions implemented by feedforward neural networks via composing simpler functions in two cases: (1) each constituent function of the composition has fewer inputs than the resulting function and (2) constituent functions are in the more specific yet prevalent form of a nonlinear univariate function (e.g., tanh) applied to a linear multivariate function. We establish that in each of these regimes, there exist nontrivial algebraic partial differential equations (PDEs) that are satisfied by the computed functions. These PDEs are purely in terms of the partial derivatives and are dependent only on the topology of the network. Conversely, we conjecture that such PDE constraints, once accompanied by appropriate nonsingularity conditions and perhaps certain inequalities involving partial derivatives, guarantee that the smooth function under consideration can be represented by the network. The conjecture is verified in numerous examples, including the case of tree architectures, which are of neuroscientific interest. Our approach is a step toward formulating an algebraic description of functional spaces associated with specific neural networks, and may provide useful new tools for constructing neural networks.
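As a concrete worked instance of the kind of constraint the abstract describes (an illustrative derivation for the simplest tree architecture, not an equation quoted from the paper): if a smooth function is computable as $F(x,y,z) = g(h(x,y), z)$, then $F_x = g_1 h_x$ and $F_y = g_1 h_y$, where $g_1$ denotes the derivative of $g$ in its first argument. Hence, wherever $F_y \neq 0$, the ratio $F_x / F_y = h_x / h_y$ does not depend on $z$, and differentiating in $z$ yields the algebraic PDE

\[
\frac{\partial}{\partial z}\!\left(\frac{F_x}{F_y}\right) = 0
\quad\Longleftrightarrow\quad
F_y\,F_{xz} - F_x\,F_{yz} = 0,
\]

which is stated purely in terms of partial derivatives of $F$ and depends only on the topology of the composition, as the abstract asserts for general architectures.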


2019, Vol 177, pp. 285-296
Author(s): Johnatan Carvalho Souza, João Otávio Bandeira Diniz, Jonnison Lima Ferreira, Giovanni Lucca França da Silva, Aristófanes Corrêa Silva, ...
