Evaluation of Neural Network Models with Generalized Sensitivity Analysis

2000 ◽  
Vol 72 (20) ◽  
pp. 5004-5013 ◽  
Author(s):  
Peter de B. Harrington ◽  
Aaron Urbas ◽  
Chuanhao Wan

2019 ◽
Vol 63 (4) ◽  
pp. 306-311 ◽  
Author(s):  
Anton Sysoev ◽  
Alessandro Ciurlia ◽  
Roman Sheglevatych ◽  
Semen Blyumin

As an initial stage preceding mathematical modeling, information processing should provide high-quality data preparation for the construction of consistent models of technical, economic, and social systems and of technological processes. Choosing the most significant input factors affecting the behavior of a system is a highly relevant and important question, and one that can be addressed with methods of Sensitivity Analysis. The purpose of the present paper is to show a possible approach to this problem through the method of the Analysis of Finite Fluctuations, which is based on the Lagrange mean value theorem, and to use it to study the sensitivity of the model under consideration. A numerical example comparing the results obtained by Sobol sensitivity coefficients, the Garson algorithm, and the proposed approach demonstrates the soundness of the introduced method. It is shown that the proposed approach is stable with respect to different input datasets. In particular, the approach has been applied to the construction of a neural network model that identifies anomalies in medical insurance claims, in order to determine the input factors most significant for anomaly detection, discard the others, and obtain a lean and efficient model.
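The core identity behind the Analysis of Finite Fluctuations is the Lagrange mean value theorem: for a model y = f(x_1, ..., x_n), the finite change f(x1) − f(x0) equals the gradient of f at some intermediate point on the segment between the two inputs, dotted with the factor increments, so each term of that dot product can be read as one factor's contribution to the output fluctuation. The following is a minimal illustrative sketch, not the authors' implementation: the toy model f, the operating points, the central-difference gradient, and the scalar root search for the intermediate point are all assumptions made for the example.

```python
# Illustrative sketch of a finite-fluctuation sensitivity estimate.
# By the Lagrange mean value theorem,
#   f(x1) - f(x0) = grad f(xi) . (x1 - x0)
# for some xi = x0 + a*(x1 - x0) with 0 < a < 1. We locate a by a
# scalar root search and read each term grad_i(xi) * (x1_i - x0_i)
# as the contribution of factor i to the total fluctuation.
import numpy as np
from scipy.optimize import brentq

def numerical_gradient(f, x, eps=1e-6):
    """Central-difference gradient of a scalar function f at x."""
    grad = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        step = np.zeros_like(x, dtype=float)
        step[i] = eps
        grad[i] = (f(x + step) - f(x - step)) / (2.0 * eps)
    return grad

def finite_fluctuation_contributions(f, x0, x1):
    """Decompose f(x1) - f(x0) into per-factor contributions."""
    dx = x1 - x0
    total = f(x1) - f(x0)

    def residual(a):
        # Zero exactly where the mean value theorem holds on the segment.
        return numerical_gradient(f, x0 + a * dx) @ dx - total

    # Assumes the residual changes sign on [0, 1], which holds for the
    # toy model below; a more careful search is needed in general.
    a_star = brentq(residual, 0.0, 1.0)
    return numerical_gradient(f, x0 + a_star * dx) * dx

# Toy usage with a hypothetical three-factor model:
f = lambda x: x[0] ** 2 + 2.0 * x[0] * x[1] + np.sin(x[2])
x0 = np.array([1.0, 0.5, 0.0])
x1 = np.array([1.5, 1.0, 0.3])
contrib = finite_fluctuation_contributions(f, x0, x1)  # sums to f(x1) - f(x0)
shares = np.abs(contrib) / np.abs(contrib).sum()       # relative significance
print(shares)
```

The relative shares |A_i| / Σ|A_j| are the kind of per-factor significance measure that the paper compares against Sobol sensitivity coefficients and the Garson algorithm; factors with negligible shares are the candidates for discarding when slimming the model.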


2018 ◽  
Author(s):  
Simen Tennøe ◽  
Geir Halnes ◽  
Gaute T. Einevoll

Abstract

Computational models in neuroscience typically contain many parameters that are poorly constrained by experimental data. Uncertainty quantification and sensitivity analysis provide rigorous procedures to quantify how the model output depends on this parameter uncertainty. Unfortunately, the application of such methods is not yet standard within the field of neuroscience.

Here we present Uncertainpy, an open-source Python toolbox, tailored to perform uncertainty quantification and sensitivity analysis of neuroscience models. Uncertainpy aims to make it easy and quick to get started with uncertainty analysis, without any need for detailed prior knowledge. The toolbox allows uncertainty quantification and sensitivity analysis to be performed on already existing models without needing to modify the model equations or model implementation. Uncertainpy bases its analysis on polynomial chaos expansions, which are more efficient than the more standard Monte Carlo-based approaches.

Uncertainpy is tailored for neuroscience applications by its built-in capability for calculating characteristic features in the model output. The toolbox does not merely perform a point-to-point comparison of the “raw” model output (e.g. membrane voltage traces), but can also calculate the uncertainty and sensitivity of salient model response features such as spike timing, action potential width, mean interspike interval, and other features relevant for various neural and neural network models. Uncertainpy comes with several common models and features built in, and including custom models and new features is easy.

The aim of the current paper is to present Uncertainpy for the neuroscience community in a user-oriented manner. To demonstrate its broad applicability, we perform an uncertainty quantification and sensitivity analysis on three case studies relevant for neuroscience: the original Hodgkin-Huxley point-neuron model for action potential generation, a multi-compartmental model of a thalamic interneuron implemented in the NEURON simulator, and a sparsely connected recurrent network model implemented in the NEST simulator.

SIGNIFICANCE STATEMENT

A major challenge in computational neuroscience is to specify the often large number of parameters that define the neuron and neural network models. Many of these parameters have an inherent variability, and some may even be actively regulated and change with time. It is important to know how the uncertainty in model parameters affects the model predictions. To address this need we here present Uncertainpy, an open-source Python toolbox tailored to perform uncertainty quantification and sensitivity analysis of neuroscience models.
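As a concrete illustration of the workflow described above, the sketch below follows the quickstart pattern from the Uncertainpy documentation (its simple cooling-model example); the model function and parameter ranges are illustrative, and the API details shown (un.Model, un.UncertaintyQuantification, quantify()) reflect the toolbox around its 2018 release and should be checked against the current documentation.

```python
import numpy as np
import chaospy as cp
import uncertainpy as un
from scipy.integrate import odeint

# The "model" is an ordinary Python function that returns
# (time, values), which is the form Uncertainpy expects.
def coffee_cup(kappa, T_env):
    time = np.linspace(0, 200, 150)     # minutes
    T_0 = 95                            # initial temperature (C)

    def dT_dt(T, t, kappa, T_env):
        return -kappa * (T - T_env)     # Newton's law of cooling

    temperature = odeint(dT_dt, T_0, time, args=(kappa, T_env))[:, 0]
    return time, temperature

# Wrap the function and describe the uncertain parameters as chaospy
# distributions; the model equations themselves are left untouched.
model = un.Model(run=coffee_cup, labels=["Time (min)", "Temperature (C)"])
parameters = {"kappa": cp.Uniform(0.025, 0.075),
              "T_env": cp.Uniform(15, 25)}

# quantify() uses polynomial chaos expansions by default and returns
# means, variances, prediction intervals, and Sobol sensitivity indices.
UQ = un.UncertaintyQuantification(model=model, parameters=parameters)
data = UQ.quantify()
```

Feature-based analysis follows the same pattern: per the paper, passing a feature set such as un.SpikingFeatures() to UncertaintyQuantification makes the uncertainty and sensitivity be computed for each feature (spike timing, interspike interval, and so on) rather than only for the raw voltage trace.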


2005 ◽  
Vol 30 (4) ◽  
pp. 1-4 ◽  
Author(s):  
K. K. Aggarwal ◽  
Yogesh Singh ◽  
Pravin Chandra ◽  
Manimala Puri
