Spatio-temporal spike train analysis for large scale networks using the maximum entropy principle and Monte Carlo method

2013 ◽ Vol 2013 (03) ◽ pp. P03006
Author(s): Hassan Nasser, Olivier Marre, Bruno Cessac

1983 ◽ Vol 49 (6) ◽ pp. 1334-1348
Author(s): J. E. Dayhoff, G. L. Gerstein

Traditional spike-train analysis methods cannot identify patterns of firing that occur frequently but at arbitrary times. It is appropriate to search for such recurring patterns because they could be used for information transfer. In this paper, we present two methods for identifying "favored patterns": patterns that occur more often than would reasonably be expected at random. The quantized Monte Carlo method identifies, and establishes the significance of, favored patterns whose detailed timing may vary but that have no extra or missing spikes. The template method identifies favored patterns whose occurrences may have extra or missing spikes; it is most useful once the results of the first method are known. Studies with simulated spike trains containing known interpolated patterns are used to establish the sensitivity and accuracy of the quantized Monte Carlo method, and certain trends with respect to parameters of the detected patterns and of the analysis methods are described. Application of these methods to neurophysiological data has shown that a large proportion of spike trains have favored patterns; these findings are described in the accompanying paper (3).
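As an illustration of the kind of analysis described above, the sketch below counts occurrences of a quantized interspike-interval pattern in a spike train and estimates significance against interval-shuffled Monte Carlo surrogates. This is a minimal sketch under stated assumptions, not the authors' procedure: the bin width, the representation of a pattern as a run of consecutive quantized intervals, and the shuffle-based surrogate test are illustrative choices.

```python
# Minimal sketch of a shuffle-based Monte Carlo test for a recurring
# interspike-interval pattern (illustrative, not the paper's exact method).
import numpy as np

def quantize_intervals(spike_times, bin_ms=5.0):
    """Quantize interspike intervals into integer bins of width bin_ms."""
    isi = np.diff(np.sort(spike_times))
    return np.floor(isi / bin_ms).astype(int)

def count_pattern(quantized, pattern):
    """Count occurrences of a consecutive quantized-interval pattern."""
    L = len(pattern)
    return sum(
        np.array_equal(quantized[i:i + L], pattern)
        for i in range(len(quantized) - L + 1)
    )

def monte_carlo_pvalue(spike_times, pattern, n_surrogates=1000, bin_ms=5.0, seed=0):
    """Fraction of interval-shuffled surrogates with at least as many occurrences."""
    rng = np.random.default_rng(seed)
    q = quantize_intervals(spike_times, bin_ms)
    observed = count_pattern(q, pattern)
    exceed = 0
    for _ in range(n_surrogates):
        shuffled = rng.permutation(q)   # destroys temporal order, keeps ISI statistics
        if count_pattern(shuffled, pattern) >= observed:
            exceed += 1
    return observed, (exceed + 1) / (n_surrogates + 1)

# Example usage with a synthetic Poisson-like train (purely illustrative):
spikes = np.cumsum(np.random.default_rng(1).exponential(20.0, size=500))
print(monte_carlo_pvalue(spikes, pattern=[2, 0, 3]))
```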


1990 ◽ Vol 27 (2) ◽ pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle models uncertainty by a maximum entropy distribution subject to appropriate linear constraints. We give an entropy concentration theorem, whose proof is based on large-deviation techniques, which provides a mathematical justification of this statistical modelling principle. We then indicate how the principle can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. Furthermore, the maximum entropy principle leads to a natural link between descriptive methods and certain statistical structures.
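To make the principle concrete, the small example below (Jaynes's die, not taken from the paper) finds the maximum entropy distribution on {1, ..., 6} subject to a single linear constraint, a prescribed mean of 4.5. The entropy maximizer has the exponential form p_i proportional to exp(lam * i), and the sketch solves for lam numerically; the specific constraint value and the solver are illustrative assumptions.

```python
# Maximum entropy distribution on {1,...,6} under a mean constraint (illustrative).
import numpy as np
from scipy.optimize import brentq

values = np.arange(1, 7)      # faces of a die
target_mean = 4.5             # the linear constraint E[X] = 4.5

def mean_given_lambda(lam):
    # Mean of the exponential-family distribution p_i ∝ exp(lam * i), minus the target.
    w = np.exp(lam * values)
    p = w / w.sum()
    return p @ values - target_mean

lam = brentq(mean_given_lambda, -10.0, 10.0)   # root of the constraint equation
p = np.exp(lam * values)
p /= p.sum()
print("maximum entropy distribution:", np.round(p, 4))
print("entropy:", -(p * np.log(p)).sum())
```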


Author(s): KAI YAO, JINWU GAO, WEI DAI

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure this uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy among variables with a given expected value and variance.
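For orientation, the sketch below numerically compares the two entropy functionals for a linear uncertainty distribution on [0, 1]. The definitions used, logarithmic entropy as the integral of S(Phi(x)) with S(t) = -t ln t - (1 - t) ln(1 - t) and sine entropy as the integral of sin(pi * Phi(x)), follow the usual statements in uncertainty theory and should be read as assumptions here rather than as the paper's exact formulation.

```python
# Numerical comparison of logarithmic and sine entropy for a linear
# uncertainty distribution (definitions assumed, see lead-in above).
import numpy as np
from scipy.integrate import quad

def linear_distribution(a, b):
    """Uncertainty distribution of a linear uncertain variable on [a, b]."""
    return lambda x: np.clip((x - a) / (b - a), 0.0, 1.0)

def logarithmic_entropy(Phi, lo, hi):
    def S(t):
        t = np.clip(t, 1e-12, 1 - 1e-12)   # avoid log(0) at the endpoints
        return -t * np.log(t) - (1 - t) * np.log(1 - t)
    return quad(lambda x: S(Phi(x)), lo, hi)[0]

def sine_entropy(Phi, lo, hi):
    return quad(lambda x: np.sin(np.pi * Phi(x)), lo, hi)[0]

Phi = linear_distribution(0.0, 1.0)
print("logarithmic entropy:", logarithmic_entropy(Phi, 0.0, 1.0))   # ~0.5
print("sine entropy:       ", sine_entropy(Phi, 0.0, 1.0))          # ~2/pi ≈ 0.6366
```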

