Large Deviations for the Empirical Measure of a Markov Chain with an Application to the Multivariate Empirical Measure

1988 ◽ Vol 16 (4) ◽ pp. 1496-1508 ◽ Author(s): Richard S. Ellis

2018 ◽ Vol 50 (3) ◽ pp. 983-1004 ◽ Author(s): Tanguy Cabana, Jonathan D. Touboul

Abstract We continue the analysis of large deviations for randomly connected neural networks used as models of the brain. The originality of the model lies in the fact that the directed impact of one particle on another depends on the states of both particles and has a random Gaussian amplitude whose mean and variance scale as the inverse of the network size. As in the spatially extended case (see Cabana and Touboul (2018)), we show that, under sufficient regularity assumptions, the empirical measure satisfies a large deviations principle with a good rate function achieving its minimum at a unique probability measure. This implies, in particular, convergence of the empirical measure in both the averaged and quenched cases, as well as a propagation of chaos property (in the averaged case only). The class of models we consider notably includes a stochastic version of the Kuramoto model with random connections.
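To make the last sentence concrete, the following is a minimal Euler–Maruyama sketch of a stochastic Kuramoto model with random Gaussian coupling amplitudes whose mean and variance scale as 1/N. The sine interaction kernel and all parameter values are illustrative assumptions, not the paper's exact specification.

```python
# Illustrative sketch: stochastic Kuramoto dynamics with random Gaussian
# couplings J_ij ~ N(Jbar/N, sigma_J^2/N), i.e. mean and variance scaling
# as the inverse of the network size. Parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 200             # network size
T, dt = 10.0, 0.01  # time horizon and step
Jbar, sigma_J = 1.0, 0.5
sigma = 0.2         # intrinsic noise intensity

# Random Gaussian synaptic amplitudes: mean Jbar/N, variance sigma_J**2/N.
J = rng.normal(Jbar / N, sigma_J / np.sqrt(N), size=(N, N))

omega = rng.normal(0.0, 1.0, size=N)    # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)  # initial phases

for _ in range(int(T / dt)):
    # The interaction depends on the states of both particles:
    # entry (i, j) of the matrix below is sin(theta_j - theta_i).
    drift = omega + np.einsum("ij,ij->i", J,
                              np.sin(theta[None, :] - theta[:, None]))
    theta += drift * dt + sigma * np.sqrt(dt) * rng.normal(size=N)

# Order parameter |r| summarizes the phase coherence of the empirical measure.
r = np.abs(np.mean(np.exp(1j * theta)))
print(f"order parameter |r| = {r:.3f}")
```

In this regime the empirical measure of the phases concentrates as N grows, which is the convergence statement quantified by the large deviations principle in the abstract.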


2021 ◽ Vol 31 (6) ◽ Author(s): Joris Bierkens, Pierre Nyquist, Mikola C. Schlottke

2002 ◽ Vol 34 (2) ◽ pp. 375-393 ◽ Author(s): Nadine Guillotin-Plantard

Let (Sk)k≥0 be a Markov chain with state space E, and let (ξx)x∊E be a family of ℝp-valued random vectors independent of the Markov chain. The ξx are assumed either independent and identically distributed, or Gaussian with suitable correlations. We study the large deviations of the sum \(\sum_{k=0}^{n} \xi_{S_k}\) as \(n \to \infty\).
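A minimal Monte Carlo sketch of this random-scenery sum, assuming a finite state space, an i.i.d. Gaussian scenery, and an arbitrary transition matrix (all illustrative choices, not the paper's setting):

```python
# Illustrative sketch: sum of i.i.d. scenery values xi_{S_k} collected
# along a Markov chain (S_k) on a finite state space E. The chain,
# scenery law, and horizon are assumptions for demonstration only.
import numpy as np

rng = np.random.default_rng(1)

E = 5                                  # finite state space {0, ..., 4}
P = rng.dirichlet(np.ones(E), size=E)  # a random row-stochastic matrix
xi = rng.normal(size=E)                # i.i.d. standard Gaussian scenery

def scenery_sum(n, s0=0):
    """Simulate S_0, ..., S_n and return xi_{S_0} + ... + xi_{S_n}."""
    s, total = s0, xi[s0]
    for _ in range(n):
        s = rng.choice(E, p=P[s])
        total += xi[s]
    return total

# The tail probability P(sum/n >= a) decays exponentially in n; the large
# deviations analysis identifies the exponential rate.
n, a, trials = 200, 0.5, 2000
samples = np.array([scenery_sum(n) for _ in range(trials)])
print("P(sum/n >= a) ≈", np.mean(samples / n >= a))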


2018 ◽ Vol 50 (3) ◽ pp. 944-982 ◽ Author(s): Tanguy Cabana, Jonathan D. Touboul

Abstract In a series of two papers, we investigate the large deviations and asymptotic behavior of stochastic models of brain neural networks with random interaction coefficients. In this first paper, we take into account the spatial structure of the brain and consider, first, interaction delays that depend on the distance between cells and, second, Gaussian random interaction amplitudes whose mean and variance depend on the positions of the neurons and scale as the inverse of the network size. We show that the empirical measure satisfies a large deviations principle with a good rate function reaching its minimum at a unique spatially extended probability measure. This result implies averaged convergence of the empirical measure and propagation of chaos. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a nonlocal Gaussian process whose mean and covariance depend on the statistics of the solution over the whole neural field.
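All of the abstracts above assert that an empirical measure satisfies a large deviations principle with a good rate function. For reference, the standard formulation of that statement (the generic Dembo–Zeitouni definition, not the specific rate functions derived in these papers) reads:

```latex
% Generic statement of a large deviations principle (LDP) for empirical
% measures (\mu_N) with good rate function I; not the papers' specific I.
\[
\limsup_{N\to\infty} \frac{1}{N}\log \mathbb{P}\big(\mu_N \in F\big)
  \le -\inf_{\mu \in F} I(\mu)
  \quad \text{for every closed set } F,
\]
\[
\liminf_{N\to\infty} \frac{1}{N}\log \mathbb{P}\big(\mu_N \in G\big)
  \ge -\inf_{\mu \in G} I(\mu)
  \quad \text{for every open set } G,
\]
% "Good" means the sublevel sets \{I \le c\} are compact, so I attains its
% infimum on closed sets; if I vanishes at a unique \mu^*, these bounds
% force \mu_N \to \mu^*, which is the convergence claimed in the abstracts.
```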

