random interaction
Recently Published Documents

TOTAL DOCUMENTS: 48 (five years: 6)
H-INDEX: 11 (five years: 1)

2020
Author(s): M. Marenda, E. Lazarova, S. van de Linde, N. Gilbert, D. Michieletto

Single-Molecule Localisation Microscopy (SMLM) allows the quantitative mapping of molecules at high resolution. However, understanding the non-random interaction of proteins requires the identification of more complex patterns than those typified by standard clustering tools. Here we introduce SuperStructure, a parameter-free algorithm to quantify structures made of inter-connected clusters, such as protein gels. SuperStructure works without a priori assumptions and is thus an ideal methodology for standardised analysis of SMLM data.
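The abstract gives no algorithmic detail of SuperStructure, so the following is only an illustrative sketch of the underlying idea of grouping localisations into inter-connected components. Unlike SuperStructure it is not parameter-free: the distance cutoff `r`, the 2D point format, and all names are our assumptions.

```python
# Illustrative sketch only: groups SMLM localisations into connected
# components via union-find. The cutoff `r` is an explicit parameter,
# unlike the parameter-free SuperStructure algorithm of the paper.
import math

def connected_components(points, r):
    """Union-find over 2D localisations: points within distance r are linked."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= r:
                union(i, j)

    groups = {}
    for i in range(len(points)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Two tight clusters joined by intermediate points form one inter-connected
# structure; the far-away point stays isolated.
pts = [(0.0, 0.0), (0.1, 0.0), (0.5, 0.0), (0.9, 0.0), (1.0, 0.0), (5.0, 5.0)]
comps = connected_components(pts, r=0.5)
print(len(comps))  # → 2
```

The component-size distribution produced this way is the kind of "beyond simple clusters" statistic that motivates quantifying inter-connected structures such as protein gels.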


Physics
2020, Vol 2 (2), pp. 184-196
Author(s): Masha Shcherbina, Brunello Tirozzi, Camillo Tassi

We find the free energy, in the thermodynamic limit, of a one-dimensional XY model associated with a system of N qubits. The coupling among the σ_i^z is a long-range two-body random interaction. The randomness in the couplings is the typical interaction of the Hopfield model with p patterns (p &lt; N), where the patterns are p sequences of N independent identically distributed random variables (i.i.d.r.v.) taking the values ±1 with probability 1/2. We also show that in the case p ≤ αN, α ≠ 0, the free energy is asymptotically independent of the choice of the patterns, i.e., it is self-averaging.
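The Hopfield-type couplings described in the abstract can be sketched as follows. The patterns are exactly as stated (p sequences of N i.i.d. ±1 variables, each value with probability 1/2); the 1/N normalisation of the coupling matrix is the conventional Hopfield choice and is our assumption, since the abstract does not spell it out.

```python
# Sketch of Hopfield-type random couplings: p patterns of N i.i.d. ±1
# variables. The 1/N prefactor is the standard Hopfield normalisation
# (an assumption here, not stated in the abstract).
import random

random.seed(0)
N, p = 8, 3  # p < N, as in the abstract

# p patterns, each a sequence of N i.i.d. ±1 random variables
xi = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(p)]

# Coupling matrix J_ij = (1/N) * sum_mu xi[mu][i] * xi[mu][j]
J = [[sum(xi[mu][i] * xi[mu][j] for mu in range(p)) / N for j in range(N)]
     for i in range(N)]

# The couplings are symmetric, and each diagonal entry equals p/N.
print(J[0][1] == J[1][0], J[0][0] == p / N)  # → True True
```

Self-averaging means that for p ≤ αN the free energy computed from one such random draw of `xi` coincides, asymptotically in N, with its average over all draws.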


2018
Vol 50 (3), pp. 944-982
Author(s): Tanguy Cabana, Jonathan D. Touboul

Abstract. In a series of two papers, we investigate the large deviations and asymptotic behavior of stochastic models of brain neural networks with random interaction coefficients. In this first paper, we take the spatial structure of the brain into account and consider, first, interaction delays that depend on the distance between cells and, second, Gaussian random interaction amplitudes whose mean and variance depend on the positions of the neurons and scale as the inverse of the network size. We show that the empirical measure satisfies a large deviations principle with a good rate function reaching its minimum at a unique spatially extended probability measure. This result implies averaged convergence of the empirical measure and propagation of chaos. The limit is characterized through a complex non-Markovian implicit equation in which the network interaction term is replaced by a nonlocal Gaussian process whose mean and covariance depend on the statistics of the solution over the whole neural field.
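To make the scaling concrete, here is a toy discrete-time sketch of a network with Gaussian random interaction amplitudes whose mean and variance scale as 1/N. It is not the paper's model: the continuous-time dynamics, spatial structure, and distance-dependent delays are all omitted, and the leak term, tanh nonlinearity, noise level, and all names are our assumptions.

```python
# Toy sketch: N units with couplings J_ij ~ Normal(mu/N, sigma2/N), i.e.
# mean and variance scaling as the inverse of the network size, driven by
# small Gaussian noise. Everything else (dynamics, parameters) is our choice.
import math
import random

random.seed(1)
N, steps, dt = 200, 50, 0.1
mu, sigma2 = 1.0, 0.5  # N * mean and N * variance of the couplings

# Gaussian random interaction amplitudes, mean and variance scaled by 1/N
J = [[random.gauss(mu / N, math.sqrt(sigma2 / N)) for _ in range(N)]
     for _ in range(N)]

x = [0.0] * N
for _ in range(steps):
    s = [math.tanh(v) for v in x]  # firing-rate nonlinearity (our choice)
    x = [xi + dt * (-xi + sum(J[i][j] * s[j] for j in range(N)))
         + math.sqrt(dt) * random.gauss(0.0, 0.1)
         for i, xi in enumerate(x)]

# Averaging over neurons gives one observable of the empirical measure,
# whose N -> infinity behavior is what the large deviations principle controls.
emp_mean = sum(x) / N
print(abs(emp_mean) < 1.0)  # → True
```

In the paper's setting, the analogue of the sum over `J[i][j] * s[j]` is replaced, in the limit, by a nonlocal Gaussian process whose statistics are determined self-consistently by the solution itself.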

