Information Rate Analysis of a Synaptic Release Site Using a Two-State Model of Short-Term Depression

2017 ◽  
Vol 29 (6) ◽  
pp. 1528-1560 ◽  
Author(s):  
Mehrdad Salmasi ◽  
Martin Stemmler ◽  
Stefan Glasauer ◽  
Alex Loebel

Synapses are the communication channels for information transfer between neurons; these are the points at which pulse-like signals are converted into the stochastic release of quantized amounts of chemical neurotransmitter. At many synapses, prior neuronal activity depletes synaptic resources, depressing subsequent responses of both spontaneous and spike-evoked releases. We analytically compute the information transmission rate of a synaptic release site, which we model as a binary asymmetric channel. Short-term depression is incorporated by assigning the channel a memory of depth one. A successful release, whether spike evoked or spontaneous, decreases the probability of a subsequent release; if no release occurs on the following time step, the release probabilities recover back to their default values. We prove that synaptic depression can increase the release site’s information rate if spontaneous release is more strongly depressed than spike-evoked release. When depression affects spontaneous and evoked release equally, the information rate must invariably decrease, even when the rate is normalized by the resources used for synaptic transmission. For identical depression levels, we analytically disprove the hypothesis, at least in this simplified model, that synaptic depression serves energy- and information-efficient encoding.
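The binary asymmetric channel underlying this model is easy to evaluate in the memoryless case (i.e., without the depth-one memory that implements depression): with spike probability p, evoked release probability q, and spontaneous release probability s, the information per time step is I(X;Y) = H(Y) − H(Y|X). A minimal sketch, with all parameter values hypothetical:

```python
import math

def h2(p):
    # Binary entropy in bits; h2(0) = h2(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bac_mutual_information(p_spike, p_evoked, p_spont):
    """Per-time-step information I(X;Y) = H(Y) - H(Y|X) of a
    memoryless binary asymmetric channel: the input spikes with
    probability p_spike; a release occurs with probability p_evoked
    given a spike, and with probability p_spont otherwise."""
    p_release = p_spike * p_evoked + (1 - p_spike) * p_spont
    return (h2(p_release)
            - p_spike * h2(p_evoked)
            - (1 - p_spike) * h2(p_spont))
```

For example, `bac_mutual_information(0.5, 0.9, 0.1)` evaluates to 1 − h2(0.9) ≈ 0.531 bits per step, while equal evoked and spontaneous release probabilities yield zero: the output then carries no information about the spike train.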

2019 ◽  
Vol 15 (1) ◽  
pp. e1006666 ◽  
Author(s):  
Mehrdad Salmasi ◽  
Alex Loebel ◽  
Stefan Glasauer ◽  
Martin Stemmler

Cell Reports ◽  
2018 ◽  
Vol 22 (12) ◽  
pp. 3339-3350 ◽  
Author(s):  
Daehun Park ◽  
Unghwi Lee ◽  
Eunji Cho ◽  
Haiyan Zhao ◽  
Jung Ah Kim ◽  
...  

Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 756 ◽  
Author(s):  
Mehrdad Salmasi ◽  
Martin Stemmler ◽  
Stefan Glasauer ◽  
Alex Loebel

Action potentials (spikes) can trigger the release of a neurotransmitter at chemical synapses between neurons. Such release is uncertain, as it occurs only with a certain probability. Moreover, synaptic release can occur independently of an action potential (asynchronous release) and depends on the history of synaptic activity. We focus here on short-term synaptic facilitation, in which a sequence of action potentials can temporarily increase the release probability of the synapse. In contrast to the phenomenon of short-term depression, quantifying the information transmission in facilitating synapses remains to be done. We find rigorous lower and upper bounds for the rate of information transmission in a model of synaptic facilitation. We treat the synapse as a two-state binary asymmetric channel, in which the arrival of an action potential shifts the synapse to a facilitated state, while in the absence of a spike, the synapse returns to its baseline state. The information bounds are functions of both the asynchronous and synchronous release parameters. If synchronous release facilitates more than asynchronous release, the mutual information rate increases. In contrast, short-term facilitation degrades information transmission when the synchronous release probability is intrinsically high. As synaptic release is energetically expensive, we exploit the information bounds to determine the energy–information trade-off in facilitating synapses. We show that unlike information rate, the energy-normalized information rate is robust with respect to variations in the strength of facilitation.
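The bounds derived in the paper require tracking the channel memory exactly; as a rough illustrative sketch (not the paper's method), one can exploit the memory-depth-one structure: the synapse is in the facilitated state at time t exactly when a spike arrived at t−1, so averaging the per-state channel information over the stationary state occupancy gives a simple heuristic estimate. All function names and parameter values below are hypothetical:

```python
import math

def h2(p):
    # Binary entropy in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bac_info(p_spike, q, s):
    # Per-step information of a binary asymmetric channel with
    # synchronous release probability q and asynchronous probability s.
    p_rel = p_spike * q + (1 - p_spike) * s
    return h2(p_rel) - p_spike * h2(q) - (1 - p_spike) * h2(s)

def state_averaged_info(p_spike, q_base, s_base, q_fac, s_fac):
    """Heuristic estimate: the synapse is facilitated at time t iff a
    spike arrived at t-1 (probability p_spike), so average the
    per-state channel information over the state occupancy."""
    return ((1 - p_spike) * bac_info(p_spike, q_base, s_base)
            + p_spike * bac_info(p_spike, q_fac, s_fac))

def energy_normalized_info(p_spike, q_base, s_base, q_fac, s_fac):
    # Bits per release event: information divided by the expected
    # number of releases, a crude proxy for the energetic cost.
    p_rel = ((1 - p_spike) * (p_spike * q_base + (1 - p_spike) * s_base)
             + p_spike * (p_spike * q_fac + (1 - p_spike) * s_fac))
    return state_averaged_info(p_spike, q_base, s_base, q_fac, s_fac) / p_rel
```

When the facilitated and baseline parameters coincide, the estimate reduces to the plain binary-asymmetric-channel information, as it should; the energy-normalized variant mirrors the trade-off the abstract discusses.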


Author(s):  
D. Van Dyck

An (electron) microscope can be considered as a communication channel that transfers structural information between an object and an observer. In electron microscopy this information is carried by electrons. According to Shannon's theory, the maximal information rate (or capacity) of a communication channel is given by C = B log2 (1 + S/N) bits/sec., where B is the bandwidth, and S and N are the average signal power and noise power at the output, respectively. We will now apply this result to study the information transfer in an electron microscope. For simplicity we will assume the object and the image to be one-dimensional (the results can straightforwardly be generalized). An imaging device can be characterized by its transfer function, which describes the magnitude with which a spatial frequency g is transferred through the device; n denotes the noise. Usually, the resolution ρ of the instrument is defined from the cut-off 1/ρ beyond which no spatial information is transferred.
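As a quick numerical check of the capacity formula quoted above (the values are arbitrary):

```python
import math

def shannon_capacity(bandwidth_hz, signal_power, noise_power):
    """Maximal information rate C = B * log2(1 + S/N), in bits per
    second, of a band-limited channel with average signal power S
    and average noise power N at the output."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)
```

Doubling the bandwidth doubles the capacity, whereas the signal-to-noise ratio enters only logarithmically: `shannon_capacity(1000.0, 15.0, 1.0)` gives 4000 bits/s.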


Entropy ◽  
2021 ◽  
Vol 23 (5) ◽  
pp. 533
Author(s):  
Milan S. Derpich ◽  
Jan Østergaard

We present novel data-processing inequalities relating the mutual information and the directed information in systems with feedback. The internal deterministic blocks within such systems are restricted only to be causal mappings, but they are allowed to be non-linear and time-varying and, randomized by their own external random inputs, can yield any stochastic mapping. These randomized blocks can, for example, represent source encoders, decoders, or even communication channels. Moreover, the involved signals can be arbitrarily distributed. Our first main result relates mutual and directed information and can be interpreted as a law of conservation of information flow. Our second main result is a pair of data-processing inequalities (one the conditional version of the other) between nested pairs of random sequences entirely within the closed loop. Our third main result introduces and characterizes the notion of in-the-loop (ITL) transmission rate for channel-coding scenarios in which the messages are internal to the loop. Interestingly, in this case the conventional notions of transmission rate, associated with the entropy of the messages, and of channel capacity, based on maximizing the mutual information between the messages and the output, turn out to be inadequate. Instead, as we show, the ITL transmission rate is the unique notion of rate for which a channel code attains zero error probability if and only if the ITL rate does not exceed the corresponding directed information rate from messages to decoded messages. We apply our data-processing inequalities to show that the supremum of achievable (in the usual channel-coding sense) ITL transmission rates is upper bounded by the supremum of the directed information rate across the communication channel. Moreover, we present an example in which this upper bound is attained.
Finally, we further illustrate the applicability of our results by discussing how they enable the generalization of two fundamental inequalities known in the networked control literature.
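The directed information these results build on, I(X^n → Y^n) = Σ_{i=1}^{n} I(X^i; Y_i | Y^{i−1}), can be computed by brute-force marginalization for small alphabets. A self-contained sketch (the example channel is illustrative, not taken from the paper):

```python
import itertools
import math
from collections import defaultdict

def cond_mutual_info(pmf, a_idx, b_idx, c_idx):
    """I(A; B | C) in bits, where a_idx, b_idx, c_idx select the
    coordinates of A, B, C in each outcome tuple of the joint pmf
    (a dict mapping outcome tuples to probabilities)."""
    def marg(idx):
        m = defaultdict(float)
        for outcome, p in pmf.items():
            m[tuple(outcome[i] for i in idx)] += p
        return m
    p_abc = marg(a_idx + b_idx + c_idx)
    p_ac = marg(a_idx + c_idx)
    p_bc = marg(b_idx + c_idx)
    p_c = marg(c_idx)
    total = 0.0
    for outcome, p in pmf.items():
        if p == 0.0:
            continue
        a = tuple(outcome[i] for i in a_idx)
        b = tuple(outcome[i] for i in b_idx)
        c = tuple(outcome[i] for i in c_idx)
        total += p * math.log2(p_abc[a + b + c] * p_c[c]
                               / (p_ac[a + c] * p_bc[b + c]))
    return total

def directed_info(pmf, n):
    """Massey's directed information I(X^n -> Y^n) =
    sum_{i=1}^{n} I(X^i; Y_i | Y^{i-1}); outcomes in pmf are
    tuples (x_1, ..., x_n, y_1, ..., y_n)."""
    total = 0.0
    for i in range(1, n + 1):
        x_idx = tuple(range(i))               # X_1 .. X_i
        y_now = (n + i - 1,)                  # Y_i
        y_past = tuple(range(n, n + i - 1))   # Y_1 .. Y_{i-1}
        total += cond_mutual_info(pmf, x_idx, y_now, y_past)
    return total

# Example: two uses of a memoryless binary symmetric channel without
# feedback, where directed and mutual information coincide.
eps = 0.1
pmf = {}
for x1, x2, y1, y2 in itertools.product((0, 1), repeat=4):
    p = 0.25                                  # X_1, X_2 i.i.d. fair bits
    p *= (1 - eps) if y1 == x1 else eps       # BSC use 1
    p *= (1 - eps) if y2 == x2 else eps       # BSC use 2
    pmf[(x1, x2, y1, y2)] = p
```

Without feedback the directed information coincides with the mutual information I(X^n; Y^n), as in this example; in closed loop the two generally differ, which is what the conservation law and data-processing inequalities above address.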

