A metric space approach to the information channel capacity of spike trains

2011 ◽  
Vol 12 (S1) ◽  
Author(s):  
James B Gillespie ◽  
Conor J Houghton

2019 ◽  
Author(s):  
Demetrios Xenides ◽  
Dionisia Fostiropoulou ◽  
Dimitrios S Vlachos

There is a relentless effort to understand why some compounds can cause similar effects whether or not they are structurally similar. Chemical similarity plays an equally important role here, and we approach it via metric space theory applied to a set of analgesic drugs and euphoric compounds. The findings of the present study agree with those obtained via traditional structural indices and are, moreover, in accord with clinical findings.
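The abstract does not spell out which metric is used on the compound set. As a purely hypothetical illustration, the Jaccard distance on binary substructure fingerprints is a true metric (it satisfies identity, symmetry, and the triangle inequality) and is a common choice for compound similarity; the fingerprints below are invented for the example and are not from the study.

```python
from itertools import combinations

def jaccard_distance(a: frozenset, b: frozenset) -> float:
    """Jaccard distance 1 - |A ∩ B| / |A ∪ B|: a metric on finite sets."""
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

# Hypothetical binary substructure fingerprints for three compounds.
fingerprints = {
    "compound_A": frozenset({"hydroxyl", "phenyl", "amine"}),
    "compound_B": frozenset({"hydroxyl", "phenyl", "ester"}),
    "compound_C": frozenset({"ketone", "ester"}),
}

# Pairwise distances between the three compounds.
dist = {
    (x, y): jaccard_distance(fx, fy)
    for (x, fx), (y, fy) in combinations(fingerprints.items(), 2)
}

# Verify the metric axioms that make this a metric space.
for d in dist.values():
    assert d >= 0.0                                   # non-negativity
for fx in fingerprints.values():
    assert jaccard_distance(fx, fx) == 0.0            # identity
for p, q, r in combinations(fingerprints, 3):
    dpq = jaccard_distance(fingerprints[p], fingerprints[q])
    dqr = jaccard_distance(fingerprints[q], fingerprints[r])
    dpr = jaccard_distance(fingerprints[p], fingerprints[r])
    assert dpr <= dpq + dqr                           # triangle inequality
```

Because the distance is a metric, notions such as neighborhoods and clusters of compounds are well defined, which is what makes a metric-space treatment of similarity possible.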


2004 ◽  
Vol 04 (01) ◽  
pp. L83-L86 ◽  
Author(s):  
JONG U. KIM ◽  
LASZLO B. KISH

The error rate in a current-controlled logic microprocessor dominated by shot noise has been investigated. It is shown that the error rate increases very rapidly with increasing cutoff frequency. The maximum clock frequency at which the processor operates without errors is obtained as a function of the operational current. The information channel capacity of the system is also studied.
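The claim that the error rate grows very rapidly with cutoff frequency can be made concrete with a toy model (a sketch under simplifying assumptions, not the authors' analysis): shot noise on a signal current I has standard deviation sqrt(2qIB) over bandwidth B, and a decision threshold at I/2 is then crossed with a Gaussian tail probability. The currents and frequencies below are illustrative values only.

```python
from math import erfc, sqrt

Q_ELECTRON = 1.602e-19  # elementary charge in coulombs

def bit_error_probability(current: float, bandwidth: float) -> float:
    """Probability that shot noise pushes the current past the I/2
    decision threshold, assuming Gaussian-distributed shot noise."""
    sigma = sqrt(2.0 * Q_ELECTRON * current * bandwidth)  # shot-noise std dev
    threshold = current / 2.0
    return 0.5 * erfc(threshold / (sigma * sqrt(2.0)))    # Gaussian tail

def error_rate(current: float, clock: float) -> float:
    """Expected bit errors per second at clock frequency `clock`,
    taking the noise bandwidth equal to the clock frequency."""
    return clock * bit_error_probability(current, clock)

# Illustrative 100 nA operational current at two clock frequencies:
# raising the clock both widens the noise bandwidth (larger sigma)
# and adds more decisions per second, so the error rate explodes.
slow = error_rate(100e-9, 1e9)    # 1 GHz
fast = error_rate(100e-9, 10e9)   # 10 GHz
```

In this toy model a tenfold clock increase raises the error rate by many orders of magnitude, consistent with the abstract's "very rapidly"; conversely, for a fixed acceptable error rate the model yields a maximum clock frequency that grows with the operational current.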


2001 ◽  
Vol 01 (01) ◽  
pp. L13-L19 ◽  
Author(s):  
LASZLO B. KISH ◽  
GREGORY P. HARMER ◽  
DEREK ABBOTT

The information channel capacity of neurons is calculated in the stochastic resonance region using Shannon's formula. This quantity is an effective measure of the quality of signal transfer, unlike previously used information-theoretic calculations, which characterize only the entropy of the output and not the rate of information transfer. The Shannon channel capacity shows a well-pronounced maximum as a function of input noise intensity. The maximum is located at a higher input noise level than has been observed for classical measures such as the signal-to-noise ratio.
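The capacity maximum can be reproduced with a toy calculation (a sketch, not the paper's neuron model): take a standard two-state stochastic-resonance expression for the output SNR, which vanishes at both weak and strong noise, and insert it into Shannon's formula C = B·log2(1 + SNR). The signal amplitude, barrier height, and bandwidth below are arbitrary illustrative values.

```python
from math import exp, log2

def sr_snr(noise: float, amplitude: float = 0.1, barrier: float = 1.0) -> float:
    """Toy two-state stochastic-resonance SNR ~ (A*dU/D^2)^2 * exp(-dU/D):
    small at weak noise (no barrier crossings) and at strong noise
    (signal swamped), with a maximum at intermediate noise intensity."""
    return (amplitude * barrier / noise**2) ** 2 * exp(-barrier / noise)

def shannon_capacity(noise: float, bandwidth: float = 1.0) -> float:
    """Shannon channel capacity C = B * log2(1 + SNR) in bits/s."""
    return bandwidth * log2(1.0 + sr_snr(noise))

# Sweep the input noise intensity D and locate the capacity maximum.
noises = [0.05 * k for k in range(1, 41)]             # D from 0.05 to 2.0
capacities = [shannon_capacity(d) for d in noises]
best = max(range(len(noises)), key=capacities.__getitem__)
```

Because log2(1 + SNR) is monotone in SNR, the capacity peaks at the same noise intensity as the SNR in this toy model; the paper's result that the capacity maximum sits at *higher* noise than the classical SNR maximum requires the actual neuron model and is not reproduced here.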

