Auto-mutual information function of the EEG as a measure of depth of anesthesia

Author(s): B. Julitta, M. Vallverdu, U. S. P. Melia, N. Tupaika, M. Jospin, ...

Entropy, 2021, Vol. 23 (10), p. 1357
Author(s): Katrin Sophie Bohnsack, Marika Kaden, Julia Abel, Sascha Saralajew, Thomas Villmann

In the present article we propose the application of variants of the mutual information function as characteristic fingerprints of biomolecular sequences for classification analysis. In particular, we consider resolved mutual information functions based on Shannon, Rényi, and Tsallis entropy. Combined with interpretable machine learning classifiers based on generalized learning vector quantization, this yields a powerful methodology for sequence classification that allows substantial knowledge extraction in addition to high classification performance owing to the models' inherent robustness. Any potentially (slightly) inferior performance of such a classifier is compensated by the additional knowledge provided by interpretable models. This knowledge may assist the user in analyzing and understanding the data and the task at hand. After a theoretical justification of the concepts, we demonstrate the approach on several example data sets covering different areas of biomolecular sequence analysis.
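The resolved mutual information functions discussed above replace the Shannon entropy in the usual definition I(X; Y) = H(X) + H(Y) − H(X, Y) with a Rényi or Tsallis entropy. As a rough illustration of that idea (a minimal sketch, not the authors' implementation; function names and the natural-log convention are our own), a generalized mutual information between sequence symbols k positions apart can be written as:

```python
from collections import Counter
from math import log

def entropy(probs, alpha=1.0, kind="shannon"):
    """Shannon (alpha ignored), Renyi, or Tsallis entropy, in nats."""
    probs = [p for p in probs if p > 0]
    if kind == "shannon" or abs(alpha - 1.0) < 1e-12:
        return -sum(p * log(p) for p in probs)
    s = sum(p ** alpha for p in probs)
    if kind == "renyi":
        return log(s) / (1.0 - alpha)
    if kind == "tsallis":
        return (1.0 - s) / (alpha - 1.0)
    raise ValueError(kind)

def mutual_information_function(seq, k, alpha=1.0, kind="shannon"):
    """Generalized MI between symbols k apart: H(X) + H(Y) - H(X, Y)."""
    pairs = [(seq[i], seq[i + k]) for i in range(len(seq) - k)]
    n = len(pairs)
    freq = lambda counts: [c / n for c in counts.values()]
    return (entropy(freq(Counter(x for x, _ in pairs)), alpha, kind)
            + entropy(freq(Counter(y for _, y in pairs)), alpha, kind)
            - entropy(freq(Counter(pairs)), alpha, kind))
```

Note that for the Rényi and Tsallis cases this additivity-style combination is only one of several conventions for a generalized mutual information; the paper's "resolved" variants additionally keep the contributions per symbol pair separate rather than summing them.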


1992, Vol. 02 (01), pp. 137-154
Author(s): Wentian Li

This paper aims at understanding the statistical features of nucleic acid sequences from knowledge of the dynamical process that produces them. Two studies are carried out. First, the mutual information function of the limiting sequences generated by simple sequence-manipulation dynamics with replications and mutations is calculated numerically (sometimes analytically). It is shown that elongation and replication can easily produce long-range correlations. These long-range correlations can be destroyed to various degrees by mutation in different sequence-manipulation models. Second, mutual information functions for several human nucleic acid sequences are determined. It is observed that intron sequences (noncoding sequences) tend to have longer correlation lengths than exon sequences (protein-coding sequences).
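The first study can be mimicked in a few lines: grow a sequence by repeated whole-sequence duplication with point mutations (a simple stand-in for the paper's replication dynamics; the seed length, number of rounds, and mutation rate here are illustrative), then measure the Shannon mutual information at a distance of half the sequence length, where duplication leaves strong correlations:

```python
import random
from collections import Counter
from math import log

def mi_at_distance(seq, k):
    """Shannon mutual information between symbols k apart, in nats."""
    pairs = [(seq[i], seq[i + k]) for i in range(len(seq) - k)]
    n = len(pairs)
    def H(counts):
        return -sum(c / n * log(c / n) for c in counts.values())
    return (H(Counter(x for x, _ in pairs))
            + H(Counter(y for _, y in pairs))
            - H(Counter(pairs)))

def grow_by_duplication(seed, rounds, mu, rng):
    """Elongation by replication: append a point-mutated copy each round."""
    seq = list(seed)
    for _ in range(rounds):
        copy = [rng.choice("ACGT") if rng.random() < mu else s for s in seq]
        seq = seq + copy
    return "".join(seq)

rng = random.Random(0)
seed = [rng.choice("ACGT") for _ in range(64)]
seq = grow_by_duplication(seed, rounds=6, mu=0.05, rng=rng)
# Positions half a sequence apart are near-copies of each other, so the
# mutual information stays high even at this very large distance; a shuffled
# control sequence has mutual information near zero there.
```

Raising the mutation rate `mu` progressively destroys this long-range correlation, matching the paper's observation that mutation erodes what elongation and replication create.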


2011, Vol. 225-226, pp. 601-604
Author(s): Gao Rong Zeng, Jian Ming Liu, Ai Wen Jiang

A mutual information function is defined as a criterion for measuring the robustness of a watermarking algorithm. For the QIM scheme, the error probability of the watermark can be calculated to validate this mutual information measure. By means of numerical computation, the mutual information under Gaussian noise and under uniform noise is calculated as the noise standard deviation varies. In the experiment, an audio segment is selected as the host signal, and its third-level wavelet detail coefficients are quantized according to the watermark bit series. Experimental results show that the measured bit error rate (BER) matches the prediction of the mutual information method when the quantization step is small. The mutual information function can therefore serve as a cost function to evaluate the robustness of a watermarking algorithm and to predict the BER.
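To make the QIM setting concrete, here is a small self-contained sketch (scalar QIM on synthetic host samples rather than wavelet coefficients of audio; the step size, noise level, and names are illustrative, not the authors' setup). It embeds bits by shifting a quantization lattice, adds Gaussian noise, and compares the empirical BER with the empirical mutual information between embedded and decoded bits:

```python
import random
from collections import Counter
from math import log

def qim_embed(x, bit, step):
    """Quantization index modulation: shift the lattice by step/2 for bit 1."""
    offset = bit * step / 2
    return round((x - offset) / step) * step + offset

def qim_decode(y, step):
    """Decide by distance to the nearest point of each bit's lattice."""
    d0 = abs(y - round(y / step) * step)
    d1 = abs(y - (round((y - step / 2) / step) * step + step / 2))
    return 0 if d0 <= d1 else 1

rng = random.Random(0)
step, sigma = 1.0, 0.1
bits = [rng.randrange(2) for _ in range(5000)]
host = [rng.uniform(-10, 10) for _ in bits]
received = [qim_embed(x, b, step) + rng.gauss(0, sigma)
            for x, b in zip(host, bits)]
decoded = [qim_decode(y, step) for y in received]
ber = sum(b != d for b, d in zip(bits, decoded)) / len(bits)

# Empirical mutual information I(bit; decoded bit) from the 2x2 counts, in nats.
n = len(bits)
joint = Counter(zip(bits, decoded))
pb, pd = Counter(bits), Counter(decoded)
mi = sum(c / n * log(c * n / (pb[b] * pd[d]))
         for (b, d), c in joint.items())
```

With the noise standard deviation well below step/4, decoding errors are rare, so the BER is small and the mutual information approaches log 2 nats (one bit per embedded bit); as the noise grows, the mutual information falls toward zero in step with the rising BER, which is the relationship the abstract exploits.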

