Average Mutual Information
Recently Published Documents

TOTAL DOCUMENTS: 37 (five years: 6)

H-INDEX: 7 (five years: 0)

Entropy, 2021, Vol. 23(10), p. 1324
Author(s): Garin Newcomb, Khalid Sayood

An important step in genome annotation is identifying the regions of a genome that code for proteins. Most annotation approaches rely on signals extracted from genomic regions to decide whether a region is protein coding. Motivated by the fact that these regions are information-bearing structures, we propose signals based on measures derived from the average mutual information for use in this task. We show that these signals can identify coding and noncoding sequences with high accuracy. We also show that the signals are robust across species, phyla, and kingdoms, and can therefore be used in species-agnostic genome annotation algorithms for identifying protein coding regions, which in turn could be used for gene identification.
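As an illustration of the kind of sequence signal the abstract describes, the sketch below computes mutual information between symbols a fixed distance apart in a sequence and averages it over several lags. The exact definition (lags 1 to `max_lag`) and the function name are this sketch's own assumptions, not the paper's precise measure.

```python
from collections import Counter
from math import log2

def average_mutual_information(seq, max_lag=9):
    """Mutual information (bits) between symbols k positions apart,
    averaged over lags k = 1..max_lag. Illustrative definition only."""
    total = 0.0
    for k in range(1, max_lag + 1):
        pairs = list(zip(seq, seq[k:]))
        n = len(pairs)
        joint = Counter(pairs)                 # empirical joint counts
        px = Counter(a for a, _ in pairs)      # marginal of first symbol
        py = Counter(b for _, b in pairs)      # marginal of second symbol
        mi = 0.0
        for (a, b), c in joint.items():
            p_ab = c / n
            # p_ab / (p_a * p_b) written with counts to avoid extra divisions
            mi += p_ab * log2(p_ab * n * n / (px[a] * py[b]))
        total += mi
    return total / max_lag
```

A repetitive (information-bearing) sequence such as `"ACG" * 50` scores high, while a constant sequence scores zero, which is the qualitative contrast such signals exploit.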


2021, Vol. 2021, pp. 1-14
Author(s): Wang Xu, Jinfei Hu

Recently, variational mode decomposition (VMD) has attracted wide attention in mechanical vibration signal analysis. However, some difficulties remain in applying VMD, such as determining the number of decomposition modes K and the quadratic penalty term α. To obtain appropriate VMD parameters, an improved parameter-adaptive VMD method based on the grey wolf optimizer (GWO) is developed that takes the minimum average mutual information as its criterion (GWOMI). First, the parameters (K, α) are adaptively determined through GWOMI. Then, the vibration signal is decomposed by the developed method and the effective modes are extracted according to maximum kurtosis. Finally, the extracted modes are processed by Hilbert envelope analysis to obtain the incipient fault features. Simulation and experimental analyses show that the developed method is effective and performs better than several existing ones.
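VMD itself is beyond a short sketch, but the optimizer at the core of the method can be illustrated. Below is a minimal grey wolf optimizer: in the paper the objective would be the minimum average mutual information between decomposed modes at a candidate (K, α), while here a simple quadratic stands in. The function name, defaults, and bounds handling are all this sketch's assumptions.

```python
import random

def gwo_minimize(f, bounds, n_wolves=20, n_iters=100, seed=0):
    """Minimal grey wolf optimizer (GWO) sketch.
    f: objective taking a list of floats; bounds: (lo, hi) per dimension."""
    rng = random.Random(seed)
    dim = len(bounds)
    wolves = [[rng.uniform(lo, hi) for lo, hi in bounds]
              for _ in range(n_wolves)]
    for it in range(n_iters):
        wolves.sort(key=f)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2 * (1 - it / n_iters)  # exploration factor decays 2 -> 0
        for i in range(n_wolves):
            new = []
            for d in range(dim):
                xs = []
                # move toward the three best wolves (alpha, beta, delta)
                for leader in (alpha, beta, delta):
                    A = 2 * a * rng.random() - a
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - wolves[i][d])
                    xs.append(leader[d] - A * D)
                lo, hi = bounds[d]
                new.append(min(hi, max(lo, sum(xs) / 3)))
            wolves[i] = new
    best = min(wolves, key=f)
    return best, f(best)
```

In the GWOMI setting, each wolf position would encode a candidate (K, α) pair and `f` would run VMD and return the minimum average mutual information among the resulting modes.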


Author(s): Amr M.S. Goneid

There is continuing interest in using Average Mutual Information (AMI) to quantify the pairwise distance between dataset profiles. Among the algorithms used to estimate AMI numerically, the histogram method is the most common because of its simplicity and low cost. However, this algorithm is known to underestimate the computed entropies and to overestimate the resulting AMI. Kernel Density Estimator (KDE)-based algorithms, developed to alleviate these systematic errors, rely on bin-level smoothing. In the present work, we propose an alternative algorithm that applies smoothing at the level of the probability distribution itself. We consider several smoothing functions, both in the probability space and in its frequency space. An experimental approach is used to investigate the effect of this modification on the computation of both entropy and AMI. Results show that, to a significant extent, the present method removes the systematic errors in computing entropy and AMI. The algorithm also leads to better reconstruction of multivariate time series when AMI is used in conjunction with their independent components.
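A minimal sketch of the contrast the abstract draws: a plug-in histogram entropy estimate versus one computed after smoothing the estimated probability distribution itself with a small kernel. The kernel weights, bin count, and range defaults here are illustrative assumptions, not the paper's choices.

```python
from math import log2

def _bin_counts(samples, n_bins, lo, hi):
    """Histogram counts over [lo, hi) with the top edge folded into the last bin."""
    counts = [0] * n_bins
    for x in samples:
        idx = min(n_bins - 1, int((x - lo) / (hi - lo) * n_bins))
        counts[idx] += 1
    return counts

def hist_entropy(samples, n_bins=32, lo=0.0, hi=1.0):
    """Plug-in (histogram) entropy estimate in bits."""
    counts = _bin_counts(samples, n_bins, lo, hi)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts if c)

def smoothed_hist_entropy(samples, n_bins=32, lo=0.0, hi=1.0,
                          kernel=(0.25, 0.5, 0.25)):
    """Entropy after smoothing the estimated probability distribution
    itself with a small kernel (distribution-level smoothing,
    a simplified stand-in for the idea in the abstract)."""
    counts = _bin_counts(samples, n_bins, lo, hi)
    n = len(samples)
    p = [c / n for c in counts]
    half = len(kernel) // 2
    smoothed = []
    for i in range(n_bins):
        s = 0.0
        for j, w in enumerate(kernel):
            k = i + j - half
            if 0 <= k < n_bins:   # kernel truncated at the edges
                s += w * p[k]
        smoothed.append(s)
    z = sum(smoothed)             # renormalize after edge truncation
    return -sum(q / z * log2(q / z) for q in smoothed if q > 0)
```

For a sharply peaked sample the plug-in estimate collapses to zero bits, while the smoothed distribution spreads mass across neighboring bins and yields a positive entropy, illustrating how smoothing counters the plug-in estimator's downward bias.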


2019, Vol. 49(5), pp. 613-629
Author(s): Dong LI, Mingquan CHENG, Yang XU, Feng YUAN, Yinan CHEN, ...
