From Hi-C Contact Map to Three-dimensional Organization of Interphase Human Chromosomes

2020 ◽  
Author(s):  
Guang Shi ◽  
D. Thirumalai

The probability that two loci in a chromosome, separated by a certain genomic distance, are in spatial contact can be inferred using the chromosome conformation capture technique and related Hi-C experiments. How to go from such contact maps to an ensemble of three-dimensional structures, an important step in understanding the way nature has solved the packaging of chromosomes of hundreds of millions of base pairs into tight spaces, is an open problem. We created a theory based on polymer physics and the maximum entropy principle, leading to the HIPPS (Hi-C-Polymer-Physics-Structures) method, which allows us to go from contact maps to 3D structures. It is difficult to calculate the mean distance ⟨rij⟩ between loci i and j from the contact probability because the contact exists only in an unknown fraction of the cell population. Despite this massive heterogeneity, we first prove that there is a theoretical lower bound connecting ⟨pij⟩ and ⟨rij⟩ via a power-law relation. We show, using simulations of an exactly solvable model, that the overall organization is accurately captured by constructing the distance map from the contact map even when the cell population is heterogeneous, thus justifying the use of the lower bound. Building on these results and using the mean distance matrix, whose elements are ⟨rij⟩, we use the maximum entropy principle to reconstruct the joint distribution of the spatial positions of the loci, which yields an ensemble of structures for the 23 chromosomes from lymphoblastoid cells. The HIPPS method shows that the conformations of a given chromosome are highly heterogeneous even in a single cell type. Nevertheless, the differences in the heterogeneity of the same chromosome in different cell types (normal as well as cancerous cells) can be quantitatively discerned using our theory.
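The first step of the pipeline described in this abstract, converting contact probabilities into mean distances via a power law, can be sketched as follows. The exponent `alpha` and prefactor `scale` below are illustrative placeholders, not the values derived in the paper:

```python
import numpy as np

def contact_to_distance(p, alpha=1.0 / 3.0, scale=1.0):
    """Map a Hi-C contact-probability matrix p to a mean-distance matrix
    via an assumed power law d_ij = scale * p_ij ** (-alpha).
    Zero entries of p would need regularization before use."""
    d = scale * np.asarray(p, dtype=float) ** (-alpha)
    np.fill_diagonal(d, 0.0)  # each locus is at zero distance from itself
    return d

# toy symmetric 3x3 contact map with probabilities in (0, 1]
p = np.array([[1.0, 0.5, 0.1],
              [0.5, 1.0, 0.4],
              [0.1, 0.4, 1.0]])
d = contact_to_distance(p)  # rarely contacting loci map to larger distances
```

The maximum entropy step then treats these mean distances as constraints on the joint distribution of the spatial positions of the loci.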

Author(s):  
Aditya Nanda ◽  
M. Amin Karami ◽  
Puneet Singla

This paper uses the method of quadratures in conjunction with the maximum entropy principle to investigate the effect of parametric uncertainties on the mean power output and root mean square (rms) deflection of piezoelectric vibrational energy harvesting systems. Uncertainty in harvester parameters could arise from insufficient manufacturing controls or from changes in material properties over time. We investigate bimorph-based harvesters that transduce ambient vibrations to electricity via the piezoelectric effect. Three varieties of energy harvesters (linear, nonlinear monostable, and nonlinear bistable) are considered in this research. The analysis quantifies the probability density functions of the mean power and rms deflection as functions of the probability densities of the excitation frequency, excitation amplitude, initial deflection of the bimorph, and magnet gap of the energy harvester. The method of quadratures numerically integrates functions by propagating weighted points from the domain and evaluating the integral as a weighted sum of the function values. In this paper, the method of quadratures is used to evaluate central moments of the distributions of rms deflection and mean harvested power; then, in conjunction with the principle of maximum entropy (MaxEnt), an optimal density function is obtained that maximizes the entropy and satisfies the moment constraints. The computed nonlinear density functions are validated against Monte Carlo simulations, thereby demonstrating the efficiency of the approach. Further, the maximum entropy principle is widely applicable to uncertainty quantification of a wide range of dynamic systems.
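The moment-computation step can be illustrated with a one-parameter toy problem. This is a generic Gauss-Hermite quadrature sketch for a normally distributed uncertain parameter, not the authors' harvester model:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def quadrature_moments(g, n_pts=10):
    """Mean and variance of g(x) for x ~ N(0, 1), computed as a
    weighted sum of g evaluated at Gauss-Hermite quadrature nodes."""
    x, w = hermegauss(n_pts)       # nodes/weights for weight exp(-x**2 / 2)
    w = w / np.sqrt(2.0 * np.pi)   # normalize the weights to sum to 1
    gx = g(x)
    mean = np.sum(w * gx)
    var = np.sum(w * (gx - mean) ** 2)
    return mean, var

# toy quadratic "response": exact answers are E = 1, Var = 2
mean, var = quadrature_moments(lambda x: x ** 2)
```

In the paper's setting, `g` would be the harvester's mean power or rms deflection as a function of the uncertain parameters, and the computed moments would serve as the constraints for the MaxEnt density estimate.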


1990 ◽  
Vol 27 (2) ◽  
pp. 303-313 ◽  
Author(s):  
Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. Furthermore, the maximum entropy principle yields a natural link between descriptive methods and some statistical structures.


Author(s):  
KAI YAO ◽  
JINWU GAO ◽  
WEI DAI

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. The maximum entropy principle is then introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy for a given expected value and variance.
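The two entropy notions can be compared numerically for a simple uncertain variable. The integrand definitions encoded below are the commonly used ones in uncertainty theory (logarithmic entropy) and a reading of this abstract (sine entropy as the integral of sin(pi * Phi)); they are assumptions to be checked against the paper, not quotations from it:

```python
import numpy as np

def entropies(a=0.0, b=1.0, n=200001):
    """Logarithmic and sine entropy of a linear uncertain variable L(a, b)
    with uncertainty distribution Phi(x) = (x - a) / (b - a), computed by
    a simple Riemann sum.  Assumed definitions:
      H_log  = integral of -Phi*ln(Phi) - (1 - Phi)*ln(1 - Phi) dx
      H_sine = integral of sin(pi * Phi) dx
    """
    x = np.linspace(a, b, n)[1:-1]  # drop endpoints to avoid log(0)
    phi = (x - a) / (b - a)
    dx = (b - a) / (n - 1)
    h_log = np.sum(-phi * np.log(phi) - (1 - phi) * np.log(1 - phi)) * dx
    h_sin = np.sum(np.sin(np.pi * phi)) * dx
    return h_log, h_sin

h_log, h_sin = entropies()  # analytic values for L(0, 1): 1/2 and 2/pi
```

Both functionals reward distributions whose Phi spends a long stretch near 1/2 (maximal ambiguity); they differ in how they weight values of Phi away from 1/2.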

