Information Length as a Useful Index to Understand Variability in the Global Circulation

Mathematics ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 299 ◽  
Author(s):  
Eun-jin Kim ◽  
James Heseltine ◽  
Hanli Liu

With improved measurement and modelling technology, variability has emerged as an essential feature of non-equilibrium processes. While mean values and variance have traditionally been heavily used, they are not appropriate for describing extreme events, where a significant deviation from the mean often occurs. Furthermore, stationary Probability Density Functions (PDFs) miss crucial information about the dynamics associated with variability. It is thus critical to go beyond the traditional approach and deal with time-dependent PDFs. Here, we consider atmospheric data from the Whole Atmosphere Community Climate Model (WACCM) and calculate time-dependent PDFs and, from these PDFs, the information length, which is the total number of statistically different states that a system evolves through in time. Specifically, we consider three cases of sampled data to investigate the distribution of information (the information budget) along altitude and longitude, gaining a new perspective on variability and on correlations among different variables and regions. Time-dependent PDFs are shown to be non-Gaussian in general; the information length tends to increase with altitude, albeit in a complex form, and this tendency is more robust for flows/shears than for temperature. The information lengths of flows and shears are also much more similar to one another than to that of temperature, indicating a strong correlation among flows/shears due to their coupling through gravity waves in this particular WACCM model. We also find that the information length increases with latitude, together with an interesting hemispheric asymmetry for flows/shears/temperature and a tendency toward anti-correlation (correlation) between flows/shears and temperature at high (low) latitudes. These results suggest the importance of high latitudes/altitudes in the information budget of the Earth’s atmosphere, with the spatial gradient of the information length being a useful proxy for information flow.
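The information length described above has a standard definition: the instantaneous rate of statistical change is E(t) = ∫ (1/p)(∂p/∂t)² dx, and L(τ) = ∫₀^τ √E(t) dt counts the statistically distinguishable states traversed up to time τ. A minimal numerical sketch (not the authors' code; grid and test PDF are illustrative) might look like:

```python
import numpy as np

# Information length L(t) of a time-dependent PDF p(x, t):
#   E(t) = ∫ (1/p) (dp/dt)^2 dx,   L(tau) = ∫_0^tau sqrt(E(t)) dt.

def information_length(p, dx, dt):
    """p: array of shape (nt, nx) of PDFs on a grid; returns L at each time."""
    dpdt = np.gradient(p, dt, axis=0)            # finite-difference dp/dt
    eps = 1e-30                                  # guard for near-zero density
    E = np.sum(dpdt**2 / (p + eps), axis=1) * dx # E(t)
    return np.concatenate([[0.0], np.cumsum(np.sqrt(E[:-1]) * dt)])

# Illustrative check: a unit-width Gaussian whose mean drifts at unit speed.
# Analytically dL/dt = |dmu/dt| / sigma = 1, so L(T) should be close to T.
x = np.linspace(-10, 10, 2001)
t = np.linspace(0, 2, 401)
sigma = 1.0
p = np.stack([np.exp(-(x - ti)**2 / (2 * sigma**2))
              / np.sqrt(2 * np.pi * sigma**2) for ti in t])
L = information_length(p, x[1] - x[0], t[1] - t[0])
```

Because L is a cumulative arc length in statistical space, it is monotone by construction, which is what makes its spatial gradient a candidate proxy for information flow.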

2017 ◽  
pp. 136-152 ◽  
Author(s):  
V. Gazman

If securitization is to become one of the main channels for attracting funding in leasing activity, as the Bank of Russia predicts, some stereotypes need to be revised. Relying on foreign and domestic research, the author gives a critical assessment of the postulate that securitized assets must be uniform; shows that, contrary to the traditional approach, real estate rather than equipment and transport prevails in securitization transactions, and explains why this happens. The article presents a new perspective on the behavior of issuers concerning the timing of securities circulation; considers the feasibility of an approach to calculating the variable character of leverage in leasing; explains the pros and cons of evaluating the leasing market based on the volume of the portfolio of contracts; and examines the validity of ratings of bonds issued in the course of securitization of leasing assets.


2014 ◽  
Vol 27 (8) ◽  
pp. 2931-2947 ◽  
Author(s):  
Ed Hawkins ◽  
Buwen Dong ◽  
Jon Robson ◽  
Rowan Sutton ◽  
Doug Smith

Abstract Decadal climate predictions exhibit large biases, which are often subtracted and forgotten. However, understanding the causes of bias is essential to guide efforts to improve prediction systems, and may offer additional benefits. Here the origins of biases in decadal predictions are investigated, including whether analysis of these biases might provide useful information. The focus is especially on the lead-time-dependent bias tendency. A “toy” model of a prediction system is initially developed and used to show that there are several distinct contributions to bias tendency. Contributions from sampling of internal variability and a start-time-dependent forcing bias can be estimated and removed to obtain a much improved estimate of the true bias tendency, which can provide information about errors in the underlying model and/or errors in the specification of forcings. It is argued that the true bias tendency, not the total bias tendency, should be used to adjust decadal forecasts. The methods developed are applied to decadal hindcasts of global mean temperature made using the Hadley Centre Coupled Model, version 3 (HadCM3), climate model, and it is found that this model exhibits a small positive bias tendency in the ensemble mean. When considering different model versions, it is shown that the true bias tendency is very highly correlated with both the transient climate response (TCR) and non–greenhouse gas forcing trends, and can therefore be used to obtain observationally constrained estimates of these relevant physical quantities.
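The decomposition described in this abstract can be illustrated with a toy setup in its spirit (all numbers and variable names below are illustrative assumptions, not taken from the paper): hindcast errors are modelled as a true lead-time-dependent drift plus a start-time-dependent offset plus internal-variability noise, and the true bias tendency is recovered after removing the per-start offset.

```python
import numpy as np

# Toy sketch of lead-time-dependent bias-tendency estimation (illustrative).
rng = np.random.default_rng(0)

n_starts, n_leads = 40, 10            # number of hindcasts and lead times (yr)
true_tendency = 0.02                  # assumed model drift, K per year
lead = np.arange(1, n_leads + 1, dtype=float)

# error(start, lead) = true drift + start-dependent offset + internal noise
start_offset = rng.normal(0.0, 0.1, size=(n_starts, 1))
noise = rng.normal(0.0, 0.05, size=(n_starts, n_leads))
error = true_tendency * lead + start_offset + noise

# Removing each hindcast's own mean strips the start-dependent contribution,
# leaving the lead-time-dependent component; a least-squares fit in lead time
# then estimates the bias tendency.
anomaly = error - error.mean(axis=1, keepdims=True)
est_tendency = np.polyfit(lead - lead.mean(), anomaly.mean(axis=0), 1)[0]
```

In this toy version the estimate converges on the prescribed drift as the number of start dates grows, mirroring the paper's point that sampling and start-time-dependent contributions can be separated from the true tendency.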


2018 ◽  
Vol 16 ◽  
pp. 01007
Author(s):  
Rodolfo A. Fiorini

Science does not exist only to enlighten people's minds; it mainly exists to show the educated way from quanta to qualia. And that way starts from computational competence. In previous papers published elsewhere, we have already shown that traditional Q Arithmetic can be regarded as a highly sophisticated open logic, a powerful and flexible bidirectional formal language of languages, according to the new perspective of "Computational Information Conservation Theory" (CICT). This new awareness can offer a competitive approach to guide more effective and convenient algorithm development and application to arbitrary multiscale (AMS) biomedical system modeling and simulation. An articulated example of function computational modelling is presented and compared to the standard, well-known traditional approach. Results are critically discussed.


1985 ◽  
Vol 22 (03) ◽  
pp. 503-517
Author(s):  
Helmut Pruscha

The present paper deals with continuous-time Markov branching processes allowing immigration. The immigration rate is allowed to be random and time-dependent where randomness may stem from an external source or from state-dependence. Unlike the traditional approach, we base the analysis of these processes on the theory of multivariate point processes. Using the tools of this theory, asymptotic results on parametric inference are derived for the subcritical case. In particular, the limit distributions of some parametric estimators and of Pearson-type statistics for testing simple and composite hypotheses are established.


Author(s):  
Jochen Rau

Recent advances in quantum technology – from quantum computers and simulators to communication and metrology – have not only opened up a whole new world of applications but also changed the understanding of quantum theory itself. This text introduces quantum theory entirely from this new perspective. It does away with the traditional approach to quantum theory as a theory of microscopic matter, and focuses instead on quantum theory as a framework for information processing. Accordingly, the emphasis is on concepts like measurement, probability, statistical correlations, and transformations, rather than waves and particles. The text begins with experimental evidence that forces one to abandon the classical description and to re-examine such basic notions as measurement, probability, and state. Thorough investigation of these concepts leads to the alternative framework of quantum theory. The requisite mathematics is developed and linked to its operational meaning. This part of the text culminates in an exploration of some of the most vexing issues of quantum theory, regarding locality, non-contextuality, and realism. The second half of the text explains how the peculiar features of quantum theory are harnessed to tackle information processing tasks that are intractable or even impossible classically. It provides the tools for understanding and designing the pertinent protocols, and discusses a range of examples representative of current quantum technology.


2018 ◽  
Vol 217 (11) ◽  
pp. 4025-4048 ◽  
Author(s):  
Yu Chen ◽  
Yang Zhang ◽  
Yuchuan Wang ◽  
Liguo Zhang ◽  
Eva K. Brinkman ◽  
...  

While nuclear compartmentalization is an essential feature of three-dimensional genome organization, no genomic method exists for measuring chromosome distances to defined nuclear structures. In this study, we describe TSA-Seq, a new mapping method capable of providing a “cytological ruler” for estimating mean chromosomal distances from nuclear speckles genome-wide and for predicting several Mbp chromosome trajectories between nuclear compartments without sophisticated computational modeling. Ensemble-averaged results in K562 cells reveal a clear nuclear lamina to speckle axis correlated with a striking spatial gradient in genome activity. This gradient represents a convolution of multiple spatially separated nuclear domains including two types of transcription “hot zones.” Transcription hot zones protruding furthest into the nuclear interior and positioning deterministically very close to nuclear speckles have higher numbers of total genes, the most highly expressed genes, housekeeping genes, genes with low transcriptional pausing, and super-enhancers. Our results demonstrate the capability of TSA-Seq for genome-wide mapping of nuclear structure and suggest a new model for spatial organization of transcription and gene expression.


1997 ◽  
Vol 17 (3_suppl) ◽  
pp. 42-45 ◽  
Author(s):  
David N. Churchill

The theoretical constructs indicate that, for a 70 kg high-transport anephric patient, adequate dialysis requires a weekly Kt/V of 2.0–2.25 (1–5). The prospective cohort studies, with one exception, suggest that better survival requires a weekly Kt/V &gt;1.89 (7–11). Multivariate analyses confirm the statistical association of patient survival with higher Kt/V (14–17) and with higher CCr (16,17). The use of initial values in one study (15), mean values in two studies (14,16), and time-dependent values in another (17) makes comparison difficult. In general, higher values are associated with better survival and are consistent with the values suggested by the theoretical constructs (i.e., Kt/V 2.0–2.25).
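The weekly Kt/V index discussed above is simple arithmetic: K is urea clearance, t is weekly treatment time, and V is the urea distribution volume, commonly approximated as a fraction of body weight. A hedged sketch (the water fraction and clearance figures are illustrative assumptions, not values from this paper):

```python
# Weekly Kt/V = K (mL/min) x t (min/week) / V (mL).
# V is approximated as ~58% of body weight for the 70 kg patient above.

def weekly_ktv(clearance_ml_min, minutes_per_week, weight_kg, water_frac=0.58):
    v_ml = weight_kg * water_frac * 1000.0   # urea distribution volume (mL)
    return clearance_ml_min * minutes_per_week / v_ml

# e.g. a continuous modality delivering ~8 mL/min of urea clearance all week:
ktv = weekly_ktv(8.0, 7 * 24 * 60, 70.0)    # lands close to the 2.0 target
```

This makes clear why the theoretical target of a weekly Kt/V near 2.0–2.25 translates into demanding combinations of clearance and treatment time for an anephric patient.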


2019 ◽  
Vol 29 (6) ◽  
pp. 1297-1315 ◽  
Author(s):  
Filip Tronarp ◽  
Hans Kersting ◽  
Simo Särkkä ◽  
Philipp Hennig

Abstract We formulate probabilistic numerical approximations to solutions of ordinary differential equations (ODEs) as problems in Gaussian process (GP) regression with nonlinear measurement functions. This is achieved by defining the measurement sequence to consist of the observations of the difference between the derivative of the GP and the vector field evaluated at the GP—which are all identically zero at the solution of the ODE. When the GP has a state-space representation, the problem can be reduced to a nonlinear Bayesian filtering problem and all widely used approximations to the Bayesian filtering and smoothing problems become applicable. Furthermore, all previous GP-based ODE solvers that are formulated in terms of generating synthetic measurements of the gradient field come out as specific approximations. Based on the nonlinear Bayesian filtering problem posed in this paper, we develop novel Gaussian solvers for which we establish favourable stability properties. Additionally, non-Gaussian approximations to the filtering problem are derived by the particle filter approach. The resulting solvers are compared with other probabilistic solvers in illustrative experiments.
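The filtering formulation above can be made concrete with a minimal extended-Kalman-filter ODE solver: the prior is a once-integrated Wiener process over the state [y, y'], and the synthetic measurement asserts that y' − f(y) = 0 at each grid point, linearized at the predicted mean. This is a hedged sketch of the general idea (grid, test ODE, and diffusion value are illustrative choices, not the paper's exact solvers):

```python
import numpy as np

# EKF probabilistic ODE solver with a once-integrated Wiener process prior.
# State m = [y, y']; "measurement" 0 = y' - f(y), linearized at the mean.

def ekf_ode_solve(f, df, y0, t_end, h, q=1.0):
    A = np.array([[1.0, h], [0.0, 1.0]])             # IWP(1) transition
    Q = q * np.array([[h**3 / 3, h**2 / 2],
                      [h**2 / 2, h]])                # process noise
    m = np.array([y0, f(y0)])                        # initialize on the ODE
    P = np.zeros((2, 2))
    ts, ys = [0.0], [y0]
    for _ in range(int(round(t_end / h))):
        m, P = A @ m, A @ P @ A.T + Q                # predict
        z = m[1] - f(m[0])                           # residual of y' - f(y)
        H = np.array([-df(m[0]), 1.0])               # linearized measurement
        S = H @ P @ H + 1e-12                        # innovation variance
        K = P @ H / S                                # Kalman gain
        m, P = m - K * z, P - np.outer(K, H @ P)     # update on z = 0
        ts.append(ts[-1] + h); ys.append(m[0])
    return np.array(ts), np.array(ys)

# Logistic ODE y' = y(1 - y), y(0) = 0.1, with a known exact solution.
f = lambda y: y * (1.0 - y)
df = lambda y: 1.0 - 2.0 * y
ts, ys = ekf_ode_solve(f, df, y0=0.1, t_end=5.0, h=0.01)
exact = 1.0 / (1.0 + 9.0 * np.exp(-ts))
```

Swapping the Kalman update for a particle-filter update is what yields the non-Gaussian solvers the abstract mentions; the measurement model stays the same.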


2016 ◽  
Vol 23 (4) ◽  
pp. 275-300 ◽  
Author(s):  
M.F. Huang ◽  
Song Huang ◽  
He Feng ◽  
Wenjuan Lou
