ON ALLOMETRY RELATIONS

2012 ◽  
Vol 26 (18) ◽  
pp. 1230010 ◽  
Author(s):  
DAMIEN WEST ◽  
BRUCE J. WEST

There are a substantial number of empirical relations that began with the identification of a pattern in data; were shown to have a terse power-law description; were interpreted using existing theory; reached the level of "law" and were given a name; only to subsequently fade away when it proved impossible to connect the "law" with a larger body of theory and/or data. Various forms of allometry relations (ARs) have followed this path. The ARs in biology are nearly two hundred years old, and those in ecology, geophysics, physiology and other areas of investigation are not much younger. In general, if X is a measure of the size of a complex host network and Y is a property of a complex subnetwork embedded within the host network, a theoretical AR exists between the two when Y = aX^b. We emphasize that the reductionist models of AR interpret X and Y as dynamic variables, although the ARs themselves are explicitly time independent, even though in some cases the parameter values change over time. On the other hand, the phenomenological models of AR are based on the statistical analysis of data and interpret X and Y as averages, yielding the empirical AR 〈Y〉 = a〈X〉^b. Modern explanations of AR begin with the application of fractal geometry and fractal statistics to scaling phenomena. The detailed application of fractal geometry to the explanation of theoretical ARs in living networks is slightly more than a decade old, and although well received it has not been universally accepted. An alternate perspective is given by the empirical AR, which is derived using linear regression analysis of fluctuating data sets. We emphasize that the theoretical and empirical ARs are not the same and review theories "explaining" AR from both the reductionist and statistical fractal perspectives. The probability calculus is used to systematically incorporate both views into a single modeling strategy. We conclude that the empirical AR is entailed by the scaling behavior of the probability density, which is derived using the probability calculus.
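The empirical AR 〈Y〉 = a〈X〉^b is linear in log-log coordinates, log〈Y〉 = log a + b log〈X〉, which is why it is fitted by linear regression of log-transformed data. A minimal numerical sketch (the parameter values and noise model are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Illustrative sketch: recover the allometry coefficient a and exponent b
# from fluctuating data by linear regression in log-log coordinates.
# true_a, true_b, and the multiplicative lognormal noise are assumptions.
rng = np.random.default_rng(0)
true_a, true_b = 0.07, 0.75
X = np.logspace(0, 5, 200)                                # host-network sizes
Y = true_a * X**true_b * rng.lognormal(0.0, 0.2, X.size)  # noisy subnetwork property

# log Y = log a + b log X, so an ordinary least-squares line gives (b, log a)
b_hat, log_a_hat = np.polyfit(np.log(X), np.log(Y), 1)
a_hat = np.exp(log_a_hat)
```

The slope estimates the allometry exponent b and the exponentiated intercept estimates the coefficient a; multiplicative noise becomes additive after the log transform, which is the usual justification for fitting in log space.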

Author(s):  
Bruce West ◽  
Damien West

Abstract. Allometry relations (ARs) in physiology are nearly two hundred years old. In general, if X_ij is a measure of the size of the i-th member of a complex host network from species j and Y_ij is a property of a complex subnetwork embedded within the host network, an intraspecies AR exists between the two when Y_ij = aX_ij^b. We emphasize that the reductionist models of AR interpret X_ij and Y_ij as dynamic variables, although the ARs themselves are explicitly time independent. On the other hand, the phenomenological models of AR are based on the statistical analysis of data and interpret 〈X_i〉 and 〈Y_i〉 as averages over an ensemble of individuals, yielding the interspecies AR 〈Y_i〉 = a〈X_i〉^b. Modern explanations of AR begin with the application of fractal geometry and fractal statistics to scaling phenomena. The detailed application of fractal geometry to the explanation of intraspecies ARs is a little over a decade old, and although well received it has not been universally accepted. An alternate perspective is given by the interspecies AR based on linear regression analysis of fluctuating data sets. We emphasize that the intraspecies and interspecies ARs are not the same and show that the interspecies AR can be derived from the intraspecies one only for a narrow distribution of fluctuations. This condition is not satisfied by metabolic data, as is shown separately for avian and mammal data sets. The empirical distribution of metabolic allometry coefficients is shown herein to be Pareto in form. A number of reductionist arguments conclude that the allometry exponent is universal; however, herein we derive a deterministic relation between the allometry exponent and the allometry coefficient using the fractional calculus. This covariation relation violates the universality assumption. We conclude that the interspecies physiologic AR is entailed by the scaling behavior of the probability density, which is derived using the fractional probability calculus.
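The narrow-fluctuation condition can be illustrated numerically: even when every individual obeys Y = aX^b exactly, the ensemble averages satisfy 〈Y〉 = a〈X〉^b only when the size distribution is narrow, because 〈X^b〉 ≠ 〈X〉^b for a nonlinear power. A toy sketch (the lognormal size distribution and parameter values are assumptions for illustration, not the paper's data):

```python
import numpy as np

# Illustrative sketch: the interspecies AR <Y> = a<X>^b fails for broad
# fluctuations even when the intraspecies AR Y = a X^b holds exactly,
# since <X^b> != <X>^b when b != 1. Sizes are assumed lognormal.
rng = np.random.default_rng(1)
a, b = 1.0, 0.75                    # assumed coefficient and exponent

ratios = {}
for sigma in (0.05, 1.0):           # narrow vs. broad size spread
    X = rng.lognormal(0.0, sigma, 100_000)  # individual sizes X_ij
    Y = a * X**b                             # exact intraspecies AR
    ratios[sigma] = Y.mean() / (a * X.mean()**b)
# ratios[0.05] is close to 1; ratios[1.0] deviates noticeably
```

For a lognormal size distribution the ratio is exp(b(b-1)σ²/2) in the large-sample limit, so it tends to 1 only as the spread σ → 0, which is the narrow-distribution condition stated in the abstract.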


2014 ◽  
Vol 7 (12) ◽  
pp. 12735-12794 ◽  
Author(s):  
S. Bender ◽  
M. Sinnhuber ◽  
T. von Clarmann ◽  
G. Stiller ◽  
B. Funke ◽  
...  

Abstract. We compare the nitric oxide measurements in the mesosphere and lower thermosphere (60 to 150 km) from four instruments: ACE-FTS, MIPAS, SCIAMACHY, and SMR. We use the daily zonal mean data in that altitude range for the years 2004–2010 (ACE-FTS), 2005–2012 (MIPAS), 2008–2012 (SCIAMACHY), and 2003–2012 (SMR). We first compare the data qualitatively with respect to the morphology, focussing on the major features, and then compare the time series directly and quantitatively. In three geographical regions, we compare the vertical density profiles on coincident measurement days. Since none of the instruments delivers continuous daily measurements in this altitude region, we carried out a multi-linear regression analysis. This regression analysis considers annual and semi-annual variability in the form of harmonic terms and inter-annual variability by responding linearly to the solar Lyman-α radiation index and the geomagnetic Kp index. This analysis helps to find similarities and differences in the individual data sets with respect to the inter-annual variations caused by geomagnetic and solar variability. We find that the data sets are consistent and that they only disagree on minor aspects. SMR and ACE-FTS deliver the longest time series in the mesosphere, and they both agree remarkably well. The shorter time series from MIPAS and SCIAMACHY also agree with them where they overlap. The data agree within 10 to 20% when the number densities are large, but they can differ by 50 to 100% in some cases.


1998 ◽  
Vol 06 (01n02) ◽  
pp. 135-150 ◽  
Author(s):  
D. G. Simons ◽  
M. Snellen

For a selected number of shallow water test cases of the 1997 Geoacoustic Inversion Workshop, we have applied Matched-Field Inversion to determine the geoacoustic and geometric (source location, water depth) parameters. A genetic algorithm has been applied to perform the optimization, whereas the replica fields have been calculated using a standard normal-mode model. The energy function to be optimized is based on the incoherent multi-frequency Bartlett processor. We have used the data sets provided at a few frequencies in the band 25–500 Hz for a vertical line array positioned 5 km from the source. A comparison between the inverted and true parameter values is made.
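The incoherent multi-frequency Bartlett processor correlates the measured array field with a normalized replica field at each frequency and averages the resulting powers incoherently. A minimal sketch of such an energy function (the function name and data layout are assumptions for illustration, not the authors' code):

```python
import numpy as np

# Illustrative sketch of an incoherent multi-frequency Bartlett energy:
# for each frequency, correlate the measured complex array field d with
# the normalized replica field r predicted for candidate parameters,
# then average the Bartlett powers over frequencies.
def bartlett_energy(data_fields, replica_fields):
    """data_fields, replica_fields: lists of complex vectors, one per frequency."""
    power = 0.0
    for d, r in zip(data_fields, replica_fields):
        d = d / np.linalg.norm(d)
        r = r / np.linalg.norm(r)
        power += np.abs(np.vdot(r, d))**2   # normalized Bartlett power in [0, 1]
    power /= len(data_fields)
    return 1.0 - power                       # energy: 0 for a perfect match
```

An optimizer such as a genetic algorithm then searches the geoacoustic and geometric parameter space for the model whose replica fields minimize this energy.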


Geophysics ◽  
2017 ◽  
Vol 82 (1) ◽  
pp. G1-G21 ◽  
Author(s):  
William J. Titus ◽  
Sarah J. Titus ◽  
Joshua R. Davis

We apply a Bayesian Markov chain Monte Carlo formalism to the gravity inversion of a single localized 2D subsurface object. The object is modeled as a polygon described by five parameters: the number of vertices, a density contrast, a shape-limiting factor, and the width and depth of an encompassing container. We first constrain these parameters with an interactive forward model and explicit geologic information. Then, we generate an approximate probability distribution of polygons for a given set of parameter values. From these, we determine statistical distributions such as the variance between the observed and model fields, the area, the center of area, and the occupancy probability (the probability that a spatial point lies within the subsurface object). We introduce replica exchange to mitigate trapping in local optima and to compute model probabilities and their uncertainties. We apply our techniques to synthetic data sets and a natural data set collected across the Rio Grande Gorge Bridge in New Mexico. On the basis of our examples, we find that the occupancy probability is useful in visualizing the results, giving a “hazy” cross section of the object. We also find that the role of the container is important in making predictions about the subsurface object.
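The two sampling ingredients named above can be sketched generically: a Metropolis update that accepts proposals with probability min(1, exp(-ΔE/T)), and a replica-exchange swap between chains run at different temperatures. The toy misfit below stands in for the gravity misfit of a candidate polygon model; none of this is the authors' code:

```python
import numpy as np

# Minimal illustrative sketch of Metropolis sampling with replica exchange.
# misfit() is a toy stand-in for the gravity data misfit of a polygon model.
rng = np.random.default_rng(2)

def misfit(m):
    # toy energy with a minimum at m = (1, 1, ...)
    return np.sum((m - 1.0)**2)

def metropolis_step(m, temperature, step=0.3):
    # propose a random perturbation; accept with the Metropolis probability
    proposal = m + rng.normal(0.0, step, m.shape)
    if rng.random() < np.exp((misfit(m) - misfit(proposal)) / temperature):
        return proposal
    return m

def swap_accepted(m_cold, m_hot, t_cold, t_hot):
    # replica exchange: standard swap probability between two temperatures,
    # which lets a chain trapped in a local optimum escape via the hot chain
    log_p = (misfit(m_cold) - misfit(m_hot)) * (1.0 / t_cold - 1.0 / t_hot)
    return rng.random() < np.exp(min(0.0, log_p))
```

Running several such chains at a ladder of temperatures and periodically attempting swaps is the replica-exchange scheme the abstract invokes to mitigate trapping in local optima.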


2015 ◽  
Vol 8 (10) ◽  
pp. 4171-4195 ◽  
Author(s):  
S. Bender ◽  
M. Sinnhuber ◽  
T. von Clarmann ◽  
G. Stiller ◽  
B. Funke ◽  
...  

Abstract. We compare the nitric oxide measurements in the mesosphere and lower thermosphere (60 to 150 km) from four instruments: the Atmospheric Chemistry Experiment–Fourier Transform Spectrometer (ACE-FTS), the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS), the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY), and the Sub-Millimetre Radiometer (SMR). We use the daily zonal mean data in that altitude range for the years 2004–2010 (ACE-FTS), 2005–2012 (MIPAS), 2008–2012 (SCIAMACHY), and 2003–2012 (SMR). We first compare the data qualitatively with respect to the morphology, focussing on the major features, and then compare the time series directly and quantitatively. In three geographical regions, we compare the vertical density profiles on coincident measurement days. Since none of the instruments delivers continuous daily measurements in this altitude region, we carried out a multi-linear regression analysis. This regression analysis considers annual and semi-annual variability in the form of harmonic terms and inter-annual variability by responding linearly to the solar Lyman-α radiation index and the geomagnetic Kp index. This analysis helps to find similarities and differences in the individual data sets with respect to the inter-annual variations caused by geomagnetic and solar variability. We find that the data sets are consistent and that they only disagree on minor aspects. SMR and ACE-FTS deliver the longest time series in the mesosphere, and they agree with each other remarkably well. The shorter time series from MIPAS and SCIAMACHY also agree with them where they overlap. The data agree within 30 % when the number densities are large, but they can differ by 50 to 100 % in some cases.
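The regression model described above combines annual and semi-annual harmonics with linear responses to the two proxy indices. A sketch of the corresponding design matrix, fitted by ordinary least squares (the function name and the synthetic inputs are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

# Illustrative design matrix for the multi-linear regression described in
# the abstract: offset, annual and semi-annual harmonics, and linear
# responses to the solar Lyman-alpha and geomagnetic Kp proxies.
def design_matrix(day_of_year, lyman_alpha, kp):
    t = 2.0 * np.pi * day_of_year / 365.25
    return np.column_stack([
        np.ones_like(t),                 # constant offset
        np.cos(t), np.sin(t),            # annual variation
        np.cos(2 * t), np.sin(2 * t),    # semi-annual variation
        lyman_alpha,                     # solar proxy term
        kp,                              # geomagnetic proxy term
    ])

# fit with ordinary least squares, e.g.:
# coeffs, *_ = np.linalg.lstsq(design_matrix(days, la, kp), densities, rcond=None)
```

Because the model is linear in its coefficients, gaps in the daily coverage are unproblematic: the fit simply uses whichever days each instrument measured, which is what makes the regression a useful common ground for comparing the four data sets.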


2003 ◽  
Vol 17 (22n24) ◽  
pp. 4003-4012 ◽  
Author(s):  
Mogens H. Jensen ◽  
Anders Johansen ◽  
Ingve Simonsen

We consider inverse statistics in turbulence and financial data. By inverse statistics, also sometimes called exit time statistics, we "turn" the variables around such that the fluctuating variable becomes the fixed variable, while the fixed variable becomes fluctuating. In that sense we can probe distinct regimes of the data sets. In the case of turbulence, we obtain a new set of (multi)-scaling exponents which monitor the dissipation regime. In the case of economics, we obtain a distribution of waiting times needed to achieve a predefined level of return. Such a distribution typically goes through a maximum at a time called the optimal investment horizon, since this defines the most likely waiting time for obtaining a given return ρ. By considering equal positive and negative levels of return, we report on a quantitative gain-loss asymmetry most pronounced for short horizons.
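The inversion described above fixes the return level ρ and records, from each starting time, the first waiting time at which the cumulative log-price change reaches ρ. A minimal sketch (the function name and data layout are illustrative assumptions):

```python
import numpy as np

# Illustrative inverse-statistics sketch: for a fixed return level rho,
# collect the first-passage waiting time from each starting day until the
# cumulative log-price gain first reaches rho. The histogram of these
# times typically peaks at the optimal investment horizon.
def waiting_times(log_price, rho):
    times = []
    n = len(log_price)
    for start in range(n - 1):
        gains = log_price[start + 1:] - log_price[start]
        hits = np.nonzero(gains >= rho)[0]
        if hits.size:
            times.append(hits[0] + 1)   # first passage time from this start
    return np.array(times)
```

Repeating the computation with the loss condition `gains <= -rho` and comparing the two waiting-time distributions is how the gain-loss asymmetry mentioned in the abstract would be quantified.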


2001 ◽  
Vol 11 (4) ◽  
pp. 510-519
Author(s):  
Weizhao Zhao ◽  
Ofelia F. Alonso ◽  
Judith Y. Loor ◽  
Raul Busto ◽  
Myron D. Ginsberg

Object. Using autoradiographic image averaging, the authors recently described prominent foci of marked glucose metabolism-greater-than-blood-flow uncoupling in the acutely traumatized rat brain. Because hypothermia is known to ameliorate injury in this and other injury models, the authors designed the present study to assess the effects of post-traumatic therapeutic hypothermia on the local cerebral metabolic rate of glucose (LCMRglu) and local cerebral blood flow (LCBF) following moderate parasagittal fluid-percussion head injury (FPI) in rats. Methods. Either cranial hypothermia (30°C) or normothermia (37°C) was induced for 3 hours in matched groups of rats immediately after FPI; LCMRglu and LCBF were assessed 3 hours after concluding these temperature manipulations. In rats subjected to FPI, regardless of whether normothermia or hypothermia ensued, LCBF was reduced relative to the sham-injury groups. In addition, when FPI was followed by hypothermia (FPI–30°C group), the subsequent LCBF was significantly lower (35–38% on average) than in FPI–37°C rats. Statistical mapping of LCBF difference imaging data revealed confluent cortical and subcortical zones of significantly reduced LCBF (largely ipsilateral to the prior injury) in FPI–30°C rats relative to the FPI–37°C group. Local glucose utilization was reduced in both hemispheres of FPI–37°C rats relative to the sham-injury group and was lower in the right (traumatized) hemisphere than in the left. However, LCMRglu values were largely unaffected by temperature manipulation in either the FPI or sham-injury groups. The LCMRglu/LCBF ratio was nearly doubled in FPI–30°C rats relative to the FPI–37°C group, in a diffuse and bihemispheric fashion. Linear regression analysis comparing LCMRglu and LCBF revealed that the FPI–37°C and FPI–30°C data sets were completely nonoverlapping, whereas the two sham-injury data sets were intermixed.
Conclusions. Despite its proven neuroprotective efficacy, early post-traumatic hypothermia (30°C for 3 hours) induces a moderate decline in cerebral perfusion without the (anticipated) improvement in cerebral glucose utilization, so that a state of mild metabolism-greater-than-blood-flow dissociation is perpetuated.


Plant Disease ◽  
2003 ◽  
Vol 87 (12) ◽  
pp. 1477-1486 ◽  
Author(s):  
Laura A. Furman ◽  
Norman Lalancette ◽  
James F. White

Different numbers of consecutive fungicide applications, beginning at petal fall and continuing into the summer, were examined for their effect on rusty spot epidemics. Disease progressions for each fungicide level were quantified by fitting either the logistic or monomolecular model. When the weighted absolute infection rate (ρ) and maximum disease level (Kmax) parameters were expressed as functions of the number of applications, the logistic decline model provided the best fit for five of six data sets. This model described a gradual decrease in ρ and Kmax in response to the initial fungicide application, a rapid decline in parameter values with the addition of one or two applications, and a diminished parameter response as fungicide applications continued toward the end of the epidemic. Based on examination of model behavior across all 3 years of the study, adequate management was achieved with a total of three to five fungicide applications. Additional analyses of area under the disease progress curve and final disease intensity at harvest supported these results and indicated that further reduction in fungicide usage may be possible. Unlike earlier findings, rusty spot did not significantly decrease fruit volume or weight at midseason or at harvest; as lesion density increased, fruit volume remained constant. The relationship between disease incidence and lesion density within any given year was best explained by the zero-intercept version of the exponential model. However, comparison of model parameters across years revealed significant seasonal variation. Nevertheless, the incidence-lesion density relationships were fairly uniform across years at incidence values below 0.5, where lesion density increased gradually and in a near-linear fashion.
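The two disease-progress models named above have standard closed forms: the logistic model for polycyclic epidemics and the monomolecular model for monocyclic ones. A sketch with illustrative parameter names (y0 the initial disease level, K the maximum disease level, r the rate; the values used are assumptions, not the study's estimates):

```python
import numpy as np

# Illustrative forms of the two disease-progress models fitted in the study.
def logistic(t, y0, K, r):
    # polycyclic epidemics: S-shaped rise from y0 toward the maximum K
    return K / (1.0 + (K / y0 - 1.0) * np.exp(-r * t))

def monomolecular(t, y0, K, r):
    # monocyclic epidemics: saturating rise from y0 toward K with no inflection
    return K - (K - y0) * np.exp(-r * t)
```

Fitting either curve to a disease progression yields the rate and maximum-level parameters that the study then expressed as functions of the number of fungicide applications.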


2010 ◽  
Vol 75 (4) ◽  
pp. 483-495 ◽  
Author(s):  
Slavica Eric ◽  
Marko Kalinic ◽  
Aleksandar Popovic ◽  
Halid Makic ◽  
Elvisa Civic ◽  
...  

Aqueous solubility is an important factor influencing several aspects of the pharmacokinetic profile of a drug. Numerous publications present different methodologies for the development of reliable computational models for the prediction of solubility from structure. The quality of such models can be significantly affected by the accuracy of the employed experimental solubility data. In this work, the importance of the accuracy of the experimental solubility data used for model training was investigated. Three data sets were used as training sets: Data Set 1, containing solubility data collected from various literature sources using a few criteria (n = 319); Data Set 2, created by substituting 28 values from Data Set 1 with uniformly determined experimental data from one laboratory (n = 319); and Data Set 3, created by including in Data Set 2 an additional 56 compounds for which the solubility was also determined under uniform conditions in the same laboratory (n = 375). The selection of the most significant descriptors was performed by the heuristic method, using one-parameter and multi-parameter analysis. The correlations between the most significant descriptors and solubility were established using multi-linear regression analysis (MLR) for all three investigated data sets. Notable differences were observed between the equations corresponding to different data sets, suggesting that models updated with new experimental data need to be additionally optimized. It was successfully shown that the inclusion of uniform experimental data consistently leads to an improvement in the correlation coefficients. These findings contribute to an emerging consensus that improving the reliability of solubility prediction requires the inclusion of many diverse compounds for which solubility was measured under standardized conditions in the data set.

