Global Predictability of Chaotic Epidemiological Dynamics in Coupled Populations

2003 ◽  
Vol 10 (04) ◽  
pp. 311-320
Author(s):  
Matt Davison ◽  
C. Essex ◽  
J. S. Shiner

When the dynamics of an epidemic are chaotic, detailed prediction is effectively impossible, except perhaps in the short term. However, a probability distribution underlying the motion does allow for the long-term prediction of statistical measures such as the mean or the standard deviation. Even this weaker long-term predictability might be lost if distinct populations with chaotic dynamics are coupled. We show that such coupling can result in a phenomenon we call “sensitive dependence on neglected dynamics”. In light of this phenomenon, it is somewhat surprising that when two logistic maps are coupled, the long-term predictability of the mean and standard deviation is maintained. This is true even though the probability distribution describing the time series depends on the coupling strength. The coupling-strength dependence does reveal itself in the loss of predictability of higher-order moments such as skewness and kurtosis.
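A minimal sketch of the kind of experiment described, assuming symmetric diffusive coupling of two logistic maps at r = 4; the coupling form, parameter values, and moment comparison are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from scipy.stats import skew, kurtosis

def coupled_logistic(eps, r=4.0, n_steps=200_000, n_discard=1_000, seed=0):
    """Iterate two diffusively coupled logistic maps; return the x time series."""
    rng = np.random.default_rng(seed)
    x, y = rng.uniform(0.1, 0.9, size=2)
    xs = np.empty(n_steps)
    for i in range(n_steps + n_discard):
        fx, fy = r * x * (1 - x), r * y * (1 - y)
        # symmetric diffusive coupling with strength eps (illustrative form)
        x, y = (1 - eps) * fx + eps * fy, (1 - eps) * fy + eps * fx
        if i >= n_discard:
            xs[i - n_discard] = x
    return xs

for eps in (0.0, 0.05, 0.10):
    xs = coupled_logistic(eps)
    print(f"eps={eps:.2f}  mean={xs.mean():.4f}  std={xs.std():.4f}  "
          f"skew={skew(xs):.3f}  kurt={kurtosis(xs):.3f}")
```

Comparing rows as eps grows is how one would look for the reported pattern: mean and standard deviation nearly unchanged, skewness and kurtosis drifting with coupling strength.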

2018 ◽  
Vol 11 (7) ◽  
pp. 4059-4072 ◽  
Author(s):  
Sergio Fabián León-Luis ◽  
Alberto Redondas ◽  
Virgilio Carreño ◽  
Javier López-Solano ◽  
Alberto Berjón ◽  
...  

Abstract. Total ozone column measurements can be made using Brewer spectrophotometers, which are calibrated periodically in intercomparison campaigns with respect to a reference instrument. In 2003, the Regional Brewer Calibration Centre for Europe (RBCC-E) was established at the Izaña Atmospheric Research Center (Canary Islands, Spain), and since 2011 the RBCC-E has transferred its calibration, based on the Langley method, using travelling standards that are wholly and independently calibrated at Izaña. This work reports the consistency of the measurements of the RBCC-E triad (Brewer instruments #157, #183 and #185) made at the Izaña Atmospheric Observatory during the period 2005–2016. To study the long-term precision of the RBCC-E triad, it must be taken into account that each Brewer takes a large number of measurements every day; hence it becomes necessary to calculate a representative value of all of them. This value was calculated using two different methods previously applied to the long-term behaviour of the world reference triad (Toronto triad) and the Arosa triad; applying their procedures to the data from the RBCC-E triad allows the three instruments to be compared. For daily averages, the procedure used for the world reference triad gives the RBCC-E triad a relative standard deviation of σ = 0.41 %, calculated as the mean of the individual values for each Brewer (σ₁₅₇ = 0.362 %, σ₁₈₃ = 0.453 % and σ₁₈₅ = 0.428 %). With the procedure applied to the Arosa triad, the RBCC-E presents a relative standard deviation of about σ = 0.5 %. For monthly averages, the method used for the world reference triad gives a mean relative standard deviation of σ = 0.3 % (σ₁₅₇ = 0.33 %, σ₁₈₃ = 0.34 % and σ₁₈₅ = 0.23 %), whereas the Arosa-triad procedure gives monthly values of σ = 0.5 %. Two ozone data sets are analysed: the first includes all available ozone measurements, while the second includes only the simultaneous measurements of all three instruments. Furthermore, this paper also describes the Langley method used to determine the extraterrestrial constant (ETC) for the RBCC-E triad, the necessary first step toward accurate ozone calculation. Finally, the short-term, or intraday, consistency is studied to identify the effect of the solar zenith angle on the precision of the RBCC-E triad.
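A rough sketch of the daily-averages consistency statistic described above. The file name, column layout, and averaging rules are assumptions for illustration; the actual RBCC-E processing involves air-mass filtering and other criteria not shown here:

```python
import pandas as pd

# Hypothetical long-format table: one row per measurement with columns
# "date", "brewer" (e.g. "157", "183", "185") and "o3" (total ozone, DU).
df = pd.read_csv("rbcce_triad_ozone.csv", parse_dates=["date"])  # hypothetical file

daily = df.groupby(["date", "brewer"])["o3"].mean().unstack("brewer")
daily = daily.dropna()              # keep only days observed by all three Brewers
triad_mean = daily.mean(axis=1)     # daily reference: mean of the triad

# Per-instrument relative deviation from the triad daily mean, in percent
rel_dev = 100 * daily.sub(triad_mean, axis=0).div(triad_mean, axis=0)
print(rel_dev.std())         # one relative standard deviation per Brewer
print(rel_dev.std().mean())  # triad-level figure, cf. the reported sigma = 0.41 %
```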


1947 ◽  
Vol 7 (01) ◽  
pp. 38-41 ◽  
Author(s):  
Wilfred Perks

Consider a continuous probability distribution of a variable x measured from its mean in units of its standard deviation, and suppose that deviations from the mean are taken irrespective of sign; let p_x represent the ordinate applicable to x (this ordinate is, of course, the sum of the original ordinates at +x and −x). We then have unit total probability and a unit second moment for p_x; these relations are written out below.
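A hedged reconstruction of the relations the original (garbled) sentence appears to state, writing f for the original standardized density so that p_x is its fold onto the non-negative half-line:

```latex
% p_x folds the standardized density f about its mean:
\[
  p_x = f(x) + f(-x), \qquad x \ge 0 .
\]
% Because x is measured from the mean in units of the standard deviation,
% the folded ordinates carry unit total probability and unit second moment:
\[
  \int_0^{\infty} p_x \, dx = 1 ,
  \qquad
  \int_0^{\infty} x^{2} p_x \, dx = 1 .
\]
```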


Author(s):  
Mantas Makulavičius ◽  
Henrikas Sivilevičius

Asphalt mixture gradation homogeneity is one of the key factors for proper laying and compaction of road pavement and for its long-term maintenance afterwards. To achieve a good-quality asphalt mixture, the homogeneity of the aggregates used in road pavement must be kept in mind. Accordingly, this paper determines the gradation variation of five granite aggregate fractions (0/2, 2/5, 5/8, 8/11 and 11/16) from one of the largest manufacturing plants in Lithuania. A total of 244 samples were taken from the conveyor belt at the manufacturing plant, and all the data were evaluated by statistical methods, providing histograms with theoretical normal-distribution curves. The results were then compared with each other and with the requirements issued by the Lithuanian road administration authority. Regression analysis was used to determine the dependence of the standard deviation of percent passing on the mean percent passing through the sieves. The obtained findings revealed that the standard deviation of this dependence reaches its maximum at a mean of 50 % passing. Further investigations should cover the variation of other aggregate quality parameters and their homogeneity throughout the different stages of the technological and transportation processes.
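A sketch of the regression step, with invented percent-passing numbers standing in for the 244-sample statistics; only the shape of the fitted dependence matters here:

```python
import numpy as np

# Hypothetical per-sieve statistics: mean percent passing and the
# standard deviation of percent passing across samples (invented values).
mean_passing = np.array([5.0, 15.0, 30.0, 50.0, 70.0, 85.0, 95.0])
sd_passing   = np.array([0.8, 2.1, 3.6, 4.4, 3.5, 2.0, 0.9])

# Fit a quadratic sigma(p) = a*p^2 + b*p + c; the vertex -b/(2a) is the
# mean percent passing at which the scatter of the gradation is largest.
a, b, c = np.polyfit(mean_passing, sd_passing, 2)
p_max = -b / (2 * a)
print(f"standard deviation peaks near {p_max:.1f} % passing")  # ~50 %
```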


Author(s):  
Mark J. DeBonis

One classic example of a binary classifier is one which employs the mean and standard deviation of the data set as a mechanism for classification. Indeed, principal component analysis has played a major role in this effort. In this paper, we propose that one should also include skew in order to make this method of classification somewhat more precise. What is needed is a simple probability distribution function which can easily be fit to a data set; this pdf is then used to create a classifier with improved error rates, comparable to those of other classifiers.
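The abstract does not name the pdf the paper uses; as an illustration of the idea, the sketch below fits a three-parameter skew-normal (location, scale, and a skewness parameter) to each class and classifies by the larger fitted log-likelihood:

```python
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(1)
# Toy one-dimensional training data for two classes (illustrative only)
x0 = skewnorm.rvs(a=4, loc=0.0, scale=1.0, size=500, random_state=rng)
x1 = skewnorm.rvs(a=-3, loc=1.5, scale=1.2, size=500, random_state=rng)

# Fit a skew-normal (location ~ mean, scale ~ sd, a ~ skew) to each class
params0 = skewnorm.fit(x0)
params1 = skewnorm.fit(x1)

def classify(x):
    """Assign each point to the class whose fitted pdf gives the higher log-likelihood."""
    return (skewnorm.logpdf(x, *params1) > skewnorm.logpdf(x, *params0)).astype(int)

print(classify(np.array([-1.0, 0.5, 2.0])))
```

A symmetric two-parameter fit (mean and standard deviation only) misplaces the decision boundary whenever the classes are skewed in opposite directions; the added skew parameter is what closes that gap.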


1991 ◽  
Vol 52 (2) ◽  
pp. 263-269 ◽  
Author(s):  
G. A. J. Fursey ◽  
C. A. Miles ◽  
S. J. Page ◽  
A. V. Fisher

Abstract. Measurements were made of the speed of ultrasound transmission through sites in the hind limbs of 125 pedigree Hereford bulls. Twenty-five of these were measured twice at weekly intervals on three occasions prior to slaughter to assess the short-term repeatability of the measurement and the magnitude of long-term changes. Analyses of variance of the means of the measurements at two sites showed that the residual standard deviation (within animal and occasion) was 0·01 μs/cm. There was a decrease of 0·01 μs/cm in the group mean over the 2-week period and a significant time × animal interaction. This showed that lipid concentration at the measurement sites decreased as the bulls adjusted to their new surroundings following delivery to the Institute's farm. When a separate group of 64 bulls was measured at the farm at which they were being reared, an increase in the group mean of 0·006 μs/cm was recorded over a 30-day period, indicating an increase in lipid concentration. The residual standard deviation for that group was 0·007 μs/cm, similar to that recorded above. The mean of the reciprocal speeds at the two sites, when used in a multiple regression with live mass, yielded a residual standard deviation in predicted proportion of lean in the side of 20·0 g/kg and in total fat proportion of 22·1 g/kg. These corresponded to population standard deviations, adjusted for live mass, of 29·7 and 34·1 g/kg respectively. It was concluded that the measurement of ultrasound speed in the hind limbs of Hereford bulls could be used to predict lean proportion in the carcass. The method does not require subjective interpretation and responds equally to subcutaneous and inter- and intra-muscular fat.
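A sketch of the prediction step, with invented per-animal values standing in for the measurements; units and magnitudes are only indicative:

```python
import numpy as np

# Hypothetical per-animal data: mean reciprocal ultrasound speed over the
# two hind-limb sites (us/cm), live mass (kg), and lean proportion (g/kg).
recip_speed = np.array([6.95, 7.01, 6.88, 7.10, 6.99, 7.05])
live_mass   = np.array([455., 478., 430., 502., 465., 488.])
lean_gkg    = np.array([642., 618., 661., 598., 630., 611.])

# Multiple linear regression with an intercept, as described in the abstract
X = np.column_stack([np.ones_like(recip_speed), recip_speed, live_mass])
coef, *_ = np.linalg.lstsq(X, lean_gkg, rcond=None)
resid = lean_gkg - X @ coef
# Residual SD with n - p degrees of freedom, cf. the reported 20.0 g/kg
print(np.sqrt(resid @ resid / (len(lean_gkg) - X.shape[1])))
```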


2020 ◽  
Author(s):  
Carlos A. Prete ◽  
Lewis Buss ◽  
Amy Dighe ◽  
Victor Bertollo Porto ◽  
Darlan da Silva Candido ◽  
...  

Abstract. Using 65 transmission pairs of SARS-CoV-2 reported to the Brazilian Ministry of Health, we estimate the mean and standard deviation of the serial interval to be 2.97 and 3.29 days, respectively. We also present a model for the serial interval probability distribution using only two parameters.
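The abstract does not say which two-parameter distribution the authors fit; as an illustration, a method-of-moments gamma matched to the reported mean and standard deviation:

```python
from scipy import stats

mean, sd = 2.97, 3.29  # reported serial-interval moments (days)

# Method-of-moments gamma fit (illustrative two-parameter model only; a
# gamma assumes non-negative intervals, which the authors' model need not)
shape = (mean / sd) ** 2
scale = sd ** 2 / mean
si = stats.gamma(a=shape, scale=scale)
print(si.mean(), si.std())   # recovers 2.97, 3.29
print(si.cdf(7.0))           # e.g. P(serial interval <= 7 days)
```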


2021 ◽  
Author(s):  
Wenting Wang ◽  
Shuiqing Yin ◽  
Bofu Yu ◽  
Shaodong Wang

Abstract. The stochastic weather generator CLIGEN can simulate long-term weather sequences as input to WEPP for erosion predictions. Its use, however, has been somewhat restricted by limited observations at high spatial and temporal resolutions. Long-term daily temperature, daily and hourly precipitation data from 2405 stations, and daily solar radiation from 130 stations distributed across mainland China were collected to develop the most critical set of site-specific parameter values for CLIGEN. Universal Kriging (UK) with the auxiliary covariables longitude, latitude, elevation, and mean annual rainfall was used to interpolate parameter values onto a 10 km × 10 km grid, and parameter accuracy was evaluated by leave-one-out cross-validation. The results demonstrated that the Nash-Sutcliffe efficiency coefficients (NSEs) between UK-interpolated and observed parameters were greater than 0.85 for all parameters apart from the standard deviation of solar radiation, the skewness coefficient of daily precipitation, and the cumulative distribution of relative time to peak intensity, which showed relatively lower interpolation accuracy (NSE > 0.66). In addition, daily weather sequences simulated by CLIGEN from UK-interpolated and observed inputs showed consistent statistics and frequency distributions. The mean absolute discrepancy between the two sequences in the average and standard deviation of temperature was less than 0.51 °C. The mean absolute relative discrepancy for the same statistics for solar radiation, precipitation amount, duration, and maximum 30-min intensity was less than 5 %. CLIGEN parameters at the 10 km resolution would meet the minimum WEPP climate requirements throughout mainland China. The dataset is available at http://clicia.bnu.edu.cn/data/cligen.html and http://doi.org/10.12275/bnu.clicia.CLIGEN.CN.gridinput.001 (Wang et al., 2020).
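A minimal sketch of the evaluation metric: the Nash-Sutcliffe efficiency between observed station parameters and the values interpolated for those stations under leave-one-out cross-validation (the numbers below are made up):

```python
import numpy as np

def nse(observed, predicted):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the observed mean."""
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    return 1.0 - np.sum((observed - predicted) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

# Illustrative check: observed station parameters vs. leave-one-out
# interpolated values for the same stations (invented numbers)
obs  = np.array([3.1, 2.8, 4.0, 3.5, 2.2])
pred = np.array([3.0, 2.9, 3.8, 3.6, 2.4])
print(nse(obs, pred))   # the paper reports NSE > 0.85 for most parameters
```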


2018 ◽  
Author(s):  
David M. Hyman ◽  
Andrea Bevilacqua ◽  
Marcus I. Bursik

Abstract. The study of volcanic mass flow hazards in a probabilistic framework centers around systematic experimental numerical modelling of the hazardous phenomenon and the subsequent generation and interpretation of a probabilistic hazard map (PHM). For a given volcanic flow (e.g., lava flow, lahar, pyroclastic flow), the PHM is typically interpreted as the point-wise probability of flow material inundation. In the current work, we present new methods for calculating spatial representations of the mean, standard deviation, median, and modal locations of the hazard's boundary from ensembles of many deterministic runs of a physical model. By formalizing its generation and properties, we show that a PHM may be used to construct these statistical measures of the hazard boundary which have been unrecognized in previous probabilistic hazard analyses. Our formalism shows that a typical PHM for a volcanic mass flow not only gives the point-wise inundation probability, but also represents a set of cumulative distribution functions for the location of the inundation boundary with a corresponding set of probability density functions. These distributions run over curves of steepest ascent on the PHM. Consequently, 2D space curves can be constructed on the map which represent the mean, median and modal locations of the likely inundation boundary. These curves give well-defined answers to the question of the likely boundary location of the area impacted by the hazard. Additionally, methods of calculation for higher moments including the standard deviation are presented which take the form of map regions surrounding the mean boundary location. These measures of central tendency and variance add significant value to spatial probabilistic hazard analyses, giving a new statistical description of the probability distributions underlying PHMs. The theory presented here may be used to construct improved hazard maps, which could prove useful for planning and emergency management purposes. This formalism also allows for application to simplified processes describable by analytic solutions. In that context, the connection between the PHM, its moments, and the underlying parameter variation is explicit, allowing for better source parameter estimation from natural data, yielding insights about natural controls on those parameters.
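A toy sketch of how a PHM arises from an ensemble and how the median boundary falls out of it, using idealized circular flows in place of physical-model runs; geometry and parameters are invented:

```python
import numpy as np
from skimage import measure

# Toy ensemble standing in for deterministic model runs: circular "flows"
# whose radius varies run to run (purely illustrative geometry).
rng = np.random.default_rng(0)
n_runs, ny, nx = 200, 101, 101
yy, xx = np.mgrid[0:ny, 0:nx]
radii = rng.normal(30.0, 5.0, n_runs)
masks = (xx - nx // 2) ** 2 + (yy - ny // 2) ** 2 <= radii[:, None, None] ** 2

# Point-wise inundation probability: the probabilistic hazard map (PHM)
phm = masks.mean(axis=0)

# Along a transect crossing the hazard margin, 1 - PHM behaves as a CDF of
# the boundary location, so the p = 0.5 level set is the median boundary.
median_boundary = measure.find_contours(phm, 0.5)
print(f"{len(median_boundary)} median-boundary curve(s) extracted")
```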


2020 ◽  
Vol 117 (44) ◽  
pp. 27179-27187
Author(s):  
Gerard Salter ◽  
Vaughan R. Voller ◽  
Chris Paola

The flux partitioning in delta networks controls how deltas build land and generate stratigraphy. Here, we study flux-partitioning dynamics in a delta network using a simple numerical model consisting of two orders of bifurcations. Previous work on single bifurcations has shown periodic behavior arising due to the interplay between channel deepening and downstream deposition. We find that coupling between upstream and downstream bifurcations can lead to chaos; despite its simplicity, our model generates surprisingly complex aperiodic yet bounded dynamics. Our model exhibits sensitive dependence on initial conditions, the hallmark signature of chaos, implying long-term unpredictability of delta networks. However, estimates of the predictability horizon suggest substantial room for improvement in delta-network modeling before fundamental limits on predictability are encountered. We also observe periodic windows, implying that a change in forcing (e.g., due to climate change) could cause a delta to switch from predictable to unpredictable or vice versa. We test our model by using it to generate stratigraphy; converting the temporal Lyapunov exponent to vertical distance using the mean sedimentation rate, we observe qualitatively realistic patterns such as upwards fining and scale-dependent compensation statistics, consistent with ancient and experimental systems. We suggest that chaotic behavior may be common in geomorphic systems and that it implies fundamental bounds on their predictability. We conclude that while delta “weather” (precise configuration) is unpredictable in the long-term, delta “climate” (statistical behavior) is predictable.
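The two-order bifurcation model itself is not reproduced here; the sketch below illustrates the generic predictability-horizon calculation from a temporal Lyapunov exponent, using the logistic map as a stand-in chaotic system (tolerance and initial-error values are arbitrary):

```python
import numpy as np

def lyapunov_logistic(r=4.0, n=100_000, x0=0.3):
    """Largest Lyapunov exponent of x -> r x (1 - x) via the mean of log|f'(x)|."""
    x, total = x0, 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        total += np.log(abs(r * (1 - 2 * x)))
    return total / n

lam = lyapunov_logistic()      # ~ ln 2 per step for r = 4
delta0, tol = 1e-6, 1e-1       # initial-condition error and error tolerance
horizon = np.log(tol / delta0) / lam
print(f"lambda = {lam:.3f} per step; predictability horizon ~ {horizon:.0f} steps")
```

Dividing such a horizon by the mean sedimentation rate is the kind of conversion the authors use to translate the temporal Lyapunov exponent into a vertical distance in the stratigraphy.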


2019 ◽  
Vol 19 (7) ◽  
pp. 1347-1363 ◽  
Author(s):  
David M. Hyman ◽  
Andrea Bevilacqua ◽  
Marcus I. Bursik

Abstract. The study of volcanic flow hazards in a probabilistic framework centers around systematic experimental numerical modeling of the hazardous phenomenon and the subsequent generation and interpretation of a probabilistic hazard map (PHM). For a given volcanic flow (e.g., lava flow, lahar, pyroclastic flow, ash cloud), the PHM is typically interpreted as the point-wise probability of inundation by flow material. In the current work, we present new methods for calculating spatial representations of the mean, standard deviation, median, and modal locations of the hazard's boundary from ensembles of many deterministic runs of a physical model. By formalizing its generation and properties, we show that a PHM may be used to construct these statistical measures of the hazard boundary which have been unrecognized in previous probabilistic hazard analyses. Our formalism shows that a typical PHM for a volcanic flow not only gives the point-wise inundation probability, but also represents a set of cumulative distribution functions for the location of the inundation boundary with a corresponding set of probability density functions. These distributions run over curves of steepest probability gradient ascent on the PHM. Consequently, 2-D space curves can be constructed on the map which represent the mean, median, and modal locations of the likely inundation boundary. These curves give well-defined answers to the question of the likely boundary location of the area impacted by the hazard. Additionally, methods of calculation for higher moments including the standard deviation are presented, which take the form of map regions surrounding the mean boundary location. These measures of central tendency and variance add significant value to spatial probabilistic hazard analyses, giving a new statistical description of the probability distributions underlying PHMs. The theory presented here may be used to aid construction of improved hazard maps, which could prove useful for planning and emergency management purposes. This formalism also allows for application to simplified processes describable by analytic solutions. In that context, the connection between the PHM, its moments, and the underlying parameter variation is explicit, allowing for better source parameter estimation from natural data, yielding insights about natural controls on those parameters.

