simulated time
Recently Published Documents


TOTAL DOCUMENTS: 138 (FIVE YEARS: 21)

H-INDEX: 18 (FIVE YEARS: 2)

2021 ◽  
Vol 10 (3) ◽  
pp. 31-38
Author(s):  
Stefano Menichetti ◽  
Stefano Tessitore

This paper highlights the potential of time series decomposition, applied to transient-regime groundwater flow models, as a water balance management tool. In particular, it presents results obtained by applying statistical analysis to observed time series and to time series derived from the groundwater flow model of the coastal plain of Cecina (Tuscany, Italy), developed in transient regime for the period 2005-2017. The time series of rainfall, river stage and hydraulic heads were first analysed; time series decomposition was then applied to the "accumulated net storage" in order to discern and quantify two meaningful components of the groundwater budget: the regulatory reserve (Wr = 22 Mm3) and the seasonal resource (Wd = 2.5 Mm3). Comparing these values with withdrawal volumes (an average of 6.4 Mm3/y over 2005-2017) highlighted potentially critical balance conditions, especially in periods with repeated negative climatic trends. Operational monitoring and modelling are suggested as subsequent corrective and planning actions for the groundwater resource.
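The decomposition step described in this abstract can be illustrated with a minimal additive decomposition: a centred moving-average trend plus monthly seasonal means of the detrended residual. This is a generic sketch in Python, not the authors' workflow, and the synthetic "hydraulic head" series is invented for illustration:

```python
# Additive decomposition of a monthly series into a trend (centred
# moving average, even period) and a zero-mean seasonal component.
import math

def decompose_additive(x, period=12):
    n = len(x)
    half = period // 2
    trend = [None] * n
    for i in range(half, n - half):
        w = x[i - half:i + half + 1]
        # for an even period, weight the two edge points by 0.5
        trend[i] = (0.5 * w[0] + sum(w[1:-1]) + 0.5 * w[-1]) / period
    detrended = [x[i] - trend[i] for i in range(n) if trend[i] is not None]
    offset = half  # detrended[i] corresponds to time i + offset
    seasonal_means = []
    for s in range(period):
        vals = [detrended[i] for i in range(len(detrended))
                if (i + offset) % period == s]
        seasonal_means.append(sum(vals) / len(vals))
    mean_s = sum(seasonal_means) / period
    seasonal = [m - mean_s for m in seasonal_means]  # zero-mean seasonality
    return trend, seasonal

# Synthetic "hydraulic head": linear trend plus an annual cycle
series = [10 + 0.05 * t + 2 * math.sin(2 * math.pi * t / 12)
          for t in range(60)]
trend, seasonal = decompose_additive(series, period=12)
```

Subtracting trend and seasonality from the observed series leaves the residual, which is the component a budget analysis like the one above would accumulate and interpret.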


Water ◽  
2021 ◽  
Vol 13 (14) ◽  
pp. 1944
Author(s):  
Haitham H. Mahmoud ◽  
Wenyan Wu ◽  
Yonghao Wang

This work develops WDSchain, a MATLAB toolbox that simulates blockchain on water distribution systems (WDS). WDSchain can import data from Excel and from the EPANET water modelling software, and it extends EPANET to enable blockchain simulation of the hydraulic data at any chosen nodes. Using WDSchain strengthens network automation and security in WDS. WDSchain processes time-series data in two simulation modes: (1) static blockchain, which takes a snapshot of one time interval of data from all nodes in the WDS as input and outputs it as chained blocks at a single time, and (2) dynamic blockchain, which takes the full simulated time series of all nodes as input and establishes chained blocks at each simulated time. Five consensus mechanisms are implemented in WDSchain to provide data validation at different security levels: PoW, PoT, PoV, PoA, and PoAuth. Five WDS of different sizes were simulated in WDSchain for performance evaluation. The results show that a trade-off is needed between system complexity and security level for data validation. WDSchain provides a methodology to further explore blockchain-based data validation in WDS. Its limitations, compared to commercial blockchain platforms, are that it does not consider the selection of blockchain nodes or broadcasting delay.
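The two chaining modes can be sketched minimally (this is not the MATLAB toolbox itself, and omits the consensus mechanisms): each block hashes its payload together with the previous block's hash, so altering any stored reading invalidates the rest of the chain.

```python
# Minimal hash-chained blocks over water-network sensor readings.
import hashlib, json

def make_block(payload, prev_hash):
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    return {"payload": payload, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def static_chain(snapshot):
    """One block per node reading, for a single time interval."""
    chain, prev = [], "0" * 64
    for node, value in snapshot.items():
        block = make_block({"node": node, "value": value}, prev)
        chain.append(block)
        prev = block["hash"]
    return chain

def dynamic_chain(series):
    """One block per simulated time step, covering all nodes."""
    chain, prev = [], "0" * 64
    for t, snapshot in enumerate(series):
        block = make_block({"t": t, "readings": snapshot}, prev)
        chain.append(block)
        prev = block["hash"]
    return chain

def verify(chain):
    """Recompute every hash and check the prev-hash linkage."""
    prev = "0" * 64
    for b in chain:
        body = json.dumps({"payload": b["payload"], "prev": b["prev"]},
                          sort_keys=True)
        if b["prev"] != prev or b["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = b["hash"]
    return True
```

Tampering with any reading after the chain is built makes `verify` fail, which is the integrity property the toolbox builds its validation levels on.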


2021 ◽  
pp. 1-21
Author(s):  
David O. Cohen ◽  
Sohaila M. G. Aboutaleb ◽  
Amy J. Wagoner Johnson ◽  
Julián A. Norato

Abstract This work introduces a computational method for designing bone scaffolds for maximum bone growth. A mechanobiological model of bone adaptation is used to compute the bone growth, taking into account the shape of the defect, the applied loading, and the existing density distribution of the bone in which the scaffold has been implanted. Numerical homogenization and a geometry projection technique are used to efficiently obtain surrogates of the effective elastic and diffusive properties of the scaffold as a function of the scaffold design and the bone density. These property surrogates are in turn used to perform bone adaptation simulations of the scaffold-bone system for a sampling of scaffold designs. Surrogates of the bone growth in the scaffold at the end of the simulated time, and of the strain energy of the scaffold at implantation time, are subsequently constructed from these simulations. Using these surrogates, we optimize the design of a scaffold implanted in a rabbit femur to maximize bone growth into the scaffold while ensuring a minimum stiffness at implantation. The results of the optimization demonstrate the effectiveness of the proposed design methodology and provide evidence that designing a scaffold with regard to bone adaptation yields larger bone growth than considering only mechanical criteria.
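The surrogate-based workflow can be sketched generically: sample a few designs, run the expensive simulation (here replaced by a cheap stand-in), fit an inexpensive surrogate, and optimize over the surrogate subject to a stiffness constraint. All functions, the single "porosity" design variable, and the numbers below are illustrative stand-ins, not the paper's model:

```python
# Surrogate-assisted design optimization with a quadratic surrogate
# fitted by least squares (normal equations solved by hand).
def fit_quadratic(xs, ys):
    # least-squares fit of y = c0 + c1*x + c2*x^2
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    c = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        c[r] = (b[r] - sum(A[r][k] * c[k] for k in range(r + 1, 3))) / A[r][r]
    return c

def growth_sim(p):      # stand-in for an expensive bone-adaptation run
    return -4.0 * (p - 0.6) ** 2 + 1.0

def stiffness(p):       # stand-in: stiffness falls as porosity rises
    return 1.0 - p

samples = [0.2 + 0.1 * i for i in range(8)]   # sampled scaffold designs
c0, c1, c2 = fit_quadratic(samples, [growth_sim(p) for p in samples])

# maximize the surrogate subject to a minimum-stiffness constraint
candidates = [0.2 + 0.001 * i for i in range(701)]
feasible = [p for p in candidates if stiffness(p) >= 0.35]
best = max(feasible, key=lambda p: c0 + c1 * p + c2 * p * p)
```

The point of the pattern is that the expensive simulation is only run at the sampled designs; the optimizer then queries only the cheap surrogate, exactly as the abstract describes for the bone-growth and strain-energy surrogates.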


2021 ◽  
Vol 17 (1) ◽  
pp. e1008567
Author(s):  
Michael C. Tackenberg ◽  
Jacob J. Hughey

The chi-square periodogram (CSP), developed over 40 years ago, continues to be one of the most popular methods to estimate the period of circadian (circa 24-h) rhythms. Previous work has indicated the CSP is sometimes less accurate than other methods, but understanding of why and under what conditions remains incomplete. Using simulated rhythmic time-courses, we found that the CSP is prone to underestimating the period in a manner that depends on the true period and the length of the time-course. This underestimation bias is most severe in short time-courses (e.g., 3 days), but is also visible in longer simulated time-courses (e.g., 12 days) and in experimental time-courses of mouse wheel-running and ex vivo bioluminescence. We traced the source of the bias to discontinuities in the periodogram that are related to the number of time-points the CSP uses to calculate the observed variance for a given test period. By revising the calculation to avoid discontinuities, we developed a new version, the greedy CSP, that shows reduced bias and improved accuracy. Nonetheless, even the greedy CSP tended to be less accurate on our simulated time-courses than an alternative method, namely the Lomb-Scargle periodogram. Thus, although our study describes a major improvement to a classic method, it also suggests that users should generally avoid the CSP when estimating the period of biological rhythms.
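The statistic at the heart of the method can be sketched as follows. This is a hedged reimplementation of the classic Sokolove–Bushell form (normalisation conventions vary across sources), not the authors' revised greedy version; the truncation to whole cycles in it is exactly the source of the discontinuities the abstract describes.

```python
# Chi-square periodogram sketch: for each candidate period P (in
# samples), truncate to K = N // P complete cycles, fold into K rows of
# P columns, and compare the spread of the P column means to the
# overall variance. Truncating to K*P points makes the number of
# time-points used jump as P changes, producing the discontinuities.
import math, random

def chi_square_periodogram(x, periods):
    Q = []
    for P in periods:
        K = len(x) // P          # complete cycles only
        n = K * P                # points actually used
        xs = x[:n]
        M = sum(xs) / n
        col_means = [sum(xs[h::P]) / K for h in range(P)]
        num = K * n * sum((m - M) ** 2 for m in col_means)
        den = sum((v - M) ** 2 for v in xs)
        Q.append(num / den)      # ~ chi-square with P-1 df under noise
    return Q

# hourly sampling, 6 simulated days, true period 24 h
random.seed(1)
x = [math.cos(2 * math.pi * t / 24) + random.gauss(0, 0.3)
     for t in range(24 * 6)]
periods = range(20, 29)
Q = chi_square_periodogram(x, periods)
est = list(periods)[max(range(len(Q)), key=Q.__getitem__)]
```

With an hourly grid the test periods are integers, which hides the sub-hour bias the paper quantifies; their analysis evaluates finer period grids, where the discontinuities matter.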


Author(s):  
Mia S Lundkvist ◽  
Hans-Günter Ludwig ◽  
Remo Collet ◽  
Thomas Straus

Abstract The granulation background seen in the power spectrum of a solar-like oscillator poses a serious challenge for extracting precise and detailed information about the stellar oscillations. Using a 3D hydrodynamical simulation of the Sun computed with CO5BOLD, we investigate various background models to infer, using a Bayesian methodology, which one provides the best fit to the background in the simulated power spectrum. We find that the best fit is provided by an expression including the overall power level and two characteristic frequencies, one with an exponent of 2 and one with a free exponent taking on a value around 6. We assess the impact of the 3D hydro-code on this result by repeating the analysis with a simulation from Stagger and find that the main conclusion is unchanged. However, the details of the resulting best fits differ slightly between the two codes, but we explain this difference by studying the effect of the spatial resolution and the duration of the simulation on the fit. Additionally, we look into the impact of adding white noise to the simulated time series as a simple way to mimic a real star. We find that, as long as the noise level is not too low, the results are consistent with the no-noise case.
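The best-fitting background model can be sketched under the assumption of a Harvey-like parameterisation: a constant power level plus two components with characteristic frequencies, one with a fixed exponent of 2 and one with a free exponent found to be around 6. The function and parameter names here are illustrative, not the paper's notation:

```python
# Granulation-background model: flat level plus two Harvey-like terms.
def background(nu, p0, a1, b1, a2, b2, c):
    # p0: white/overall level; (a_i, b_i): amplitude and characteristic
    # frequency of each component; c: the free exponent (~6 in the paper)
    return p0 + a1 / (1 + (nu / b1) ** 2) + a2 / (1 + (nu / b2) ** c)

# behaviour at the two frequency limits, with illustrative parameters
params = dict(p0=0.1, a1=5.0, b1=1.0, a2=3.0, b2=2.0, c=6.0)
low = background(1e-3, **params)    # -> p0 + a1 + a2 (plateau)
high = background(1e3, **params)    # -> p0 (background floor)
```

At frequencies well below both characteristic frequencies the model plateaus at the summed component powers, and well above them it falls to the flat level, which is the qualitative shape a Bayesian fit to the simulated power spectrum discriminates between.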


2020 ◽  
Vol 4 ◽  
pp. 27
Author(s):  
Daniel M. Weinberger ◽  
Joshua L. Warren

When evaluating the effects of vaccination programs, it is common to estimate changes in rates of disease before and after vaccine introduction. There are a number of related approaches that attempt to adjust for trends unrelated to the vaccine and to detect changes that coincide with introduction. However, characteristics of the data can influence the ability to estimate such a change. These include, but are not limited to, the number of years of available data prior to vaccine introduction, the expected strength of the effect of the intervention, the strength of underlying secular trends, and the amount of unexplained variability in the data. Sources of unexplained variability include model misspecification, epidemics due to unidentified pathogens, and changes in ascertainment or coding practice, among others. In this study, we present a simple simulation framework for estimating the power to detect a decline and the precision of these estimates. We use real-world data from a pre-vaccine period to generate simulated time series where the vaccine effect is specified a priori. We present an interactive web-based tool to implement this approach. We also demonstrate this approach using observed data on pneumonia hospitalization from the states of Brazil, for a period prior to the introduction of pneumococcal vaccines, to generate the simulated time series. We relate the power of the hypothesis tests to the number of cases per year and the amount of unexplained variability in the data, and demonstrate how fewer years of data influence the results.
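The simulation idea can be sketched in a stripped-down form (this is not the authors' tool): counts are drawn from a model of the pre-vaccine baseline, a known rate ratio is applied after "introduction", and power is the fraction of simulations in which a simple two-period rate test detects the decline. The baseline level, years of data, and effect size below are illustrative.

```python
# Power simulation for detecting a post-introduction decline in counts.
import math, random

def rpois(lam, rng):
    # normal approximation to the Poisson, adequate for the
    # moderate-to-large annual counts used here
    return max(0, round(rng.gauss(lam, math.sqrt(lam))))

def simulate_power(baseline, pre_years, post_years, rate_ratio,
                   n_sim=500, seed=42):
    rng = random.Random(seed)
    z_crit = 1.645                       # one-sided 5% level
    detected = 0
    for _ in range(n_sim):
        a = sum(rpois(baseline, rng) for _ in range(pre_years))
        b = sum(rpois(baseline * rate_ratio, rng) for _ in range(post_years))
        # Wald test on the log rate ratio of the two period totals
        log_rr = math.log(b / post_years) - math.log(a / pre_years)
        se = math.sqrt(1 / a + 1 / b)
        if log_rr / se < -z_crit:
            detected += 1
    return detected / n_sim

power_large = simulate_power(baseline=2000, pre_years=5, post_years=3,
                             rate_ratio=0.8)
power_small = simulate_power(baseline=50, pre_years=5, post_years=3,
                             rate_ratio=0.8)
```

Comparing the two calls reproduces the qualitative point of the abstract: the same 20% decline is detected almost always with large annual counts, but substantially less often when counts are small.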




2020 ◽  
Vol 19 (04) ◽  
pp. 2050038
Author(s):  
Keqiang Dong ◽  
Xiaofang Zhang

The fractional cumulative residual entropy is not only a powerful tool for the analysis of complex systems, but also a promising way to analyze time series. In this paper, we present an approach to measuring the uncertainty of non-stationary time series, named higher-order multiscale fractional cumulative residual entropy. We describe how the fractional cumulative residual entropy may be calculated based on second-order, third-order, and fourth-order statistical moments combined with the multiscale method. The implementation of higher-order multiscale fractional cumulative residual entropy is illustrated with simulated time series generated from the uniform distribution on [0, 1]. Finally, we present applications of higher-order multiscale fractional cumulative residual entropy to logistic map time series and stock market time series.
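The base quantity can be sketched empirically. This implements only the plain (single-scale, non-higher-order) estimator, using one common fractional generalisation in which the classical −ln S(x) weight is replaced by (−ln S(x))^q; it is not the paper's higher-order multiscale estimator, and the exact fractional definition used there may differ.

```python
# Empirical (fractional) cumulative residual entropy from the
# empirical survival function S(x): integrate S(x) * (-ln S(x))^q
# between consecutive order statistics, where S is piecewise constant.
import math

def fractional_cre(sample, q=1.0):
    xs = sorted(sample)
    n = len(xs)
    total = 0.0
    for i in range(n - 1):
        s = 1 - (i + 1) / n          # S(x) on [xs[i], xs[i+1])
        if 0 < s < 1:
            total += s * (-math.log(s)) ** q * (xs[i + 1] - xs[i])
    return total

# uniform grid on [0, 1], as in the paper's illustration
sample = [i / 1000 for i in range(1001)]
cre1 = fractional_cre(sample, q=1.0)   # classical CRE; exact value is 1/4
```

For the uniform distribution on [0, 1] the classical cumulative residual entropy is exactly 1/4, so the empirical estimate provides a quick sanity check of the implementation.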




2020 ◽  
Vol 8 (1) ◽  
pp. 120-127
Author(s):  
Fedir Zhuravka ◽  
Hanna Filatova ◽  
John O. Aiyedogbon

The paper explores theoretical and practical aspects of forecasting Ukraine's government debt. A visual analysis of changes in the amount of government debt was conducted, leading to the conclusion that the country's debt crisis is deepening. The autoregressive integrated moving average (ARIMA) model is considered as the basic forecasting model, and the model's fit and diagnostics are assessed. The EViews software package illustrates the procedure for forecasting Ukrainian government debt with the ARIMA model: the series was tested for stationarity, the monthly government debt time series was made stationary through a number of transformations, the model parameters were determined, and the most suitable ARIMA specification was then chosen. Based on the simulated time series, it is concluded that ARIMA tools can be used to predict government debt values.
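The workflow above can be sketched with a hand-rolled ARIMA(1,1,0)-style forecast: difference once to achieve stationarity, fit an AR(1) coefficient to the differences by least squares, then forecast and integrate back to the level of the series. Real work would use a full ARIMA implementation (e.g. EViews as in the paper, or statsmodels) with proper diagnostics; the toy "debt" series below is invented.

```python
# Minimal ARIMA(1,1,0)-style forecast in pure Python.
def arima_110_forecast(y, steps):
    d = [y[i] - y[i - 1] for i in range(1, len(y))]   # first difference
    mu = sum(d) / len(d)                               # drift
    dc = [v - mu for v in d]
    num = sum(dc[i] * dc[i - 1] for i in range(1, len(dc)))
    den = sum(v * v for v in dc[:-1])
    phi = num / den                                    # AR(1) coefficient
    level, delta = y[-1], dc[-1]
    out = []
    for _ in range(steps):
        delta = phi * delta                            # AR(1) on differences
        level += mu + delta                            # integrate back
        out.append(level)
    return out

# toy "debt" series: steady upward drift plus alternation
debt = [100 + 5 * t + (-1) ** t for t in range(25)]
fc = arima_110_forecast(debt, steps=3)
```

Differencing removes the trend (the integration step of ARIMA), and the AR term captures the remaining serial dependence in the differences, which is the same logic the EViews procedure follows at full scale.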

