Improving Precision and Reducing Runtime of Microscopic Traffic Simulators through Stratified Sampling

2013 ◽  
Vol 2013 ◽  
pp. 1-11
Author(s):  
Khewal Bhupendra Kesur

This paper examines the application of Latin Hypercube Sampling (LHS) and Antithetic Variables (AVs) to reduce the variance of estimated performance measures from microscopic traffic simulators. LHS and AV allow for a more representative coverage of input probability distributions through stratification, reducing the standard error of simulation outputs. Two methods of implementation are examined, one where stratification is applied to headways and routing decisions of individual vehicles and another where vehicle counts and entry times are more evenly sampled. The proposed methods have wider applicability in general queuing systems. LHS is found to outperform AV, and reductions of up to 71% in the standard error of estimates of traffic network performance relative to independent sampling are obtained. LHS allows for a reduction in the execution time of computationally expensive microscopic traffic simulators as fewer simulations are required to achieve a fixed level of precision with reductions of up to 84% in computing time noted on the test cases considered. The benefits of LHS are amplified for more congested networks and as the required level of precision increases.
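The two variance-reduction schemes can be sketched in miniature. The following illustrative Python (not the paper's simulator) estimates a mean under independent sampling, LHS, and antithetic variables, using a cheap stand-in function in place of an expensive simulator:

```python
import random

def lhs_uniform(n, rng):
    """Latin hypercube sample of size n on [0, 1): one point per stratum, shuffled."""
    points = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(points)
    return points

def estimate(samples, g):
    """Plain Monte Carlo estimate of E[g(U)] from a list of samples."""
    return sum(g(u) for u in samples) / len(samples)

rng = random.Random(42)
n = 1000
g = lambda u: u * u  # stand-in for an expensive simulator; E[g(U)] = 1/3

iid = [rng.random() for _ in range(n)]          # independent sampling
lhs = lhs_uniform(n, rng)                       # stratified (LHS)
av = iid[: n // 2] + [1.0 - u for u in iid[: n // 2]]  # antithetic pairs

print(estimate(iid, g), estimate(lhs, g), estimate(av, g))
```

All three estimators target the same mean, but the LHS estimate is pinned far more tightly to it for the same number of function evaluations, which is the mechanism behind the reported standard-error and runtime reductions.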

Author(s):  
Anna Louise D. Latour ◽  
Behrouz Babaki ◽  
Siegfried Nijssen

A number of data mining problems on probabilistic networks can be modeled as Stochastic Constraint Optimization and Satisfaction Problems, i.e., problems that involve objectives or constraints with a stochastic component. Earlier methods for solving these problems used Ordered Binary Decision Diagrams (OBDDs) to represent constraints on probability distributions; these were decomposed into sets of smaller constraints and solved by Constraint Programming (CP) or Mixed Integer Programming (MIP) solvers. For the specific case of monotonic distributions, we propose an alternative method: a new propagator for a global OBDD-based constraint. We show that this propagator is (sub-)linear in the size of the OBDD and maintains domain consistency. We experimentally evaluate the effectiveness of this global constraint in comparison to existing decomposition-based approaches, and show how this propagator can be used in combination with another data-mining-specific constraint present in CP systems. As test cases we use problems from the data mining literature.
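The key property exploited here is that, once a constraint is compiled into an OBDD, the probability of the represented formula under independent variable probabilities can be computed in a single pass that is linear in the OBDD size. A minimal illustrative sketch follows; the node encoding and names are assumptions for illustration, not the paper's data structures:

```python
# Hypothetical OBDD node table: id -> (variable, low_child, high_child);
# terminals are the booleans False and True.
# Encoded function: f = x0 OR (x1 AND x2), with variable order x0 < x1 < x2.
obdd = {
    "n2": ("x2", False, True),
    "n1": ("x1", False, "n2"),
    "n0": ("x0", "n1", True),
}
root = "n0"
prob = {"x0": 0.5, "x1": 0.5, "x2": 0.5}  # independent variable probabilities

def prob_true(node, memo=None):
    """Pr[f = 1] in a single memoized bottom-up pass: linear in the OBDD size."""
    if memo is None:
        memo = {}
    if node is True:
        return 1.0
    if node is False:
        return 0.0
    if node not in memo:
        var, lo, hi = obdd[node]
        p = prob[var]
        memo[node] = (1 - p) * prob_true(lo, memo) + p * prob_true(hi, memo)
    return memo[node]

print(prob_true(root))  # Pr[x0 or (x1 and x2)] = 1 - 0.5 * 0.75 = 0.625
```

A propagator for a global constraint such as Pr[f = 1] >= theta can reuse these per-node values to prune variable domains; this sketch shows only the linear-time evaluation that makes that feasible.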


2020 ◽  
Vol 31 (8) ◽  
pp. 1385-1402
Author(s):  
Sampath Kumar ◽  
M Sushama

This paper discusses an energy management system–based demand response scheduling strategy for a distribution system. The proposed strategy includes customer payment minimization and network loss minimization as responsive load scheduling objectives through a centralized approach. Two optimization strategies, based respectively on payment minimization and network loss sensitivity, are discussed. The proposed scheduling strategy can thus effectively resolve the optimality issue between different objectives of distribution system scheduling under demand response penetration. The demand response scheduling strategies are simulated on the standard IEEE 37-bus distribution test system through different scheduling and optimization scenarios. The simulation results are presented, discussed, and compared with base test cases (without demand response penetration, and with demand response penetration but without optimization strategies) to demonstrate the effectiveness of network loss sensitivity consideration and of the optimization strategies in distribution system scheduling. In addition, a sensitivity analysis is performed: the variation of distribution network performance is analyzed for various test cases and scenarios at different penetration levels.


2015 ◽  
Vol 15 (01) ◽  
pp. 1450034 ◽  
Author(s):  
Xin-Dang He ◽  
Wen-Xuan Gou ◽  
Yong-Shou Liu ◽  
Zong-Zhan Gao

With the convex model approach, only the bounds of the uncertain variables are required rather than their precise probability distributions, making it possible to conduct reliability analysis for many complex engineering problems with limited information. This paper develops a novel nonprobabilistic reliability solution method for structures with interval uncertainty variables. In order to explore the entire domain represented by the interval variables, an enhanced optimal Latin hypercube sampling (EOLHS) is used to reduce the computational effort considerably. Through the proposed method, the safety degree of a structure with convex model uncertainty can be quantitatively evaluated. More importantly, the method can handle general problems with nonlinear, black-box performance functions. Building on the suggested reliability method, a convex-model-based system reliability method is also formulated. Three numerical examples are investigated to demonstrate the efficiency and accuracy of the method.
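As an illustration of sampling over interval variables, the sketch below runs a plain LHS over the box defined by the bounds (not the paper's enhanced optimal LHS) and estimates a safety degree as the fraction of the interval domain on which a black-box performance function g stays positive. The performance function is a toy assumption:

```python
import random

def lhs_box(n, bounds, rng):
    """LHS over a box of interval variables: stratify each dimension, pair columns randomly."""
    cols = []
    for lo, hi in bounds:
        col = [lo + (hi - lo) * (i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))

def safety_degree(g, bounds, n=10_000, rng=None):
    """Fraction of the interval domain on which the performance function is safe (g > 0)."""
    rng = rng or random.Random(0)
    samples = lhs_box(n, bounds, rng)
    return sum(1 for x in samples if g(*x) > 0) / n

# Toy performance function with two interval variables; safe region: x^2 + y < 4.
g = lambda x, y: 4.0 - x * x - y
print(safety_degree(g, [(-2.0, 2.0), (0.0, 3.0)]))
```

Only the interval bounds enter the computation, which is the point of the convex-model setting: no probability distribution over x or y is assumed.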


2012 ◽  
Vol 15 (3) ◽  
pp. 751-762 ◽  
Author(s):  
Jeanne-Rose René ◽  
Henrik Madsen ◽  
Ole Mark

The phenomenon of urban flooding due to rainfall exceeding the design capacity of drainage systems is a global problem and can have significant economic and social consequences. The complex nature of quantitative precipitation forecasts (QPFs) from numerical weather prediction (NWP) models has created a need to model and manage their uncertainty. This paper presents a probabilistic approach for modelling the uncertainty of single-valued QPFs at different forecast lead times. Combining uncertainty models, in the form of probability distributions of rainfall forecasts, with a sewer model is an important advancement in real-time forecasting at the urban scale. The methodological approach involves a retrospective comparison between historical forecasted rainfall from an NWP model and observed rainfall from rain gauges, from which conditional probability distributions of rainfall forecasts are derived. Two sampling methods, a direct rainfall quantile approach and a Latin hypercube sampling-based method, were used to determine the uncertainty in forecasted variables (water level, volume) for a test urban area, the city of Aarhus. The results show the potential of probabilistic rainfall forecasts and their subsequent use in urban drainage forecasting for estimating prediction uncertainty.
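The retrospective step can be sketched as follows: pair archived forecasts with rain-gauge observations, bin the pairs by forecast magnitude, and take empirical quantiles of the observed values in each bin as the conditional distribution. The data below are synthetic and the fixed binning is a simplification, not the paper's method:

```python
import random

def conditional_quantiles(pairs, bin_edges, qs):
    """Empirical quantiles of observed rainfall, conditioned on forecast bins."""
    bins = {i: [] for i in range(len(bin_edges) - 1)}
    for fc, obs in pairs:
        for i in range(len(bin_edges) - 1):
            if bin_edges[i] <= fc < bin_edges[i + 1]:
                bins[i].append(obs)
                break
    out = {}
    for i, vals in bins.items():
        vals.sort()
        out[i] = ([vals[min(int(q * len(vals)), len(vals) - 1)] for q in qs]
                  if vals else None)
    return out

# Synthetic forecast/observation archive: observed = forecast * multiplicative error.
rng = random.Random(1)
pairs = [(fc, fc * rng.uniform(0.5, 1.5))
         for fc in (rng.uniform(0.0, 20.0) for _ in range(5000))]
quants = conditional_quantiles(pairs, [0.0, 5.0, 10.0, 20.0], [0.1, 0.5, 0.9])
print(quants)
```

Sampling from these per-bin quantile curves (directly, or via LHS) then yields rainfall ensembles to drive the sewer model at each forecast lead time.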


2021 ◽  
Vol 87 (1) ◽  
Author(s):  
Nicole Beisiegel ◽  
Cristóbal E. Castro ◽  
Jörn Behrens

Non-uniform, dynamically adaptive meshes are a useful tool for reducing the computational complexity of geophysical simulations that exhibit strongly localised features, as is the case for tsunami, hurricane, or typhoon prediction. Using the example of a shallow water solver, this study explores a set of metrics for distinguishing the performance of numerical methods on adaptively refined versus uniform meshes, independent of computational architecture or implementation. These metrics allow us to quantify how a numerical simulation benefits from adaptive mesh refinement. We focus on adaptive triangular meshes that are non-uniform but structured. Refinement is controlled by physics-based indicators that capture relevant physical processes and determine the areas of mesh refinement and coarsening. The proposed performance metrics take into account characteristics of numerical simulations such as numerical errors, spatial resolution, and computing time. Using a number of test cases, we demonstrate that correlating different quantities offers insight into computational overhead, the distribution of numerical error across mesh resolutions, and the evolution of numerical error and run-time per degree of freedom.
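A physics-based refinement indicator of the kind described can be sketched as a simple jump detector: cells where a flow quantity changes sharply between neighbours are flagged for refinement, smooth cells for coarsening. This toy 1D example is illustrative only; the study uses triangular meshes and its own indicators:

```python
def refinement_flags(values, refine_tol, coarsen_tol):
    """Gradient-style indicator: flag a cell for refinement where the jump to a
    neighbour is large, for coarsening where it is small, otherwise keep it."""
    flags = []
    for i, v in enumerate(values):
        left = values[i - 1] if i > 0 else v
        right = values[i + 1] if i + 1 < len(values) else v
        jump = max(abs(v - left), abs(right - v))
        if jump > refine_tol:
            flags.append("refine")
        elif jump < coarsen_tol:
            flags.append("coarsen")
        else:
            flags.append("keep")
    return flags

# Toy water-height field with a localised wave front.
h = [1.0, 1.0, 1.0, 1.02, 1.6, 2.1, 1.05, 1.0, 1.0]
print(refinement_flags(h, refine_tol=0.2, coarsen_tol=0.05))
```

Only the cells around the front are refined, which is exactly why the metrics above track error and run-time per degree of freedom rather than per uniform cell count.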

