IRI Performance Models for Flexible Pavements in Two-Lane Roads until First Maintenance and/or Rehabilitation Work

Coatings ◽  
2020 ◽  
Vol 10 (2) ◽  
pp. 97 ◽  
Author(s):  
Heriberto Pérez-Acebo ◽  
Alaitz Linares-Unamunzaga ◽  
Eduardo Rojí ◽  
Hernán Gonzalo-Orden

Pavement performance models play a vital role in any pavement management system. The Regional Government of Biscay (RGB) (Spain) manages a 1200 km road network and conducts pavement data collections, including International Roughness Index (IRI) values. The aim of the paper is to develop an IRI performance model for two-lane roads with flexible pavement until the first maintenance and/or rehabilitation activity is performed. Due to the large amount of available information, a deterministic model was selected. A literature review of deterministic models showed that, apart from age and traffic volume, the pavement structure is a key factor. Therefore, it was decided to analyze only the road stretches whose entire pavement section was known (surface layer + base + subbase). Various variables related to age, traffic volumes and employed materials were introduced as possible factors. The multiple linear regression model with the highest coefficient of determination, and in which all variables were significant, included the real pavement age, the cumulative heavy traffic and the total thickness of the bituminous layers. As the material employed in the surface layer could affect roughness progression, a qualitative variable was introduced to account for the various surface materials. The model improved its accuracy, indicating that the surface layer material is also a factor influencing IRI evolution.
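A model of this form can be sketched as an ordinary least-squares fit. The data below are invented for illustration; the variable names only mirror the factors named in the abstract (age, cumulative heavy traffic, bituminous thickness, surface-material dummy), not the paper's actual dataset or coefficients.

```python
import numpy as np

# Invented illustrative data, not the RGB dataset.
age = np.array([2.0, 4, 6, 8, 10, 12, 3, 5, 7, 9])                # years
traffic = np.array([0.5, 1.1, 1.8, 2.4, 3.0, 3.7, 0.8, 1.4, 2.0, 2.6])
thickness = np.array([18.0, 18, 20, 20, 22, 22, 16, 16, 24, 24])  # cm
surface = np.array([1.0, 1, 0, 0, 1, 1, 0, 0, 1, 0])              # material dummy
iri = np.array([1.2, 1.5, 1.9, 2.3, 2.2, 2.6, 1.6, 2.0, 1.7, 2.4])

# Multiple linear regression by ordinary least squares, with intercept.
X = np.column_stack([np.ones_like(age), age, traffic, thickness, surface])
coef, *_ = np.linalg.lstsq(X, iri, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((iri - pred) ** 2) / np.sum((iri - iri.mean()) ** 2)
print(f"coefficients: {np.round(coef, 3)}, R^2 = {r2:.3f}")
```

The qualitative surface-material factor enters as a 0/1 dummy column, which is the usual way to fold a categorical variable into a linear regression.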

Author(s):  
Heriberto Pérez Acebo ◽  
Hernán Gonzalo-Orden

Reliable pavement prediction models are needed for pavement management systems (PMS), as they are a key component for forecasting future pavement conditions and for prioritizing maintenance, rehabilitation and reconstruction strategies. The International Roughness Index (IRI) is the most widely used parameter worldwide for measuring pavement roughness, and it reasonably reflects the ride comfort perceived by occupants of passenger cars. The Regional Government of Biscay also collects this value on the road network under its control; these surveys have been carried out regularly since the beginning of the 21st century. Several IRI performance models have been proposed by different authors and administrations, varying greatly in their comprehensiveness, their ability to predict performance accurately, and their input data requirements. The aim of this paper is to develop a roughness performance model for Biscay's roads, based on the available IRI data, taking into account heavy traffic volume and pavement age. Local characteristics such as climate conditions and average rainfall are not considered. IRI performance models are suggested for regional two-lane highways with low and medium heavy traffic constructed in the last 20 years in the province of Biscay, with no treatments during their life. They can be applied to flexible pavements, but no logically coherent results were obtained for semi-rigid pavements.
DOI: http://dx.doi.org/10.4995/CIT2016.2016.4108


Author(s):  
Richard Steinberg ◽  
Raytheon Company ◽  
Alice Diggs ◽  
Raytheon Company ◽  
Jade Driggs

Verification and validation (V&V) for human performance models (HPMs) can be likened to building a house with no bricks, since it is difficult to obtain metrics to validate a model when the system is still in development. HPMs are effective for performing trade-offs between human system design factors, including the number of operators needed, the role of automated tasks versus operator tasks, and the member task responsibilities required to operate a system. On a recent government contract, our team used a human performance model to provide additional analysis beyond traditional trade studies. Our team verified the contractually mandated staff size for operating the system. This task demanded that the model have sufficient fidelity to support high-confidence staffing decisions. It required a method for verifying and validating the model and its results to ensure that it accurately reflected the real world. The situation posed a dilemma because there was no actual system from which to gather real data to validate the model. It is a challenge to validate human performance models, since they support design decisions prior to system development. For example, crew models typically inform the design, staffing needs, and the requirements for each operator's user interface prior to development. This paper discusses a successful case study of how our team met the V&V challenges with the US Air Force model accreditation authority and successfully accredited our human performance model with enough fidelity for requirements testing on an Air Force Command and Control program.


2021 ◽  
Vol 20 (5) ◽  
pp. 1-34
Author(s):  
Edward A. Lee

This article is about deterministic models, what they are, why they are useful, and what their limitations are. First, the article emphasizes that determinism is a property of models, not of physical systems. Whether a model is deterministic or not depends on how one defines the inputs and behavior of the model. To define behavior, one has to define an observer. The article compares and contrasts two classes of ways to define an observer, one based on the notion of “state” and another that more flexibly defines the observables. The notion of “state” is shown to be problematic and to lead to nondeterminism that is avoided when the observables are defined differently. The article examines determinism in models of the physical world. In what may surprise many readers, it shows that Newtonian physics admits nondeterminism and that quantum physics may be interpreted as a deterministic model. Moreover, it shows that both relativity and quantum physics undermine the notion of “state” and therefore require more flexible ways of defining observables. Finally, the article reviews results showing that sufficiently rich sets of deterministic models are incomplete. Specifically, nondeterminism is inescapable in any system of models rich enough to encompass Newton’s laws.
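The state-versus-observable distinction can be illustrated with a toy transition system (entirely illustrative, not from the article): the same model appears nondeterministic or deterministic depending on which inputs the observer is allowed to see.

```python
# Toy transition system whose next state depends on two inputs.
def step(state, visible, hidden):
    return (state + visible + 2 * hidden) % 4

# Observer A sees only (state, visible): from state 0 with visible=1,
# two different next states are possible, so A's model is nondeterministic.
next_under_A = {step(0, 1, h) for h in (0, 1)}

# Observer B also treats `hidden` as an observable input: every fully
# observed input now determines a unique next state.
next_under_B = {step(0, 1, 1)}

print(len(next_under_A), len(next_under_B))  # 2 1
```

The model itself is unchanged between the two cases; only the definition of the observables differs, which is the article's central point.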


Author(s):  
Miloš Petković ◽  
Vladan Tubić ◽  
Nemanja Stepanović

Design hourly volume (DHV) represents one of the most significant parameters in the procedures of developing and evaluating road designs. DHV values can be accurately and precisely calculated only on road sections with implemented automatic traffic counters (ATCs), which constantly monitor the traffic volume. Unfortunately, many road sections do not have ATCs, primarily because of the implementation costs. Consequently, for many years, DHV values have been defined on the basis of occasional counting and factors related to traffic flow variability over time. However, it has been determined that this approach has significant limitations and that the predicted values considerably deviate from the actual values. Therefore, the main objective of this paper is to develop a model which will enable DHV prediction on rural roads in cases of insufficient data. The suggested model is based on the correlation between DHVs and the parameters defining the characteristics of traffic flows, that is, the relationship between the traffic volumes on design working days and non-working days, and annual average daily traffic. The results of the conducted research indicate that the application of the proposed model enables the prediction of DHV values with a high level of accuracy and reliability. The coefficient of determination (R2) shows that the model explains more than 98% of the variance in the observed DHV values, while the mean error ranged from 4.86% to 7.84%, depending on the number of hours for which the DHV was predicted.
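A model of the kind described, relating DHV to annual average daily traffic and a working/non-working-day volume ratio, can be sketched as a least-squares fit. The figures below are invented for illustration and do not reproduce the paper's data or coefficients.

```python
import numpy as np

# Invented ATC-section data: annual average daily traffic (veh/day),
# working/non-working-day volume ratio, and observed DHV (veh/h).
aadt = np.array([4200.0, 6800, 9100, 12500, 15300, 18000])
ratio = np.array([1.35, 1.28, 1.22, 1.15, 1.10, 1.05])
dhv = np.array([560.0, 870, 1120, 1480, 1760, 2010])

# Linear model DHV ~ AADT + ratio, fitted by ordinary least squares.
X = np.column_stack([np.ones_like(aadt), aadt, ratio])
coef, *_ = np.linalg.lstsq(X, dhv, rcond=None)
pred = X @ coef

# Mean absolute percentage error of the fitted predictions.
mean_pct_error = np.mean(np.abs(pred - dhv) / dhv) * 100
print(f"mean error = {mean_pct_error:.2f}%")
```

A model fitted on ATC-equipped sections can then be applied to sections without counters, which is the gap the paper addresses.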


2021 ◽  
Author(s):  
Yasir Shoaib

The performance characteristics of a system, such as throughput, resource utilization and response time, can be determined through measurement, simulation modeling and analytic modeling. In this thesis, measurement and analytic modeling approaches are applied to study the performance of an Apache-PHP-PostgreSQL web application. A Layered Queueing Network (LQN) analytic model has been used to represent the system's performance. The measurements obtained from load testing are compared with the model analysis results for model validation. This thesis aims to show that LQN performance models are versatile enough to allow the development of highly granular and easily modifiable models of PHP-based web applications, and furthermore are capable of performance prediction with sufficiently high accuracy. Lastly, the thesis also describes the utilities and methods used in our research work for load testing and for determining service demand parameters, which would aid in shortening the time required to develop and study performance models of similar systems.
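LQN models layer many interacting service stations. As a greatly simplified illustration of the underlying queueing relations (not the thesis's actual layered model), a single tier can be approximated as an M/M/1 station, where utilization and mean response time follow directly from the arrival rate and the per-request service demand.

```python
# Simplified single-tier sketch of the queueing relations behind LQN
# analysis; the traffic and demand numbers below are illustrative.
def mm1_metrics(arrival_rate, service_demand):
    """Return (utilization, mean response time) for an M/M/1 station."""
    rho = arrival_rate * service_demand
    assert rho < 1.0, "station is saturated"
    response = service_demand / (1.0 - rho)  # queueing delay included
    return rho, response

# Illustrative numbers: 40 req/s against a 20 ms service demand.
rho, resp = mm1_metrics(arrival_rate=40.0, service_demand=0.02)
print(f"utilization = {rho:.2f}, mean response = {resp * 1000:.1f} ms")
```

The service demands that parameterize such a model are exactly what the thesis's load-testing utilities are used to measure.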


Mission Performance Models (MPM) are important to the design of modern digital avionic systems because flight deck information is no longer obvious. In large-scale dynamic systems, the necessary responses should correspond directly to the incoming information model. A Mission Performance Model is an abstract representation of the activity clusters necessary to achieve mission success. The three core activity clusters are trajectory management, energy management, and attitude control, and each is covered in detail. Their combined performance characteristics highlight the vehicle's kinematic attributes, which can then be used to anticipate unstable conditions. Six MPM are necessary for the effective design and employment of a modern mission-ready flight deck. We describe MPM and their structure, purpose, and operational application. Performance models have many important uses, including training system definition and design, avionic system design, and safety programs.


2015 ◽  
Vol 785 ◽  
pp. 676-681 ◽  
Author(s):  
Nor Shahida Razali ◽  
Nofri Yenita Dahlan

This paper presents the concept of the International Performance Measurement and Verification Protocol (IPMVP) for determining energy savings at the whole-facility level for an office building in Malaysia. Regression analysis is used to develop a baseline model from a set of baseline data, correlating baseline energy with appropriate independent variables, i.e. Cooling Degree Days (CDD) and Number of Working Days (NWD) in this paper. In determining energy savings, the baseline energy is adjusted to the conditions of the reporting period using the energy cost avoidance approach. Two types of energy saving analyses are presented in the case study: 1) single linear regression on each independent variable, and 2) multiple linear regression on both independent variables. Results show that NWD has a higher coefficient of determination (R2) than CDD, which indicates that NWD is more strongly correlated with energy use in the building than CDD. The findings also show that the R2 for the multiple linear regression model is higher than for the single linear regression models, confirming that more than one factor affects energy use in the building.
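The avoided-energy calculation described above, adjusting the baseline regression to reporting-period conditions, can be sketched as follows; the baseline coefficients and monthly figures are invented for illustration and are not the study's values.

```python
# Illustrative IPMVP-style avoided-energy calculation with invented
# baseline coefficients: E = b0 + b1*CDD + b2*NWD (kWh/month).
b0, b1, b2 = 12000.0, 35.0, 900.0   # kWh, kWh/CDD, kWh/working day

def adjusted_baseline(cdd, nwd):
    """Baseline energy adjusted to reporting-period conditions (kWh)."""
    return b0 + b1 * cdd + b2 * nwd

# Reporting-period conditions and metered use for one month.
cdd, nwd, metered = 210.0, 22.0, 36500.0
savings = adjusted_baseline(cdd, nwd) - metered
print(f"avoided energy = {savings:.0f} kWh")
```

Adjusting the baseline rather than comparing raw consumption is what isolates the saving from weather and occupancy changes between the two periods.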


2016 ◽  
Vol 16 (24) ◽  
pp. 15629-15652 ◽  
Author(s):  
Ioannis Kioutsioukis ◽  
Ulas Im ◽  
Efisio Solazzo ◽  
Roberto Bianconi ◽  
Alba Badia ◽  
...  

Abstract. Simulations from chemical weather models are subject to uncertainties in the input data (e.g. emission inventory, initial and boundary conditions) as well as those intrinsic to the model (e.g. physical parameterization, chemical mechanism). Multi-model ensembles can improve the forecast skill, provided that certain mathematical conditions are fulfilled. In this work, four ensemble methods were applied to two different datasets, and their performance was compared for ozone (O3), nitrogen dioxide (NO2) and particulate matter (PM10). Apart from the unconditional ensemble average, the approach behind the other three methods relies on adding optimum weights to members or constraining the ensemble to those members that meet certain conditions in the time or frequency domain. The two different datasets were created for the first and second phase of the Air Quality Model Evaluation International Initiative (AQMEII). The methods are evaluated against ground level observations collected from the EMEP (European Monitoring and Evaluation Programme) and AirBase databases. The goal of the study is to quantify to what extent we can extract predictable signals from an ensemble with superior skill over the single models and the ensemble mean. Verification statistics show that the deterministic models simulate O3 better than NO2 and PM10, which is linked to the different levels of complexity of the represented processes. The unconditional ensemble mean achieves higher skill compared to each station's best deterministic model at no more than 60 % of the sites, indicating a combination of members with unbalanced skill difference and error dependence for the rest. The promotion of the right amount of accuracy and diversity within the ensemble results in an average additional skill of up to 31 % compared to using the full ensemble in an unconditional way.
The skill improvements were higher for O3 and lower for PM10, associated with the extent of potential changes in the joint distribution of accuracy and diversity in the ensembles. The skill enhancement was superior using the weighting scheme, but the training period required to acquire representative weights was longer compared to the sub-selecting schemes. Further development of the method is discussed in the conclusion.
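The contrast between the unconditional ensemble mean and an optimally weighted combination can be sketched with synthetic data (illustrative only, not the AQMEII datasets). In-sample, a least-squares weighting can never do worse than the plain mean, since the mean is itself one admissible weighting.

```python
import numpy as np

# Synthetic "observations" and three ensemble members with different
# biases and noise levels (values are illustrative).
rng = np.random.default_rng(0)
truth = rng.normal(50.0, 10.0, 200)
members = np.stack([truth + rng.normal(b, s, 200)
                    for b, s in [(3.0, 4.0), (-2.0, 6.0), (0.0, 8.0)]])

mean_fc = members.mean(axis=0)                         # unconditional mean
w, *_ = np.linalg.lstsq(members.T, truth, rcond=None)  # optimum weights
weighted_fc = members.T @ w

def rmse(fc):
    return float(np.sqrt(np.mean((fc - truth) ** 2)))

print(rmse(mean_fc), rmse(weighted_fc))
```

As the abstract notes, the catch is out-of-sample behavior: the weights must be trained on a representative period, which is why the weighting scheme needed longer training than the sub-selecting schemes.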


2018 ◽  
Author(s):  
Ming Yang ◽  
Louis Z. Yang

ABSTRACT
It is unclear what values of relative numerical tolerance should be chosen when simulating a deterministic model of a biochemical reaction, which impairs the modeling effort since the simulation outcomes of a model may depend on the relative numerical tolerance values. In an attempt to provide a guideline for selecting appropriate numerical tolerance values in simulations of in vivo biochemical reactions, reasonable numerical tolerance values were estimated based on the uncertainty principle and assumptions about related cellular parameters. The calculations indicate that relative numerical tolerance values can reasonably be set at or around 10⁻⁴ for concentrations expressed in ng/L. This work also suggests that further reducing relative numerical tolerance values may produce erroneous simulation results.
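The sensitivity of simulation output to the relative tolerance can be illustrated with a minimal step-doubling integrator for a first-order decay reaction; the scheme, rate constant, and times below are illustrative and not taken from the paper.

```python
import math

# Illustrative adaptive integration of dC/dt = -k*C with a relative
# tolerance: one full Euler step is compared against two half steps.
def integrate(k, c0, t_end, rtol):
    t, c, dt = 0.0, c0, 1e-3
    while t_end - t > 1e-9:
        dt = min(dt, t_end - t)
        full = c - dt * k * c                 # one Euler step
        half = c - 0.5 * dt * k * c           # two half steps
        two_half = half - 0.5 * dt * k * half
        err = abs(two_half - full) / abs(two_half)
        if err <= rtol:                       # accept and grow the step
            t, c = t + dt, two_half
            dt *= 1.5
        else:                                 # reject and shrink
            dt *= 0.5
    return c

exact = 2.0 * math.exp(-0.5 * 3.0)            # analytic solution
loose = integrate(k=0.5, c0=2.0, t_end=3.0, rtol=1e-2)
tight = integrate(k=0.5, c0=2.0, t_end=3.0, rtol=1e-4)
print(abs(loose - exact) / exact, abs(tight - exact) / exact)
```

Production ODE solvers (e.g. SciPy's `solve_ivp` with its `rtol` parameter) apply the same accept/reject logic with higher-order schemes; the point here is only that the tolerance directly controls how far the numerical trajectory drifts from the true one.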


2017 ◽  
Author(s):  
Nuno R. Nené ◽  
Alistair S. Dunham ◽  
Christopher J. R. Illingworth

ABSTRACT
A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures that account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the non-deterministic properties of mutation in a finite population. We propose an alternative approach that corrects for this error, which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model.
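The class of deterministic selection model the authors build on can be sketched with the standard haploid selection recursion, in which the variant's odds multiply by (1 + s) each generation; the parameter values below are illustrative only.

```python
# Standard deterministic haploid selection recursion: q' = q(1+s)/(1+qs).
def trajectory(q0, s, generations):
    q, freqs = q0, [q0]
    for _ in range(generations):
        q = q * (1 + s) / (1 + q * s)
        freqs.append(q)
    return freqs

# Illustrative parameters: a rare beneficial variant with s = 0.1.
freqs = trajectory(q0=0.01, s=0.1, generations=100)
print(f"final frequency = {freqs[-1]:.3f}")  # rises toward fixation
```

The failure mode the paper highlights arises when such a smooth trajectory is fitted to a variant whose early dynamics were dominated by the stochastic timing of mutation in a finite population, which the delay-deterministic correction is designed to absorb.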

