Probabilistic Modelling for Incorporating Uncertainty in Least Cost Path Results: a Postdictive Roman Road Case Study

Author(s):  
Joseph Lewis

Abstract The movement of past peoples in the landscape has been studied extensively through the use of least cost path (LCP) analysis. Although methodological issues of applying LCP analysis in archaeology have frequently been discussed, the effect of DEM error on LCP results has not been fully assessed. As a result, the reliability of the LCP result is undermined, jeopardising how confidently the method can be used to model past movement. To strengthen the reliability of LCP results, this research proposes the use of Monte Carlo simulation as a method for incorporating and propagating the effects of error on LCP results. Focusing on vertical error, random error fields are calculated and incorporated into the documented and reproducible LCP modelling process using the R package leastcostpath. By graphically communicating the impact of vertical error using probabilistic LCPs, uncertainty in the results can be taken into account when interpreting LCPs. The method is applied to a Roman road case study, finding that the incorporation of vertical error results in the identification of multiple ‘least cost’ routes within the landscape. Furthermore, the deviation between the Roman road and the probabilistic LCP suggests that the location of the Roman road was influenced by factors other than minimising energy expenditure. This research finds that the probabilistic LCP derived using Monte Carlo simulation is a viable method for graphically communicating the uncertainty caused by error in the input data used in the LCP modelling process. Therefore, it is recommended that probabilistic LCPs become the default approach when modelling movement using input data that contains errors.
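
The workflow described above can be illustrated with a small, self-contained sketch. The Python example below is an approximation of the idea rather than the paper's R/leastcostpath implementation: it perturbs a synthetic DEM with random vertical error, recomputes a slope-based least cost path for each realisation, and accumulates the paths into a per-cell probability surface. The terrain, RMSE, cost function and route endpoints are illustrative assumptions.

```python
# Minimal Monte Carlo propagation of DEM vertical error into a probabilistic LCP.
import heapq
import numpy as np

def least_cost_path(dem, start, goal):
    """8-connected Dijkstra; step cost is the mean 'slope penalty' of the two
    cells (a simple stand-in for Tobler-style cost functions)."""
    rows, cols = dem.shape
    gy, gx = np.gradient(dem)
    cell_cost = 1.0 + np.hypot(gx, gy)          # flatter cells are cheaper
    dist = np.full(dem.shape, np.inf)
    prev = {}
    dist[start] = 0.0
    pq = [(0.0, start)]
    moves = [(-1,-1),(-1,0),(-1,1),(0,-1),(0,1),(1,-1),(1,0),(1,1)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in moves:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = np.hypot(dr, dc) * 0.5 * (cell_cost[r, c] + cell_cost[nr, nc])
                nd = d + step
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # walk back from goal to start
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path

rng = np.random.default_rng(0)
n = 60
x, y = np.meshgrid(np.linspace(0, 4, n), np.linspace(0, 4, n))
dem = 50 * np.sin(x) * np.cos(y) + 10 * x            # synthetic terrain (assumption)
rmse = 3.0                                           # assumed vertical RMSE in metres
start, goal = (5, 5), (n - 6, n - 6)

counts = np.zeros_like(dem)
n_sims = 200
for _ in range(n_sims):
    # uncorrelated Gaussian error field for brevity; a fuller analysis would use
    # a spatially autocorrelated error field as the paper describes
    noisy_dem = dem + rng.normal(0.0, rmse, size=dem.shape)
    for cell in least_cost_path(noisy_dem, start, goal):
        counts[cell] += 1

probability_surface = counts / n_sims                # probabilistic LCP
print("cells crossed by more than half of the simulated LCPs:",
      int((probability_surface > 0.5).sum()))
```

Increasing the number of simulations sharpens the probability surface at the cost of additional runtime; the surface can then be mapped alongside the deterministic LCP to show where the route is robust to vertical error.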

2020 ◽  
Author(s):  
Joseph Lewis

The movement of past peoples in the landscape has been studied extensively through the use of Least Cost Path (LCP) analysis. Although methodological issues of applying LCP analysis in archaeology have frequently been discussed, the effect of vertical error in the DEM on LCP results has not been fully assessed. This research proposes the use of Monte Carlo simulation as a method for incorporating and propagating the effects of vertical error on LCP results. Random error fields representing the vertical error of the DEM are calculated and incorporated into the documented and reproducible LCP modelling process using the R package leastcostpath. By incorporating vertical error into the LCP modelling process, the accuracy of the LCP results can be understood probabilistically, with the likelihood of obtaining an LCP result quantified. Furthermore, the effect of incorporating vertical error on the LCP results can be expressed through probabilistic LCPs, allowing for a graphical representation of the uncertainty in the LCP calculation, as well as identification of the most probable location of the ‘true’ least cost path. This probabilistic approach is applied to a Roman road case study, finding that the accuracy of the LCP from south-to-north without incorporating vertical error is not representative of the LCP population with vertical error accounted for. In contrast, the accuracy of the LCP without incorporating vertical error from north-to-south is representative of the LCP population. The use of probabilistic LCPs suggests that the location of the Roman road in the southern section of the study area was selected to minimise the time taken to move up and down slope, irrespective of the direction of movement. However, the identification of two corridors with similar likelihoods of containing the ‘true’ location of the LCP in the northern section when modelling movement south-to-north suggests that the input data and parameters used in the LCP analysis are unable to discern which corridor contains the most probable ‘true’ location of the LCP. Therefore, this research suggests that different input data and parameters be used and tested.
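
One ingredient of the workflow above is the generation of random error fields that respect the spatial autocorrelation of DEM error. A minimal sketch of one common way to do this (an assumption, not necessarily the paper's exact procedure) is to smooth white noise with a Gaussian kernel and rescale it to the DEM's reported vertical RMSE; the RMSE, grid size and correlation length below are placeholders.

```python
# Spatially autocorrelated vertical-error fields via smoothed, rescaled white noise.
import numpy as np
from scipy.ndimage import gaussian_filter

def autocorrelated_error_field(shape, rmse, correlation_cells, rng):
    """White noise smoothed to the given correlation length, rescaled to `rmse`."""
    noise = rng.normal(0.0, 1.0, size=shape)
    field = gaussian_filter(noise, sigma=correlation_cells)
    field *= rmse / field.std()          # enforce the target vertical RMSE
    return field

rng = np.random.default_rng(42)
shape = (60, 60)
rmse = 3.0                # assumed DEM vertical RMSE (metres)
corr = 5                  # assumed autocorrelation length in cells

realisations = [autocorrelated_error_field(shape, rmse, corr, rng) for _ in range(100)]
print("mean per-realisation std (should be ~RMSE):",
      round(float(np.mean([f.std() for f in realisations])), 2))
```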


Author(s):  
Yi Yu ◽  
Kristian Norland

Abstract Subsea 7 is currently planning for the installation of PiP flowlines in the Norwegian Sea. A case study has been performed for an 8 × 12 in PiP to be installed in water depths between 320 m and 420 m. Fishing activities are frequent in this area; therefore, the integrity of the pipeline in case of trawl pull-over must be checked. It is found that pipelines with residual curvatures can behave very differently from pipelines without residual curvatures when they are pulled over by trawl gear. However, the effect of residual curvature on the pull-over resistance capacity of rigid pipelines is not addressed in DNVGL-RP-F111 [1]. Therefore, an optimised methodology involving FE analyses and Monte Carlo simulation has been used in this project to check the integrity of the pipe-in-pipe flowline for the trawl pull-over load case. This paper focuses on the FE analyses of the pipe-in-pipe flowline pulled over by trawl gear; the related Monte Carlo simulation has been discussed elsewhere [2]. To understand in detail the behaviour of the pipeline under trawl pull-over loading, the pipeline was modelled using a combination of beam, shell and brick (solid) elements. The advantage of this model was demonstrated by comparing its output with corresponding output from a beam-element model. The effects of several result-sensitive parameters were studied, including centralizer location, pressure, trawl contact area and wall thickness. Special attention was paid to these parameters because their effects cannot be captured with conventional beam elements. Finally, the impact of residual curvatures on the trawl pull-over behaviour was studied. It was found that the pipeline pull-over resistance capacity is sensitive to residual curvature direction and contact location, but not to residual curvature spacing and shape. Given these advantages, this analysis methodology is considered a good option for pipeline trawl pull-over analysis, especially for complex pipeline configurations.
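
The Monte Carlo side of such an integrity check can be sketched end-to-end if a simple response surface stands in for the detailed FE model. Everything below is a hypothetical illustration, not results or distributions from the study: the load and geometry distributions, the strain surrogate and the strain limit are all assumptions.

```python
# Illustrative probabilistic integrity check with a hypothetical FE surrogate.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# assumed input distributions for a trawl pull-over check
pullover_load_kN = rng.lognormal(mean=np.log(200.0), sigma=0.25, size=n)
wall_thickness_mm = rng.normal(20.0, 0.5, size=n)
contact_offset_m = rng.uniform(0.0, 2.0, size=n)   # distance from a centralizer

def peak_strain_surrogate(load, t, offset):
    """Hypothetical stand-in for an FE response surface: strain grows with load
    and with distance from the nearest centralizer, and falls with wall thickness."""
    return 1.0e-4 * load * (1.0 + 0.3 * offset) / t

strain = peak_strain_surrogate(pullover_load_kN, wall_thickness_mm, contact_offset_m)
allowable = 0.003                                   # assumed strain limit
p_exceed = float(np.mean(strain > allowable))
print(f"estimated probability of exceeding the strain limit: {p_exceed:.4f}")
```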


Author(s):  
Ignacio Sepulveda ◽  
Jennifer S. Haase ◽  
Philip L.-F. Liu ◽  
Mircea Grigoriu ◽  
Brook Tozer ◽  
...  

We describe the uncertainties of altimetry-predicted bathymetry models and then quantify the impact of this uncertainty on tsunami hazard assessments. The study consists of three stages. First, we study the statistics of the errors of altimetry-predicted bathymetry models. Second, we employ these statistics to propose a random field model describing the errors at arbitrary locations. Third, we use bathymetry samples to conduct a Monte Carlo simulation and describe the uncertainty in the tsunami response. We found that bathymetry uncertainties have a greater impact in shallow areas. We also noted that tsunami leading waves are less affected by these uncertainties.
Recorded Presentation from the vICCE (YouTube Link): https://youtu.be/zzL_XWWAQ7o
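
The mechanism behind the shallow-water sensitivity can be illustrated with a toy Monte Carlo experiment. The sketch below is not the study's numerical tsunami model: it perturbs a 1-D bathymetry profile with random relative errors and propagates an offshore amplitude shoreward with Green's law (amplitude proportional to depth to the power of -1/4). The profile, error magnitude and offshore amplitude are assumptions.

```python
# Toy Monte Carlo: bathymetry error propagated through Green's law shoaling.
import numpy as np

rng = np.random.default_rng(7)
depth = np.linspace(4000.0, 10.0, 400)      # offshore -> nearshore depth profile (m)
a_offshore = 0.5                            # assumed offshore tsunami amplitude (m)
error_frac = 0.05                           # assumed 5% relative bathymetry error

amplitudes = []
for _ in range(2000):
    noisy_depth = depth * (1.0 + rng.normal(0.0, error_frac, size=depth.shape))
    noisy_depth = np.clip(noisy_depth, 1.0, None)
    # Green's law shoaling referenced to the (known) offshore depth
    amplitudes.append(a_offshore * (depth[0] / noisy_depth) ** 0.25)

amplitudes = np.asarray(amplitudes)
spread = amplitudes.std(axis=0)
print("amplitude std offshore vs nearshore (m):",
      round(float(spread[0]), 4), round(float(spread[-1]), 4))
```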


2021 ◽  
Vol 275 ◽  
pp. 01005
Author(s):  
Ruipeng Tan

This paper focuses on comparing portfolio management and construction before and after the coronavirus outbreak. First, the paper presents the importance of building portfolios that allow investors to diversify their risks. Theories of portfolio management are discussed in this section to show how they have been developed to guide investing and reduce risk. The paper then moves on to show the impact of the pandemic on the financial market and on portfolio management. Sample data on tech stock returns are collected to perform a Monte Carlo simulation on portfolio construction and identify the efficient portfolio before and after the COVID-19 outbreak. The efficient portfolio is built based on Markowitz portfolio theory to find the optimal asset combination. Comparisons between these portfolio constructions are made to identify the changes in portfolio management and construction in the pandemic era. In conclusion, the paper presents how the pandemic has changed and impacted investments and lists recommendations for future portfolio management and construction.
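
The Monte Carlo approach to portfolio construction described above can be sketched as follows: draw many random weight vectors, score each by Markowitz mean-variance criteria, and keep the best risk-adjusted combination. The returns below are synthetic placeholders rather than the tech-stock data used in the paper, and the risk-free rate is assumed to be zero.

```python
# Monte Carlo search for a mean-variance efficient portfolio (Markowitz-style).
import numpy as np

rng = np.random.default_rng(2021)
n_assets, n_days = 4, 500
# synthetic daily returns (assumption): each asset has its own drift and volatility
true_mu = np.array([0.0008, 0.0005, 0.0010, 0.0003])
true_sigma = np.array([0.02, 0.015, 0.03, 0.01])
returns = rng.normal(true_mu, true_sigma, size=(n_days, n_assets))

mu = returns.mean(axis=0) * 252                 # annualised expected returns
cov = np.cov(returns, rowvar=False) * 252       # annualised covariance matrix

best = {"sharpe": -np.inf}
for _ in range(50_000):
    w = rng.random(n_assets)
    w /= w.sum()                                # long-only weights summing to 1
    ret = float(w @ mu)
    vol = float(np.sqrt(w @ cov @ w))
    sharpe = ret / vol                          # risk-free rate assumed zero
    if sharpe > best["sharpe"]:
        best = {"sharpe": sharpe, "weights": w, "return": ret, "vol": vol}

print("weights:", np.round(best["weights"], 3),
      "return:", round(best["return"], 3),
      "volatility:", round(best["vol"], 3))
```

Running the same search on pre- and post-outbreak return windows, as the paper does with real data, yields two efficient portfolios whose weights and risk-return profiles can then be compared.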


2017 ◽  
Vol 26 ◽  
pp. 44-53
Author(s):  
Enrique Campbell ◽  
Amilkar Illaya-Ayza ◽  
Joaquín Izquierdo ◽  
Rafael Pérez-García ◽  
Idel Montalvo

Water Supply Network (WSN) sectorization is a broadly known technique aimed at enhancing water supply management. In general, existing methodologies for sectorization of WSNs are limited to assessing the impact of implementation on the reduction of background leakage, underestimating the increased capacity to detect new leakage events and undermining appropriate investment substantiation. In this work, we address this issue and present a methodology to optimize sector design. To this end, we combine the Short Run Economic Leakage Level concept (SRELL, corresponding to the leakage level that can occur in a WSN over a certain period of time and whose repair would be more costly than the benefits that could be obtained) with a non-deterministic optimization method based on Genetic Algorithms (GAs) and Monte Carlo simulation. As an example of application, the methodology is implemented on a WSN with 246 km of pipes, reporting a net profit of 72,397 $/year.
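
A heavily simplified sketch of the Monte Carlo part of such an evaluation is shown below: it estimates the expected annual volume lost to new leak events for a candidate sector layout, under the assumption that smaller sectors allow faster leak detection. The leak statistics and detection model are hypothetical, and the GA that would search over candidate sector layouts is omitted; only the Monte Carlo fitness evaluation is illustrated.

```python
# Monte Carlo estimate of yearly water lost to undetected new leaks per sector layout.
import numpy as np

rng = np.random.default_rng(5)

def expected_annual_leak_volume(sector_sizes_km, leaks_per_km_year=0.2,
                                leak_rate_m3_day=15.0, n_sims=5000):
    """Average, over simulated years, of the volume lost before new leaks are found."""
    totals = []
    for _ in range(n_sims):
        lost = 0.0
        for size in sector_sizes_km:
            n_leaks = rng.poisson(leaks_per_km_year * size)
            # assumed: detection time (days) grows with sector size
            detection_days = rng.exponential(2.0 * size, size=n_leaks)
            lost += leak_rate_m3_day * detection_days.sum()
        totals.append(lost)
    return float(np.mean(totals))

# two candidate partitions of the same 246 km network (illustrative only)
few_large_sectors = [82.0, 82.0, 82.0]
many_small_sectors = [24.6] * 10
print("expected loss, 3 sectors  (m3/yr):", round(expected_annual_leak_volume(few_large_sectors)))
print("expected loss, 10 sectors (m3/yr):", round(expected_annual_leak_volume(many_small_sectors)))
```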


Author(s):  
Shreshta Rajakumar Deshpande ◽  
Shobhit Gupta ◽  
Dennis Kibalama ◽  
Nicola Pivaro ◽  
Marcello Canova

Abstract Connectivity and automation have accelerated the development of algorithms that use route and real-time traffic information to improve energy efficiency. The evaluation of such benefits, however, requires establishing a reliable baseline that is representative of a real-world driving environment. In this context, virtual driver models are generally adopted to predict the vehicle speed based on route data and the presence of lead vehicles, in a way that mimics the response of human drivers. This paper proposes an Enhanced Driver Model (EDM) that forecasts the human response when driving in urban conditions, considering the effects of Signal Phasing and Timing (SPaT) by introducing the concept of Line-of-Sight (LoS). The model was validated against data collected on an instrumented vehicle driven on public roads by different human subjects. Using this model, a Monte Carlo simulation is conducted to determine the statistical distributions of fuel consumption and travel time on a given route, varying driver behavior (aggressiveness), traffic conditions and SPaT. This allows one to quantify the impact of uncertainties associated with real-world driving on fuel economy estimates.
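
The Monte Carlo layer of this evaluation can be sketched with a toy stand-in for the driver model: sample the driver and traffic factors named above and push them through a crude fuel and travel-time model to obtain the kind of statistical distributions the paper reports. The route length, signal count and fuel coefficients below are assumptions, not values from the EDM.

```python
# Monte Carlo over driver aggressiveness, traffic and signal outcomes (toy model).
import numpy as np

rng = np.random.default_rng(11)
route_km, n_signals = 5.0, 8
n_sims = 20_000

fuel_l, time_s = [], []
for _ in range(n_sims):
    aggressiveness = rng.uniform(0.5, 1.5)                # scales accel/decel behaviour
    cruise_kph = rng.normal(45.0, 5.0)                     # traffic-dependent cruise speed
    reds = rng.binomial(n_signals, 0.4)                    # signals caught on red (SPaT draw)
    stop_delay_s = rng.uniform(10.0, 40.0, size=reds).sum()

    travel_time = route_km / max(cruise_kph, 10.0) * 3600.0 + stop_delay_s
    # toy fuel model: base consumption plus a per-stop penalty that grows with
    # how aggressively the driver accelerates away from each stop
    fuel = 0.07 * route_km + reds * 0.015 * aggressiveness
    time_s.append(travel_time)
    fuel_l.append(fuel)

print("travel time mean/std (s):", round(float(np.mean(time_s))), round(float(np.std(time_s))))
print("fuel used   mean/std (L):", round(float(np.mean(fuel_l)), 3), round(float(np.std(fuel_l)), 3))
```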


2019 ◽  
Vol 11 (3) ◽  
pp. 168781401983083
Author(s):  
Yongjun Du ◽  
Zhenggeng Ye ◽  
Pan Zhang ◽  
Yaqi Guo ◽  
Zhiqiang Cai

The construction spectrum is a useful tool for investigating network reliability; it depends only on the network structure and is therefore called structure-invariant. Importance measures are efficient tools to quantify and rank the impact of edges within a network. This study considers a K-terminal network with n edges and assumes that all edges fail with equal probability. The article focuses on investigating importance measures of individual edges for the K-terminal network, including reliability achievement worth and reliability reduction worth, via the construction spectrum–based method. We establish the equations for reliability achievement worth and reliability reduction worth using the construction spectrum and determine the conditions under which the importance rankings generated by reliability achievement worth and reliability reduction worth depend only on the network structure through the construction spectrum. Similar results are obtained for reliability achievement worth and reliability reduction worth of pairs of edges. A construction spectrum–based Monte Carlo simulation is used to estimate reliability achievement worth and reliability reduction worth. Finally, a numerical example is presented to illustrate the application of these measures.
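
For orientation, the two importance measures can be estimated on a small K-terminal network by direct Monte Carlo simulation, as in the sketch below. This is not the paper's construction-spectrum estimator, the toy network and failure probability are assumptions, and the risk-ratio convention used here (expressed through system unreliability) may differ from the paper's exact definitions.

```python
# Direct Monte Carlo estimates of RAW and RRW for each edge of a toy K-terminal network.
import numpy as np

# 5-node toy network; the system works when all terminals are mutually connected
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (2, 4), (3, 4)]
terminals = {0, 3, 4}
p_fail = 0.2                   # equal failure probability for every edge
rng = np.random.default_rng(3)

def connected(up_edges):
    """True if all terminals lie in one component of the surviving subgraph."""
    parent = list(range(5))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for i, (u, v) in enumerate(edges):
        if up_edges[i]:
            parent[find(u)] = find(v)
    return len({find(t) for t in terminals}) == 1

def unreliability(fixed=None, n_sims=10_000):
    """Monte Carlo estimate of P(terminals disconnected); `fixed` forces edge states."""
    fail = 0
    for _ in range(n_sims):
        up = rng.random(len(edges)) > p_fail
        if fixed:
            for i, state in fixed.items():
                up[i] = state
        fail += not connected(up)
    return fail / n_sims

q = unreliability()
for i, e in enumerate(edges):
    raw = unreliability(fixed={i: False}) / q    # risk ratio with the edge always failed
    rrw = q / unreliability(fixed={i: True})     # risk ratio with the edge always working
    print(f"edge {e}: RAW={raw:.2f}  RRW={rrw:.2f}")
```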

