A Hierarchical Route Guidance Framework for Off-Road Connected Vehicles

Author(s):  
Judhajit Roy ◽  
Nianfeng Wan ◽  
Angshuman Goswami ◽  
Ardalan Vahidi ◽  
Paramsothy Jayakumar ◽  
...  

A new framework for route guidance, as part of a path decision support tool for off-road driving scenarios, is presented in this paper. The algorithm accesses information gathered prior to and during a mission, which is stored as layers of a central map. The algorithm incorporates a priori knowledge of low-resolution soil and elevation information and real-time high-resolution information from on-board sensors. The high computational cost of finding the optimal path over a large-scale high-resolution map is mitigated by the proposed hierarchical path planning algorithm. A dynamic programming (DP) method generates the globally optimal path approximation based on low-resolution information. The optimal cost-to-go from each grid cell to the destination is calculated by back-stepping from the target and stored. A model predictive control (MPC) algorithm operates locally on the vehicle to find the optimal path over a moving radial horizon. The MPC algorithm uses the stored global optimal cost-to-go map in addition to high-resolution, locally available information. Efficacy of the developed algorithm is demonstrated in scenarios simulating static and moving obstacle avoidance, path finding in condition-time-variant environments, eluding adversarial line-of-sight detection, and connected fleet cooperation.
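The back-stepping cost-to-go computation described in the abstract can be sketched as follows. This is a minimal illustration (not the authors' implementation): it relaxes a per-cell traversal cost outward from the destination, Dijkstra-style, so that every grid cell ends up with its optimal cost-to-go; the 4-connected grid and the `cell_cost` layer are assumptions.

```python
import heapq

def cost_to_go(cell_cost, goal):
    """Optimal cost-to-go map on a 2D grid, computed backward from `goal`.

    cell_cost: 2D list of non-negative per-cell traversal costs.
    goal: (row, col) index of the destination cell.
    """
    rows, cols = len(cell_cost), len(cell_cost[0])
    INF = float("inf")
    ctg = [[INF] * cols for _ in range(rows)]
    ctg[goal[0]][goal[1]] = 0.0
    heap = [(0.0, goal)]
    while heap:
        c, (r, k) = heapq.heappop(heap)
        if c > ctg[r][k]:
            continue  # stale heap entry, a cheaper path was already found
        for dr, dk in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nk = r + dr, k + dk
            if 0 <= nr < rows and 0 <= nk < cols:
                nc = c + cell_cost[nr][nk]  # cost of stepping into neighbour
                if nc < ctg[nr][nk]:
                    ctg[nr][nk] = nc
                    heapq.heappush(heap, (nc, (nr, nk)))
    return ctg
```

A local MPC planner would then score candidate paths over its moving horizon by terminal lookup into this stored map, combining it with high-resolution sensed costs.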

2017 ◽  
Vol 10 (3) ◽  
pp. 1383-1402 ◽  
Author(s):  
Paolo Davini ◽  
Jost von Hardenberg ◽  
Susanna Corti ◽  
Hannah M. Christensen ◽  
Stephan Juricke ◽  
...  

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km up to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available from a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including the details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking following resolution increase is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and the tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on the large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).


2016 ◽  
Author(s):  
Paolo Davini ◽  
Jost von Hardenberg ◽  
Susanna Corti ◽  
Hannah M. Christensen ◽  
Stephan Juricke ◽  
...  

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth-System Model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km up to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available from a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PBytes of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Center (LRZ) in Garching, Germany. About 140 TBytes of post-processed data are stored on the CINECA supercomputing center archives and are freely accessible to the community thanks to an EUDAT Data Pilot project. This paper presents the technical and scientific setup of the experiments, including the details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given: an improvement in the simulation of Euro-Atlantic atmospheric blocking following resolution increases is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and the tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on the large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).


2016 ◽  
Author(s):  
R. J. Haarsma ◽  
M. Roberts ◽  
P. L. Vidale ◽  
C. A. Senior ◽  
A. Bellucci ◽  
...  

Abstract. Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest the possibility of significant changes both in large-scale aspects of circulation and in small-scale processes and extremes. However, such high-resolution global simulations at climate time scales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centers and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other MIPs. Increases in High Performance Computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility to extend to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6-endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, “what are the origins and consequences of systematic model biases?”, but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.


2006 ◽  
Vol 7 (1) ◽  
pp. 61-80 ◽  
Author(s):  
B. Decharme ◽  
H. Douville ◽  
A. Boone ◽  
F. Habets ◽  
J. Noilhan

Abstract This study focuses on the influence of an exponential profile of saturated hydraulic conductivity, ksat, with soil depth on the water budget simulated by the Interaction Soil Biosphere Atmosphere (ISBA) land surface model over the French Rhône River basin. With this exponential profile, the saturated hydraulic conductivity at the surface increases by approximately a factor of 10, and its mean value increases in the root zone and decreases in the deeper region of the soil in comparison with the values given by Clapp and Hornberger. This new version of ISBA is compared to the original version in offline simulations using the Rhône-Aggregation high-resolution database. Low-resolution simulations, where all atmospheric data and surface parameters have been aggregated, are also performed to test the impact of the modified ksat profile at the typical scale of a climate model. The simulated discharges are compared to observations from a dense network consisting of 88 gauging stations. Results of the high-resolution experiments show that the exponential profile of ksat globally improves the simulated discharges and that the assumption of an increase in saturated hydraulic conductivity from the soil surface to a depth close to the rooting depth in comparison with values given by Clapp and Hornberger is reasonable. Results of the scaling experiments indicate that this parameterization is also suitable for large-scale hydrological applications. Nevertheless, low-resolution simulations with both model versions overestimate evapotranspiration (especially from the plant transpiration and the wet fraction of the canopy) to the detriment of total runoff, which emphasizes the need for implementing subgrid distribution of precipitation and land surface properties in large-scale hydrological applications.
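The exponential profile of saturated hydraulic conductivity discussed above has a standard functional form that can be sketched in a few lines. This is an illustration only, not ISBA's code: the reference value, the decay factor `f`, and the reference depth used below are assumptions.

```python
import math

def ksat_profile(ksat_ref, f, z, z_ref):
    """Exponential profile of saturated hydraulic conductivity with depth.

    k_sat(z) = ksat_ref * exp(-f * (z - z_ref)), with z positive downward
    in metres; ksat_ref is the value at the reference depth z_ref, and f
    (per metre) controls the rate of decay with depth.
    """
    return ksat_ref * math.exp(-f * (z - z_ref))
```

For example, choosing f = ln(10) per metre with a reference depth of 1 m yields a surface value ten times the reference, consistent with the roughly tenfold surface increase noted in the abstract; both numbers here are illustrative, not the paper's calibration.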


2019 ◽  
Vol 219 (Supplement_1) ◽  
pp. S137-S151 ◽  
Author(s):  
Julien Aubert

SUMMARY The geodynamo features a broad separation between the large scale at which Earth’s magnetic field is sustained against ohmic dissipation and the small scales of the turbulent and electrically conducting underlying fluid flow in the outer core. Here, the properties of this scale separation are analysed using high-resolution numerical simulations that approach closer to Earth’s core conditions than earlier models. The new simulations are obtained by increasing the resolution and gradually relaxing the hyperdiffusive approximation of previously published low-resolution cases. This upsizing process does not perturb the previously obtained large-scale, leading-order quasi-geostrophic (QG) and first-order magneto-Archimedes-Coriolis (MAC) force balances. As a result, upsizing causes only weak transients typically lasting a fraction of a convective overturn time, thereby demonstrating the efficiency of this approach to reach extreme conditions at reduced computational cost. As Earth’s core conditions are approached in the upsized simulations, Ohmic losses dissipate up to 97 per cent of the injected convective power. Kinetic energy spectra feature a gradually broadening self-similar, power-law spectral range extending over more than a decade in length scale. In this range, the spectral energy density profile of vorticity is shown to be approximately flat between the large scale at which the magnetic field draws its energy from convection through the QG-MAC force balance and the small scale at which this energy is dissipated. The resulting velocity and density anomaly planforms in the physical space consist of large-scale columnar sheets and plumes, respectively, co-existing with small-scale vorticity filaments and density anomaly ramifications. In contrast, magnetic field planforms keep their large-scale structure after upsizing. 
The small-scale vorticity filaments are aligned with the large-scale magnetic field lines, thereby minimizing the dynamical influence of the Lorentz force. The diagnostic outputs of the upsized simulations are more consistent with the asymptotic QG-MAC theory than those of the low-resolution cases that they originate from, but still feature small residual deviations that may call for further theoretical refinements to account for the structuring constraints of the magnetic field on the flow.


2016 ◽  
Vol 9 (11) ◽  
pp. 4185-4208 ◽  
Author(s):  
Reindert J. Haarsma ◽  
Malcolm J. Roberts ◽  
Pier Luigi Vidale ◽  
Catherine A. Senior ◽  
Alessio Bellucci ◽  
...  

Abstract. Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, “what are the origins and consequences of systematic model biases?”, but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.


Author(s):  
Y. Wang ◽  
B. Wu

The surface slopes of planetary bodies are important factors for exploration missions, such as landing site selection and rover manoeuvres. Generally, high-resolution digital elevation models (DEMs), such as those generated from HiRISE images of Mars, are preferred for generating detailed slopes with better fidelity to terrain features. Unfortunately, high-resolution datasets normally cover only small areas and are not always available, while lower-resolution datasets, such as MOLA, provide global coverage of the Martian surface. Slopes generated from a low-resolution DEM are based on a large baseline and are smoothed relative to the real situation. In order to carry out large-scale slope analysis of the Martian surface based on low-resolution data such as MOLA, while alleviating the smoothing of slopes due to the low resolution, this paper presents an amplifying function for slopes derived from low-resolution DEMs, based on the relationships between DEM resolutions and slopes. First, slope maps are derived from the HiRISE DEM (a metre-level-resolution DEM generated from HiRISE images) and a series of down-sampled HiRISE DEMs; the latter are used to simulate low-resolution DEMs. Then the high-resolution slope map is down-sampled to the same resolution as the slope maps from the lower-resolution DEMs, so that a pixel-wise comparison can be conducted. For each pixel on the slope map derived from the lower-resolution DEM, the same value as the down-sampled HiRISE slope can be reached by multiplying by an amplifying factor. Seven sets of HiRISE images with representative terrain types are used for correlation analysis, which shows that the relationship between the amplifying factors and the original MOLA slopes can be described by an exponential function. Verification using other datasets shows that, after applying the proposed amplifying function, the updated slope maps give better representations of slopes on the Martian surface than the original slopes.
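The pipeline described above can be sketched as follows: compute a slope map from a coarse DEM with a standard finite-difference gradient, then scale each pixel by an exponential amplifying factor. This is a minimal sketch; the coefficients `a` and `b` below are illustrative placeholders, not the paper's fitted values.

```python
import math

def coarse_slope_deg(dem, cell_size):
    """Slope (degrees) at each interior cell of a 2D DEM via central
    differences; border cells are left at zero for simplicity."""
    rows, cols = len(dem), len(dem[0])
    slopes = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            dzdx = (dem[r][c + 1] - dem[r][c - 1]) / (2.0 * cell_size)
            dzdy = (dem[r + 1][c] - dem[r - 1][c]) / (2.0 * cell_size)
            slopes[r][c] = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
    return slopes

def amplify(slope_deg, a=1.2, b=-0.05):
    """Apply an exponential amplifying factor a * exp(b * slope) to a
    coarse-DEM slope; a and b are hypothetical, for illustration only."""
    return slope_deg * a * math.exp(b * slope_deg)
```

On a planar ramp rising 1 m per 10 m cell, the central-difference slope comes out to atan(0.1), about 5.7 degrees, as expected from the geometry.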


Author(s):  
Seung-Kyum Choi ◽  
Mervyn Fathianathan ◽  
Dirk Schaefer

Advances in information technology significantly impact the engineering design process. The primary objective of this research is to develop a novel probabilistic decision support tool to assist management of structural systems under risk and uncertainty by utilizing a stochastic optimization procedure and IT tools. The proposed mathematical and computational framework will overcome the drawbacks of the traditional methods and will be critically demonstrated through large-scale structural problems. The efficiency of the proposed procedure is achieved by coupling the Karhunen-Loeve transform and the stochastic analysis of polynomial chaos expansion with common optimization procedures. The proposed technology, comprising new and adapted current capabilities, will provide robust and physically reasonable solutions for practical engineering problems.
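The polynomial chaos machinery mentioned above can be illustrated with a minimal one-dimensional example (an assumed illustration, not the paper's code): expand Y = f(X) for standard normal X in probabilists' Hermite polynomials He_n, recovering the chaos coefficients by Gauss-Hermite quadrature.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def pce_coefficients(f, order, quad_deg=30):
    """Chaos coefficients c_n of f(X), X ~ N(0, 1), in the He_n basis.

    c_n = E[f(X) He_n(X)] / E[He_n(X)^2], with E[He_n(X)^2] = n!.
    """
    x, w = He.hermegauss(quad_deg)   # nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)     # normalise to the N(0,1) density
    coeffs = []
    for n in range(order + 1):
        Hn = He.hermeval(x, [0.0] * n + [1.0])  # He_n evaluated at nodes
        coeffs.append(float(np.sum(w * f(x) * Hn)) / math.factorial(n))
    return coeffs
```

The mean of Y is then c_0 and its variance is the sum of c_n² n! over n ≥ 1, which is what makes such expansions convenient inside stochastic optimization loops.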


Author(s):  
Aleksandra Krstikj ◽  
Moisés Gerardo Contreras Ruiz Esparza ◽  
Jaime Mora Vargas ◽  
Laura Hervert Escobar ◽  
Cecilia López de la Rosa ◽  
...  

Author(s):  
Adam Mubeen ◽  
Laddaporn Ruangpan ◽  
Zoran Vojinovic ◽  
Arlex Sanchez Torrez ◽  
Jasna Plavšić

Abstract. Adverse effects of climate change are increasing around the world, and floods are posing significant challenges for water managers. With climate projections showing increased risks of storms and extreme precipitation, the use of traditional measures alone is no longer an option. Nature-Based Solutions (NBS) offer a suitable alternative to reduce the risk of flooding and provide multiple benefits. However, planning such interventions requires careful consideration of various factors and local contexts. The present paper contributes in this direction by proposing a methodology for the allocation of large-scale NBS using suitability mapping. The methodology was implemented within the toolboxes of ESRI ArcMap software in order to map suitability for four types of NBS interventions: floodplain restoration, detention basins, retention ponds, and river widening. The toolboxes developed were applied to a case study area in Serbia, the Tamnava River basin. Flood maps were used to determine the volume of floodwater that needs to be stored to reduce flood risk in the basin and subsequent downstream areas. The suitability maps produced indicate the potential of the new methodology and its application as a decision-support tool for the selection and allocation of large-scale NBS.
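The suitability-mapping logic behind such toolboxes is essentially a weighted overlay of criterion rasters. The paper implements this in ESRI ArcMap; the sketch below is a generic plain-Python stand-in, and the layer names and weights used in the example are assumptions.

```python
def suitability(layers, weights):
    """Weighted overlay of criterion grids.

    layers: dict mapping layer name -> 2D grid of scores in [0, 1]
            (e.g. normalised slope, land use, distance to river).
    weights: dict mapping layer name -> weight; weights should sum to 1.
    Returns a 2D grid of combined suitability scores.
    """
    names = list(layers)
    rows, cols = len(layers[names[0]]), len(layers[names[0]][0])
    out = [[0.0] * cols for _ in range(rows)]
    for name in names:
        w, grid = weights[name], layers[name]
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * grid[r][c]
    return out
```

Cells scoring above a chosen threshold would then be candidate sites for an intervention type such as a retention pond, with one overlay run per NBS type.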

