IHP: a dynamic heterogeneous parallel scheme for iterative or time-step methods—image denoising as case study

2020 ◽  
Vol 77 (1) ◽  
pp. 95-110
Author(s):  
Ruben Laso ◽  
José C. Cabaleiro ◽  
Francisco F. Rivera ◽  
M. Carmen Muñiz ◽  
José A. Álvarez-Dios


Robotica ◽  
2003 ◽  
Vol 21 (2) ◽  
pp. 153-161 ◽  
Author(s):  
S. Kilicaslan ◽  
Y. Ercan

A method for the time-suboptimal control of an industrial manipulator that moves along a specified path while keeping its end-effector orientation unchanged is proposed. The nonlinear system equations that describe the manipulator motion are linearized at each time step along the path. A method is developed that yields the control inputs (joint angular velocities) for time-suboptimal control of the manipulator. In the formulation, limits on the joint angular velocities and accelerations are also taken into account. A six-degree-of-freedom elbow-type manipulator is used in a case study to verify the developed method.
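The abstract's per-time-step scheme lends itself to a compact illustration: at each step the linearized kinematics relate the joint velocities to progress along the path, and the fastest admissible progress is chosen subject to the velocity and acceleration limits. The sketch below is only a minimal interpretation of that idea; the function names, the least-squares treatment of the linearized equations, and the single-variable linear program are assumptions, not the authors' formulation (in particular, the orientation constraint is left out).

```python
import numpy as np
from scipy.optimize import linprog

def step_joint_velocities(J, path_tangent, q_dot_prev, dt, q_dot_max, q_ddot_max):
    """One time step of an (assumed) linearized formulation: pick joint
    velocities q_dot = q_dot_unit * s_dot that follow the path direction and
    maximize the path speed s_dot under velocity and acceleration limits."""
    n = J.shape[1]
    # Joint velocities realizing a unit path speed (least-squares inverse of
    # the linearized kinematics J q_dot = path_tangent).
    q_dot_unit = np.linalg.lstsq(J, path_tangent, rcond=None)[0]

    # Maximize s_dot  <=>  minimize -s_dot (single decision variable).
    c = np.array([-1.0])

    # |q_dot_unit*s_dot| <= q_dot_max  and  |q_dot_unit*s_dot - q_dot_prev| <= q_ddot_max*dt
    A_ub, b_ub = [], []
    for i in range(n):
        A_ub += [[ q_dot_unit[i]], [-q_dot_unit[i]],
                 [ q_dot_unit[i]], [-q_dot_unit[i]]]
        b_ub += [q_dot_max[i], q_dot_max[i],
                 q_ddot_max[i] * dt + q_dot_prev[i],
                 q_ddot_max[i] * dt - q_dot_prev[i]]

    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=[(0.0, None)])
    s_dot = res.x[0]
    return q_dot_unit * s_dot
```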


2008 ◽  
Vol 8 (13) ◽  
pp. 3603-3622 ◽  
Author(s):  
F. Lasserre ◽  
G. Cautenet ◽  
C. Bouet ◽  
X. Dong ◽  
Y. J. Kim ◽  
...  

Abstract. In order to assess the complex mixing of atmospheric anthropogenic and natural pollutants over the East Asian region, we present a modelling tool that takes into account the main aerosols found simultaneously over China, Korea and Japan during springtime. Using the mesoscale RAMS (Regional Atmospheric Modeling System) tool, we present a simulation of natural (desert) dust events along with some of the most critical anthropogenic pollutants over East Asia, sulphur species (SO2 and SO42−) and Black Carbon (BC). In a one-week case study of dust events that occurred during late April 2005 over an area extending from the Gobi deserts to the vicinity of Japan, we satisfactorily model the behaviour of the different aerosol plumes. We focus on possible mixing of dust with the anthropogenic pollutants from megacities. For both natural and anthropogenic pollution, the model results are in fairly good agreement with the horizontal and vertical distributions of concentrations measured by in situ lidar and observed in remote-sensing data, PM10 data and the literature. In particular, we show that a simplified chemistry approach to this complex issue is sufficient to model this event, with a 3-h time step. The model reproduces the main patterns and orders of magnitude of the Aerosol Optical Thickness (AOT) and the species contributions (via the Ångström exponent) when compared with AErosol RObotic NETwork (AERONET) data.
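The species contributions mentioned above are compared via the Ångström exponent, which is conventionally derived from the AOT at two wavelengths through tau(lambda) ∝ lambda^(−alpha). A minimal sketch of that standard relation follows; the wavelengths and values are illustrative, not taken from the paper.

```python
import numpy as np

def angstrom_exponent(aot_1, aot_2, wavelength_1_nm, wavelength_2_nm):
    """Ångström exponent alpha from aerosol optical thickness at two wavelengths,
    assuming tau(lambda) ~ lambda**(-alpha).  Larger alpha indicates finer
    (e.g. anthropogenic) particles, smaller alpha coarser desert dust."""
    return -np.log(aot_1 / aot_2) / np.log(wavelength_1_nm / wavelength_2_nm)

# Example with typical AERONET wavelengths (440 nm and 870 nm, illustrative values).
alpha = angstrom_exponent(0.45, 0.30, 440.0, 870.0)
print(f"Angstrom exponent: {alpha:.2f}")
```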


1997 ◽  
Vol 36 (6) ◽  
pp. 711-720 ◽  
Author(s):  
Kathrin Baumann ◽  
Andreas Stohl

Abstract In September 1995, 18 gas balloon teams competed in the Gordon Bennett Cup, a long-distance ballooning event. The landing positions and travel times of all teams, and detailed information on the tracks of four teams, are available. A special version of the trajectory model FLEXTRA (flexible trajectories) is used that allows the heights of the calculated trajectories to be adjusted to the respective balloon heights at every computation time step. The comparison of calculated and observed balloon trajectories allows a validation of the trajectory model. In this case study, the agreement between the calculated trajectories and the balloon tracks was good, with average relative transport errors of less than 20% of the travel distance after 46 h of travel time. Most of the trajectory errors originate from interpolation errors and from the amplification of small position disturbances in divergent wind fields. Trajectory ensembles, which take into account stochastic errors occurring during the trajectory calculations, are shown to be very reliable in assessing the uncertainties of the computed trajectories. In the present study, the balloon tracks were enveloped by the ensemble trajectories most of the time, suggesting that errors in the analyzed wind fields were relatively small.
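A minimal sketch of the relative transport error quoted above, assuming the common definition (horizontal separation between the computed and observed positions, divided by the distance the balloon has travelled so far); this is an illustration, not necessarily FLEXTRA's exact metric.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points given in degrees."""
    p1, l1, p2, l2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((p2 - p1) / 2) ** 2
         + np.cos(p1) * np.cos(p2) * np.sin((l2 - l1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def relative_transport_error(balloon_track, model_track):
    """Horizontal separation of modelled and observed positions at each time,
    divided by the distance the balloon has travelled so far.  Both tracks are
    arrays of (lat, lon) sampled at the same times."""
    balloon = np.asarray(balloon_track)
    model = np.asarray(model_track)
    separation = great_circle_km(balloon[:, 0], balloon[:, 1],
                                 model[:, 0], model[:, 1])
    leg = great_circle_km(balloon[:-1, 0], balloon[:-1, 1],
                          balloon[1:, 0], balloon[1:, 1])
    travelled = np.concatenate(([np.nan], np.cumsum(leg)))  # undefined at t = 0
    return separation / travelled
```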


2016 ◽  
Vol 31 (1) ◽  
pp. 33-41 ◽  
Author(s):  
Fayçal Djellouli ◽  
Abderrazak Bouanani ◽  
Kamila Baba-Hamed

Abstract Drought is an insidious natural hazard in many parts of the world. It originates from a persistent shortage of precipitation over a specific region for a specific period of time, and it has both conceptual and operational definitions. The impact of drought on an activity, group, or environmental sector depends on the extent of the water shortage and on ground conditions. Algeria, and especially its western region, has experienced several periods of drought over the last century, from 1975 to the present day. The most recent droughts, in 1981, 1989, 1990, 1992, 1994 and 1999, were characterized by their intensity and spatial extent. Drought is identified using various drought indices (meteorological, hydrological and agricultural). In this research, we focus on meteorological drought in order to assess the reliability of these indices under changing climatic conditions. Data were recorded for the period 1980–2009 in the wadi Louza catchment (NW Algeria). To describe and monitor periods of drought severity, we calculated the correlation between two meteorological drought indices, the Standardised Precipitation Index (SPI) and the Effective Drought Index (EDI). The results show that the watershed of wadi Louza has experienced severe meteorological drought. The correlation between the two indices was good for all time steps, and the best agreement was found for the 9-month time step. The obtained results may provide scientific support for the fight against droughts.
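For reference, a minimal sketch of the standard SPI computation at a chosen aggregation window (e.g. the 9-month time step found best above): precipitation is summed over the window, a gamma distribution is fitted, and the cumulative probabilities are mapped to standard-normal quantiles. This follows the usual SPI definition and is not the authors' code.

```python
import numpy as np
import pandas as pd
from scipy import stats

def spi(monthly_precip, window=9):
    """Standardised Precipitation Index for a given aggregation window
    (in months), using the usual gamma-to-normal transformation."""
    precip = pd.Series(monthly_precip, dtype=float)
    aggregated = precip.rolling(window).sum().dropna()

    # Fit a gamma distribution to the aggregated precipitation (location fixed at 0).
    shape, _, scale = stats.gamma.fit(aggregated[aggregated > 0], floc=0)

    # Probability of zero precipitation handled separately (mixed distribution).
    p_zero = (aggregated == 0).mean()
    cdf = p_zero + (1 - p_zero) * stats.gamma.cdf(aggregated, shape, scale=scale)

    # Map the cumulative probabilities to standard-normal quantiles.
    return pd.Series(stats.norm.ppf(cdf), index=aggregated.index)
```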


2011 ◽  
Vol 8 (2) ◽  
pp. 2373-2422 ◽  
Author(s):  
T. Krauße ◽  
J. Cullmann

Abstract. The development of methods for estimating the parameters of hydrological models under uncertainty has been of high interest in hydrological research over recent years. In particular, methods that treat the estimation of hydrological model parameters as a geometric search for a set of robustly performing parameter vectors, applying the concept of data depth, have found growing research interest. Bárdossy and Singh (2008) presented a first proposal and applied it to the calibration of a conceptual rainfall-runoff model with a daily time step. Krauße and Cullmann (2011) further developed this method and applied it in a case study to calibrate a process-oriented hydrological model with an hourly time step, focusing on flood events in a fast-responding catchment. The results of both studies showed the potential of applying the principle of data depth. However, the weak point of the presented approach also became obvious. The algorithm identifies a set of model parameter vectors with high model performance and subsequently generates a set of parameter vectors with high data depth with respect to the first set. Both steps are repeated iteratively until a stopping criterion is met. In the first step, the estimation of the good parameter vectors is based on the Monte Carlo method. The major shortcoming of this method is that it depends strongly on a number of samples that grows exponentially with the dimensionality of the problem. In this paper we present another robust parameter estimation strategy, which applies a proven search strategy for high-dimensional parameter spaces, particle swarm optimisation, in order to identify a set of good parameter vectors with given uncertainty bounds. The generation of deep parameter vectors follows Krauße and Cullmann (2011). The method was compared with the Monte Carlo based robust parameter estimation algorithm in the case study of Krauße and Cullmann (2011), calibrating a process-oriented distributed hydrological model for flood forecasting in a small catchment characterised by extreme process dynamics. In a second case study, the comparison is repeated on a problem of higher dimensionality that considers additional parameters of the soil module.
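A minimal sketch of the particle swarm step described above, i.e. the search for a set of well-performing parameter vectors that replaces the Monte Carlo sampling; the objective, bounds, swarm settings and the performance threshold are illustrative assumptions, and the subsequent data-depth step is omitted.

```python
import numpy as np

def pso_good_parameters(objective, bounds, n_particles=30, n_iter=200,
                        w=0.7, c1=1.5, c2=1.5, threshold=0.8, seed=0):
    """Particle swarm optimisation over a box-bounded parameter space.
    Returns every parameter vector whose performance exceeded `threshold`
    during the search (the 'good' set a data-depth step could then use)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size

    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                               # velocities
    p_best, p_val = x.copy(), np.array([objective(p) for p in x])
    g_best = p_best[p_val.argmax()]
    good = [p for p, f in zip(x, p_val) if f >= threshold]

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([objective(p) for p in x])

        improved = val > p_val
        p_best[improved], p_val[improved] = x[improved], val[improved]
        g_best = p_best[p_val.argmax()]
        good.extend(p for p, f in zip(x, val) if f >= threshold)

    return np.array(good)
```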


2020 ◽  
Vol 148 (8) ◽  
pp. 3341-3359
Author(s):  
X. Zheng ◽  
S. A. Klein ◽  
V. P. Ghate ◽  
S. Santos ◽  
J. McGibbon ◽  
...  

Abstract This paper presents a process-oriented evaluation of precipitating stratocumulus and its transition to cumulus in version 1 of the Energy Exascale Earth System Model (E3SMv1), using comprehensive case-study observations from a field campaign of the Atmospheric Radiation Measurement (ARM) program. The E3SMv1 single-column model (SCM) representation of the marine boundary layer and its low clouds and precipitation is compared to observations, including subcloud drizzle retrievals from a combination of Doppler radar and lidar backscatter measurements. The SCM is also compared to a large-eddy simulation (LES) of the same case. The combination of advanced remote sensing observations and LES is a powerful framework for evaluating the physical parameterizations of large-scale models. Given the observed large-scale environment, the E3SMv1 SCM realistically represents the evolution of clouds and boundary layer structure during the stratocumulus-to-cumulus transition. The model simulates well the liquid water path and its diurnal cycle in the stratocumulus period, as well as the two-layer vertical thermodynamic structure and lower cloud fraction in the transition period. E3SMv1's success in simulating the cloud in the stratocumulus period permitted examination of its precipitation processes. Here, problems were identified: E3SMv1 produces an unrealistically small subcloud precipitation fraction, an unrealistic double peak in the vertical profiles of precipitation mass, and drizzle that evaporates too close to the surface. Further model diagnostics determined that these unrealistic characteristics resulted from an overly long microphysics time step and an unrealistic parameterization of the precipitation fraction. These results imply that careful consideration of these issues is needed in order to better simulate precipitation processes in marine stratocumulus.


2021 ◽  
Vol 2069 (1) ◽  
pp. 012143
Author(s):  
Sorana Ozaki ◽  
Ryozo Ooka ◽  
Shintaro Ikeda

Abstract The operational energy of buildings accounts for one of the highest proportions of life-cycle carbon emissions. More efficient operation of facilities would yield significant energy savings but requires computational models that predict a building's future energy demand with high precision. To this end, various machine learning models have been proposed in recent years. These models' prediction accuracy, however, strongly depends on their internal structure and hyperparameters. The time and expertise required for their fine-tuning call for a more efficient solution. In the context of a case study, this paper describes the relationship between a machine learning model's prediction accuracy and its hyperparameters. Based on time-stamped recordings of outdoor temperature and electricity demand at a hospital in Japan, recorded every 30 minutes for more than four years, a deep neural network (DNN) ensemble model was used to predict electricity demand for the following sixty time steps. Specifically, we used automatic hyperparameter tuning methods such as grid search, random search, and Bayesian optimization. For predictions one time step ahead, all tuning methods reduced the RMSE to less than 50% of that obtained with non-optimized hyperparameters. The results attest to machine learning models' reliance on hyperparameters and to the effectiveness of their automatic tuning.
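A minimal sketch of random-search hyperparameter tuning scored by RMSE on time-ordered splits, in the spirit of the tuning described above; the model (scikit-learn's MLPRegressor rather than the paper's DNN ensemble), the search ranges, and the data handling are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import RandomizedSearchCV, TimeSeriesSplit
from sklearn.metrics import mean_squared_error

def tune_demand_model(X_train, y_train, X_test, y_test, n_trials=50):
    """Random search over a few MLP hyperparameters, scored by RMSE on
    time-ordered splits so validation data always follow training data."""
    search_space = {
        "hidden_layer_sizes": [(64,), (128,), (64, 64), (128, 64)],
        "alpha": np.logspace(-5, -1, 20),            # L2 regularisation strength
        "learning_rate_init": np.logspace(-4, -2, 20),
        "batch_size": [32, 64, 128],
    }
    search = RandomizedSearchCV(
        MLPRegressor(max_iter=500, random_state=0),
        search_space,
        n_iter=n_trials,
        scoring="neg_root_mean_squared_error",
        cv=TimeSeriesSplit(n_splits=4),
        random_state=0,
    )
    search.fit(X_train, y_train)

    rmse = np.sqrt(mean_squared_error(y_test, search.predict(X_test)))
    return search.best_params_, rmse
```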


Water ◽  
2018 ◽  
Vol 10 (12) ◽  
pp. 1849 ◽  
Author(s):  
Mahmood Mahmoodian ◽  
Jairo Arturo Torres-Matallana ◽  
Ulrich Leopold ◽  
Georges Schutz ◽  
Francois H. L. R. Clemens

In this study, the applicability of a data-driven Gaussian Process Emulator (GPE) technique for developing a dynamic surrogate model of a computationally expensive urban drainage simulator is investigated. Using rainfall time series as the main driving force is a challenge in this regard because of the high dimensionality of the problem. However, this problem becomes less relevant when the focus is only on short-term simulations. The novelty of this research is the use of short-term rainfall time series as training parameters for the GPE, with the rainfall intensity at each time step treated as a separate parameter. A method to generate synthetic rainfall events for GPE training purposes is introduced as well. Here, an emulator is developed to predict the upcoming daily time series of the total wastewater volume in a storage tank and the corresponding Combined Sewer Overflow (CSO) volume. The Nash-Sutcliffe Efficiency (NSE) and the Volumetric Efficiency (VE) are calculated as emulation error indicators. For the case study considered here, the emulator speeds up the simulations by up to a factor of 380 at a low accuracy cost for the prediction of the total storage tank volume (medians of NSE = 0.96 and VE = 0.87). The occurrence of CSO events is detected in 82% of the cases, although at a considerable accuracy cost (medians of NSE = 0.76 and VE = 0.5). The applicability of the emulator for consecutive short-term simulations based on real observed rainfall time series is also validated, with high accuracy (NSE = 0.97, VE = 0.89).
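For reference, the two emulation-error indicators used above in their standard forms, Nash-Sutcliffe Efficiency and Volumetric Efficiency; a minimal sketch with illustrative data.

```python
import numpy as np

def nash_sutcliffe_efficiency(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is a perfect fit."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def volumetric_efficiency(observed, simulated):
    """VE = 1 - sum(|sim - obs|) / sum(obs); the fraction of volume matched at the right time."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum(np.abs(simulated - observed)) / np.sum(observed)

# Illustrative comparison of simulator output and emulator prediction (hypothetical values).
obs = np.array([120.0, 135.0, 150.0, 140.0, 130.0])   # e.g. tank volume per time step
sim = np.array([118.0, 138.0, 148.0, 143.0, 127.0])
print(nash_sutcliffe_efficiency(obs, sim), volumetric_efficiency(obs, sim))
```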

