Goal-oriented adaptive sampling under random field modelling of response probability distributions

2021 ◽  
Vol 71 ◽  
pp. 89-100
Author(s):  
Athénaïs Gautier ◽  
David Ginsbourger ◽  
Guillaume Pirot

In the study of natural and artificial complex systems, responses that are not completely determined by the considered decision variables are commonly modelled probabilistically, resulting in response distributions that vary across the decision space. We consider cases where the spatial variation of these response distributions concerns not only their mean and/or variance but also other features, including for instance shape or uni-modality versus multi-modality. Our contributions build upon a non-parametric Bayesian approach to modelling the fields of probability distributions induced in this way, and in particular upon a spatial extension of the logistic Gaussian model. The considered models deliver probabilistic predictions of response distributions at candidate points, allowing us, for instance, to perform (approximate) posterior simulations of probability density functions, to jointly predict multiple moments and other functionals of target distributions, and to quantify the impact of collecting new samples on the state of knowledge of the distribution field of interest. In particular, we introduce adaptive sampling strategies that leverage the potential of the considered random distribution field models to guide system evaluations in a goal-oriented way, with a view towards parsimoniously addressing calibration and related problems from non-linear (stochastic) inversion and global optimisation.
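
As a minimal illustration of goal-oriented adaptive sampling (a deliberately simplified sketch, not the authors' logistic Gaussian construction), the following loop refits a Gaussian process surrogate after every evaluation and queries the candidate point with the largest predictive uncertainty; the toy response function and all parameter values are assumptions.

```python
# Simplified goal-oriented adaptive sampling: refit a GP surrogate after
# each evaluation, then sample where predictive uncertainty is largest.
# Illustrative only: the paper models full response distributions, not
# just a noisy mean response as done here.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def noisy_response(x):                      # stand-in stochastic system
    return np.sin(6 * x) + 0.3 * rng.standard_normal(x.shape)

X = rng.uniform(0, 1, (5, 1))               # initial design
y = noisy_response(X).ravel()
candidates = np.linspace(0, 1, 200).reshape(-1, 1)

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=0.3**2).fit(X, y)
    _, sd = gp.predict(candidates, return_std=True)
    x_next = candidates[[np.argmax(sd)]]    # most uncertain candidate, shape (1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, noisy_response(x_next).ravel())
```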

1997 ◽  
Vol 161 ◽  
pp. 197-201 ◽  
Author(s):  
Duncan Steel

Abstract. Whilst lithopanspermia depends upon massive impacts occurring at speeds above some limit, the intact delivery of organic chemicals or other volatiles to a planet requires the impact speed to be below some other limit, such that a significant fraction of that material escapes destruction. Thus the two opposite ends of the impact speed distribution are the regions of interest in the bioastronomical context, whereas much modelling work on impacts delivers, or makes use of, only the mean speed. Here the probability distributions of impact speeds upon Mars are calculated for (i) the orbital distribution of known asteroids; and (ii) the expected distribution of near-parabolic cometary orbits. It is found that cometary impacts are far more likely to eject rocks from Mars (over 99 percent of cometary impacts occur at speeds above 20 km/s, but at most 5 percent of asteroidal impacts do); paradoxically, the objects impacting at speeds low enough to make organic/volatile survival possible (the asteroids) are those which are depleted in such species.
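
The speed relation underlying such calculations is standard: an impactor arriving with hyperbolic excess speed v_inf strikes at v = sqrt(v_inf^2 + v_esc^2), with v_esc ≈ 5 km/s for Mars. The sketch below uses placeholder encounter-speed distributions (not the paper's orbital-element calculations) purely to illustrate how the fraction of rock-ejecting impacts could be estimated.

```python
# Illustrative Monte Carlo: Mars impact speed follows
#   v_impact = sqrt(v_inf**2 + v_esc**2),
# with v_esc ~ 5 km/s for Mars. The encounter-speed (v_inf) distributions
# below are placeholders, not the paper's orbital calculations.
import numpy as np

rng = np.random.default_rng(1)
V_ESC = 5.0  # Mars escape speed, km/s

v_inf_asteroids = rng.normal(10.0, 3.0, 100_000).clip(min=0)   # assumed low-speed population
v_inf_comets    = rng.normal(35.0, 10.0, 100_000).clip(min=0)  # assumed near-parabolic population

for name, v_inf in [("asteroids", v_inf_asteroids), ("comets", v_inf_comets)]:
    v_impact = np.sqrt(v_inf**2 + V_ESC**2)
    frac_fast = (v_impact > 20.0).mean()   # fraction fast enough to eject rocks
    print(f"{name}: P(v > 20 km/s) = {frac_fast:.2f}")
```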


Author(s):  
Abigail R. Wooldridge ◽  
Rod D. Roscoe ◽  
Shannon C. Roberts ◽  
Rupa Valdez ◽  
...  

The Diversity Committee of HFES has led sessions at the Annual Meeting for the past three years focused on improving diversity, equity and inclusion in the society, as well as on supporting human factors and ergonomics (HF/E) researchers and practitioners who aim to apply HF/E knowledge and principles to improve diversity, equity and inclusion through their work. In this panel, we bring together researchers actively engaged in designing technology and systems with issues of diversity, equity and inclusion in mind, to share insights and methods. Topics include the thoughtful design of sampling strategies and research approaches, alternative and participatory methods for understanding the impact of automation and technology on equity, scoping design problems to be inclusive and equitable through interdisciplinary partnerships, and the application of sociotechnical system design and team science to develop interdisciplinary teams. By sharing our experiences, we hope to prepare others to approach these topics successfully.


Author(s):  
Lu Chen ◽  
Handing Wang ◽  
Wenping Ma

Abstract. Real-world optimization applications in complex systems always contain multiple factors to be optimized, which can be formulated as multi-objective optimization problems. These problems have been solved by many evolutionary algorithms, such as MOEA/D, NSGA-III, and KnEA. However, as the numbers of decision variables and objectives increase, the computational cost of these algorithms becomes unaffordable. To reduce this cost on large-scale many-objective optimization problems, we propose a two-stage framework. The first stage combines a multi-tasking optimization strategy with a bi-directional search strategy: the original problem is reformulated as a multi-tasking optimization problem in the decision space to enhance convergence. To improve diversity, the second stage applies multi-tasking optimization to a number of sub-problems defined by reference points in the objective space. To show the effectiveness of the proposed algorithm, we test it on the DTLZ and LSMOP problems and compare it with existing algorithms; it outperforms the compared algorithms in most cases, showing advantages in both convergence and diversity.
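
A sketch of the reference-point decomposition underlying the second stage is given below: Das-Dennis points are generated on the unit simplex and each solution is assigned to its nearest reference direction, yielding one sub-problem per direction. The objective values are random stand-ins, and the multi-tasking machinery of the proposed framework is omitted.

```python
# Das-Dennis reference points on the unit simplex, plus a nearest-direction
# assignment that splits a population into one sub-problem per direction.
from itertools import combinations
import numpy as np

def das_dennis(n_obj, n_div):
    """All points with coordinates k/n_div summing to 1 (stars and bars)."""
    points = []
    for bars in combinations(range(n_div + n_obj - 1), n_obj - 1):
        parts = np.diff((-1,) + bars + (n_div + n_obj - 1,)) - 1
        points.append(parts / n_div)
    return np.array(points)

refs = das_dennis(n_obj=3, n_div=12)                       # 91 directions
refs_u = refs / np.linalg.norm(refs, axis=1, keepdims=True)

F = np.random.rand(200, 3)                                 # assumed objective values
F_u = F / np.linalg.norm(F, axis=1, keepdims=True)
assignment = np.argmax(F_u @ refs_u.T, axis=1)             # nearest direction by cosine
```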


Atmosphere ◽  
2021 ◽  
Vol 12 (6) ◽  
pp. 679
Author(s):  
Sara Cornejo-Bueno ◽  
David Casillas-Pérez ◽  
Laura Cornejo-Bueno ◽  
Mihaela I. Chidean ◽  
Antonio J. Caamaño ◽  
...  

This work presents a full statistical analysis and accurate prediction of low-visibility events due to fog at the A-8 motor-road in Mondoñedo (Galicia, Spain). The analysis covers two years of study, considering visibility time series and exogenous variables collected in the zone most affected by extreme low-visibility events. This paper thus has a two-fold objective: first, we carry out a statistical analysis to estimate the probability distributions that best fit fog-event duration, using the Maximum Likelihood method and an alternative method known as the L-moments method. This statistical study allows us to associate low-visibility depth with event duration, revealing a clear relationship that can be modelled with extreme-value distributions such as the Generalized Extreme Value and Generalized Pareto distributions. Second, we apply a neural network approach, trained by means of the ELM (Extreme Learning Machine) algorithm, to predict the occurrence of low-visibility events due to fog from atmospheric predictive variables. This study provides a full characterization of fog events at this motor-road, where orographic fog is predominant and causes significant traffic problems throughout the year. We also show that the ELM approach obtains highly accurate predictions of low-visibility events, with a Pearson correlation coefficient of 0.8, within a half-hour time horizon, enough to initialize protocols aimed at reducing the impact of these extreme events on traffic on the A-8 motor-road.
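
An Extreme Learning Machine is a single-hidden-layer network whose input weights are drawn at random and never trained; only the output weights are fitted, in closed form. A minimal sketch, with placeholder data standing in for the paper's atmospheric predictors:

```python
# Minimal Extreme Learning Machine: random hidden layer, closed-form
# ridge solution for the output weights. Feature and target data are
# placeholders for the meteorological predictors used in the paper.
import numpy as np

rng = np.random.default_rng(42)
X = rng.standard_normal((500, 8))            # assumed atmospheric predictors
y = rng.standard_normal(500)                 # assumed visibility target

n_hidden, lam = 100, 1e-3
W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights (never trained)
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)                            # hidden-layer activations

# Output weights: regularised least squares, solved in one shot.
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
y_pred = np.tanh(X @ W + b) @ beta
```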


2014 ◽  
Vol 505-506 ◽  
pp. 645-649
Author(s):  
Yu Wang

Traditional methods for determining airline fleet composition cannot reflect the impact of network effects on fleet composition. To address this problem for airlines operating a hub-and-spoke network, the passenger mix problem was incorporated into the fleet-composition model. The number of aircraft purchased for each fleet type, the frequency with which each aircraft type flies each leg, and the number of passengers spilled from each itinerary were treated as decision variables. The constraints included the maximum flying frequency on each leg, the available flying time each fleet type can provide, and the maximum number of passengers spilled from each flight leg. A model minimizing the fleet planning cost was constructed. A numerical example shows that the fleet planning cost derived from the proposed model is 46,266,381.64 Yuan, a reduction of 3,914,969.70 Yuan compared with the result from the traditional leg-based model. Hence, the proposed model is effective and feasible.
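
A toy version of such a model, sketched with the open-source PuLP library, is shown below; the cost figures, fleet types, and legs are invented, and the passenger-mix/spill terms of the full model are omitted for brevity.

```python
# Toy fleet-composition model in the spirit described above: choose how
# many aircraft of each type to buy and how often each type flies each
# leg, minimising cost under frequency and block-hour limits.
import pulp

types = ["T1", "T2"]
legs = ["L1", "L2"]
own = {"T1": 900.0, "T2": 950.0}              # assumed ownership cost per aircraft
op = {("T1", "L1"): 30.0, ("T1", "L2"): 45.0,
      ("T2", "L1"): 28.0, ("T2", "L2"): 50.0}  # assumed operating cost per flight
hrs = {("T1", "L1"): 2.0, ("T1", "L2"): 3.0,
       ("T2", "L1"): 2.0, ("T2", "L2"): 3.0}   # block hours per flight
avail = 10.0                                   # block hours each aircraft provides per day
demand = {"L1": 6, "L2": 4}                    # required daily frequency per leg

n = pulp.LpVariable.dicts("buy", types, lowBound=0, cat="Integer")
x = pulp.LpVariable.dicts("freq", [(t, l) for t in types for l in legs],
                          lowBound=0, cat="Integer")

prob = pulp.LpProblem("fleet", pulp.LpMinimize)
prob += (pulp.lpSum(own[t] * n[t] for t in types)
         + pulp.lpSum(op[t, l] * x[t, l] for t in types for l in legs))
for l in legs:                    # meet required frequency on each leg
    prob += pulp.lpSum(x[t, l] for t in types) >= demand[l]
for t in types:                   # flying time cannot exceed fleet availability
    prob += pulp.lpSum(hrs[t, l] * x[t, l] for l in legs) <= avail * n[t]
prob.solve()
```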


2021 ◽  
Author(s):  
Carlo Cristiano Stabile ◽  
Marco Barbiero ◽  
Giorgio Fighera ◽  
Laura Dovera

Abstract. Optimizing well locations for a green field is critical to mitigating development risks. Performing such workflows with reservoir simulations is very challenging due to the huge computational cost. Proxy models can instead provide accurate estimates at a fraction of the computing time. This study presents an application of new-generation functional proxies to optimize the well locations in a real oil field with respect to the discounted oil production over all the different geological realizations. The proxies are built with Universal Trace Kriging and are functional in time, allowing oil flows to be discounted over the asset lifetime. They are trained on reservoir simulations using randomly sampled well locations. Two proxies are created, for a pessimistic model (P10) and a mid-case model (P50), to capture the geological uncertainties. The optimization step uses the Non-dominated Sorting Genetic Algorithm, with the discounted oil productions of the two proxies as objective functions. An adaptive approach was employed: optimized points found in a first optimization were used to re-train the proxy models, and a second optimization run was performed.

The methodology was applied to a real oil reservoir to optimize the locations of four vertical production wells and was compared against reference locations. 111 geological realizations were available, in which one relevant uncertainty is the presence of possible compartments. The decision space, represented by the horizontal translation vectors for each well, was sampled using Plackett-Burman and Latin-Hypercube designs. A first application produced a proxy with poor predictive quality. Redrawing the areas to avoid overlaps and to confine the decision space of each well to one compartment improved the quality. This suggests that the proxy's predictive ability deteriorates in the presence of highly non-linear responses caused by sealing faults or by wells interchanging positions. We then followed a two-step adaptive approach: a first optimization was performed and the resulting Pareto front was validated with reservoir simulations; to further improve the proxy quality in this region of the decision space, the validated Pareto-front points were added to the initial dataset to retrain the proxy and rerun the optimization.

The final well locations were validated on all 111 realizations with reservoir simulations and resulted in an overall increase of the discounted production of about 5% compared to the reference development strategy. The adaptive approach, combined with functional proxies, proved successful in improving the workflow by purposefully enriching the training set with data points that enhance the effectiveness of the optimization step. Each optimization run relies on about 1 million proxy evaluations, which require negligible computational time. The same workflow carried out with standard reservoir simulations would have been practically unfeasible.
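
The adaptive loop can be sketched schematically as follows, with scikit-learn's GaussianProcessRegressor standing in for Universal Trace Kriging, random search standing in for NSGA-II, and an analytic placeholder (`reservoir_sim`) standing in for the reservoir simulator.

```python
# Schematic of the adaptive loop described above: fit a kriging-style proxy
# on simulated designs, optimise on the proxy, validate the best candidates
# with the "true" simulator, fold them back into the training set, repeat.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(7)

def reservoir_sim(X):                        # placeholder for the simulator
    return -np.sum((X - 0.3) ** 2, axis=1)

X_train = rng.uniform(0, 1, (40, 8))         # sampled well-location vectors
y_train = reservoir_sim(X_train)

for step in range(2):                        # the 2-step adaptive approach
    proxy = GaussianProcessRegressor().fit(X_train, y_train)
    X_cand = rng.uniform(0, 1, (100_000, 8))  # cheap proxy evaluations
    best = X_cand[np.argsort(proxy.predict(X_cand))[-10:]]
    y_best = reservoir_sim(best)             # validate with full simulations
    X_train = np.vstack([X_train, best])     # training set grows
    y_train = np.append(y_train, y_best)
```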


Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 477 ◽  
Author(s):  
Roman Baravalle ◽  
Fernando Montani

A major challenge in neuroscience is to understand the role of the higher-order correlation structure of neuronal populations. The dichotomized Gaussian (DG) model generates spike trains by thresholding a multivariate Gaussian random variable. The DG inputs are Gaussian distributed and thus have no interactions beyond second order; however, the thresholding can induce higher-order correlations in the outputs. We propose a combination of analytical and numerical techniques to estimate cumulants of the firing probability distributions above second order. Our findings show that a large amount of pairwise interaction in the inputs can drive the system into two possible regimes, one with low activity ("DOWN state") and another with high activity ("UP state"), and that the appearance of these states is due to a combination of the third- and fourth-order cumulants. This could be part of a mechanism that helps the neural code convey specific information about the stimuli, motivating us to examine the behaviour of the critical fluctuations through the Binder cumulant close to the critical point. We show, using the Binder cumulant, that higher-order correlations in the outputs generate a critical neural system that exhibits a second-order phase transition.
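
A minimal dichotomized Gaussian can be simulated directly: draw correlated Gaussians, threshold them into binary spike trains, and inspect the cumulants of the population spike count. The parameters below are illustrative, not those used in the paper.

```python
# Minimal dichotomized Gaussian: threshold a correlated Gaussian to get
# binary spike trains, then check that higher-order structure emerges in
# the outputs even though the inputs are purely pairwise.
import numpy as np

rng = np.random.default_rng(3)
n_neurons, n_samples, rho = 10, 50_000, 0.4

# Equicorrelated inputs: 1 on the diagonal, rho off-diagonal.
cov = np.full((n_neurons, n_neurons), rho) + (1 - rho) * np.eye(n_neurons)
z = rng.multivariate_normal(np.zeros(n_neurons), cov, size=n_samples)
spikes = (z > 0.8).astype(int)       # threshold sets the mean firing rate

k = spikes.sum(axis=1)               # population spike count per time bin
m = k.mean()
c2 = ((k - m) ** 2).mean()           # second cumulant (variance)
c3 = ((k - m) ** 3).mean()           # third cumulant: non-zero despite pairwise inputs
print(m, c2, c3)
```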


2017 ◽  
Vol 17 (11) ◽  
pp. 2017-2039 ◽  
Author(s):  
Alessandro Valentini ◽  
Francesco Visini ◽  
Bruno Pace

Abstract. Italy is one of the most seismically active countries in Europe. Moderate to strong earthquakes, with magnitudes of up to ∼ 7, have been historically recorded on many active faults. Currently, probabilistic seismic hazard assessments in Italy are mainly based on area source models, in which seismicity is modelled within a number of seismotectonic zones and the occurrence of earthquakes is assumed to be uniform. However, in the past decade, efforts have increasingly been directed towards using fault sources in seismic hazard models to obtain more detailed and potentially more realistic patterns of ground motion. In our model, we used two categories of earthquake sources. The first involves active faults, using geological slip rates to quantify seismic activity rates. We produced an inventory of all fault sources, with details of their geometric, kinematic, and energetic properties. The associated parameters were used to compute the total seismic moment rate of each fault. We evaluated the magnitude–frequency distribution (MFD) of each fault source using two models: a characteristic Gaussian model centred on the maximum magnitude and a truncated Gutenberg–Richter model. The second category involves grid-point seismicity, for which a fixed-radius smoothing approach and a historical catalogue were used to evaluate seismic activity. Under the assumption that deformation is concentrated along faults, we combined the MFD derived from the geometry and slip rates of active faults with the MFD from the spatially smoothed earthquake sources, assuming that the smoothed seismic activity in the vicinity of an active fault gradually decreases by a fault-size-driven factor. Additionally, we computed horizontal peak ground acceleration (PGA) maps for return periods of 475 and 2475 years. Although the ranges and gross spatial distributions of the expected accelerations obtained here are comparable to those obtained through methods involving seismic catalogues and classical zonation models, the spatial pattern of the hazard maps obtained with our model is far more detailed. Our model is characterized by areas of higher hazard corresponding to mapped active faults, whereas previous models yield expected accelerations that are almost uniformly distributed across large regions. In addition, we conducted sensitivity tests to determine the impact on the hazard results of the earthquake rates derived from the two MFD models for faults, and to determine the relative contributions of fault sources versus distributed seismic activity. We believe that our model represents an advancement in terms of the input data (quantity and quality) and the methodology used in the field of fault-based regional seismic hazard modelling in Italy.
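
The moment-balance step for a single fault can be sketched in a few lines: the geologic moment rate μ·A·ṡ is converted into an event rate at a characteristic magnitude via the Hanks-Kanamori relation. The fault parameters below are invented for illustration.

```python
# Back-of-envelope moment balance for a single fault source: geologic
# moment rate mu * area * slip_rate, converted into an earthquake rate
# at a characteristic magnitude. Parameters are illustrative.
MU = 3.0e10                 # shear modulus, Pa
area = 20e3 * 12e3          # fault area: 20 km x 12 km, in m^2
slip_rate = 1.0e-3          # 1 mm/yr, in m/yr
m_char = 6.5                # assumed characteristic magnitude

moment_rate = MU * area * slip_rate                  # N*m per year
m0_char = 10 ** (1.5 * m_char + 9.1)                 # Hanks-Kanamori, N*m
rate_char = moment_rate / m0_char                    # events per year
print(f"~1 event of Mw {m_char} every {1 / rate_char:.0f} years")
```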


Aerospace ◽  
2018 ◽  
Vol 5 (4) ◽  
pp. 109 ◽  
Author(s):  
Michael Schultz ◽  
Sandro Lorenz ◽  
Reinhard Schmitz ◽  
Luis Delgado

Weather events have a significant impact on airport performance and cause delayed operations when airport capacity is constrained. We quantify individual airport performance with respect to an aggregated weather-performance metric. Specific weather phenomena are categorized by the air traffic management airport performance (ATMAP) weather algorithm, which quantifies weather conditions at airports based on aviation routine meteorological reports (METARs). Our results are computed from a data set of 20.5 million European flights from 2013 and local weather data. A methodology is presented to evaluate the impact of weather events on airport performance and to select an appropriate threshold for significant weather conditions. To capture the impact of weather efficiently, we model departure and arrival delays with probability distributions that depend on airport size and meteorological impacts. The derived airport performance scores could be used in comprehensive air traffic network simulations to evaluate the network impact caused by weather-induced local performance deterioration.
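
The delay-modelling step can be illustrated by fitting a parametric distribution to delays within each weather class; the sketch below uses synthetic gamma-distributed delays, whereas the paper derives its weather classes from the ATMAP algorithm applied to METAR reports.

```python
# Sketch of the delay-modelling step: fit a parametric distribution to
# departure delays within each weather class and compare. Data are
# synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
delays = {                                  # minutes of departure delay (synthetic)
    "good_weather": rng.gamma(2.0, 5.0, 5000),
    "bad_weather":  rng.gamma(2.5, 9.0, 5000),
}

for label, sample in delays.items():
    shape, loc, scale = stats.gamma.fit(sample, floc=0)   # MLE fit, location pinned at 0
    print(f"{label}: mean = {shape * scale:.1f} min, shape = {shape:.2f}")
```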

