Determining signal entropy in uncertainty space

Measurement ◽  
2021 ◽  
Vol 178 ◽  
pp. 109336
Author(s):  
R. Craig Herndon
Author(s):  
Seyed Kourosh Mahjour ◽  
Antonio Alberto Souza Santos ◽  
Manuel Gomes Correia ◽  
Denis José Schiozer

Abstract. The simulation process under uncertainty requires numerous reservoir models, which can be very time-consuming to run. Hence, selecting representative models (RMs) that capture the uncertainty space of the full ensemble is required. In this work, we compare two scenario reduction techniques: (1) Distance-based Clustering with Simple Matching Coefficient (DCSMC), applied before the simulation process using reservoir static data, and (2) a metaheuristic algorithm (the RMFinder technique), applied after the simulation process using reservoir dynamic data. We use these two methods as samples to investigate the effect of static and dynamic data usage on the accuracy and rate of the scenario reduction process, focusing on field development purposes. A synthetic benchmark case named UNISIM-II-D, which considers flow unit modelling, is used. The results showed that both scenario reduction methods are reliable in selecting the RMs for a specific production strategy. However, the RMs obtained for a defined strategy using the DCSMC method can be applied to other strategies while preserving the representativeness of the models, whereas strategy type plays a substantial role in selecting RMs with the metaheuristic method, so that each strategy has its own set of RMs. In the field development workflow that uses the metaheuristic algorithm, the number of required flow simulation models and the computational time are greater than in the workflow that applies the DCSMC method. Hence, it can be concluded that using static reservoir data in the scenario reduction process can be more reliable during the field development phase.
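As a rough illustration of the distance-based side of this comparison, the following sketch (not the authors' code; the greedy single-pass clustering and all names are illustrative) computes a Simple Matching Coefficient distance over categorical static attributes and returns one medoid per cluster as a representative model:

```python
import numpy as np

def smc_distance(a, b):
    """Simple Matching Coefficient distance: fraction of categorical
    attributes on which two models disagree."""
    a, b = np.asarray(a), np.asarray(b)
    return np.mean(a != b)

def select_representatives(models, n_clusters):
    """Toy distance-based clustering: assign each model to the nearest of
    n_clusters seed models (single pass, k-medoids flavour), then return
    the medoid of each cluster as the representative model (RM)."""
    n = len(models)
    dist = np.array([[smc_distance(models[i], models[j]) for j in range(n)]
                     for i in range(n)])
    seeds = list(range(n_clusters))          # first models as seeds (illustration only)
    clusters = {s: [] for s in seeds}
    for i in range(n):
        nearest = min(seeds, key=lambda c: dist[i, c])
        clusters[nearest].append(i)
    reps = []
    for members in clusters.values():
        # medoid: the member minimising total distance to its own cluster
        medoid = min(members, key=lambda i: dist[i, members].sum())
        reps.append(medoid)
    return reps
```

A real application would replace the seed choice with a proper clustering loop and feed in binarised static reservoir attributes.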


Author(s):  
Juan Luis Fernández-Martínez ◽  
Ana Cernea

In this paper, we present a supervised ensemble learning algorithm, called SCAV1, and its application to face recognition. This algorithm exploits the uncertainty space of the ensemble classifiers. Its design includes six different nearest-neighbor (NN) classifiers based on different and diverse image attributes: histogram, variogram, texture analysis, edges, bidimensional discrete wavelet transform, and Zernike moments. In this approach, each attribute, together with its corresponding type of analysis (local or global) and the distance criterion (p-norm), induces a different individual NN classifier. The ensemble classifier SCAV1 depends on a set of parameters: the number of candidate images used by each individual method to perform the final classification and the weight given to each individual classifier. The SCAV1 parameters are optimized/sampled in a supervised manner via the regressive particle swarm optimization algorithm (RR-PSO). The final classifier exploits the uncertainty space of SCAV1 and uses majority voting (Borda count) as the final decision rule. We show the application of this algorithm to the ORL and PUT image databases, obtaining very high and stable accuracies (100% median accuracy and an almost null interquartile range). In conclusion, exploring the uncertainty space of ensemble classifiers provides optimal results and seems to be the appropriate strategy to adopt for face recognition and other classification problems.
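The final decision rule can be illustrated with a minimal weighted Borda count (a generic sketch, not the authors' implementation; candidate labels and weights are made up):

```python
def borda_count(rankings, weights=None):
    """Combine ranked candidate lists from several classifiers via a
    weighted Borda count: a candidate at zero-based rank r in a list of
    length k receives (k - r) points, scaled by that classifier's weight.
    The candidate with the highest aggregated score wins."""
    weights = weights or [1.0] * len(rankings)
    scores = {}
    for ranking, w in zip(rankings, weights):
        k = len(ranking)
        for r, candidate in enumerate(ranking):
            scores[candidate] = scores.get(candidate, 0.0) + w * (k - r)
    return max(scores, key=scores.get)
```

For example, if two of three individual classifiers rank subject "A" first, the weighted point totals typically make "A" the ensemble decision; unequal weights let a trusted classifier overrule the others.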


2021 ◽  
Vol 14 (03) ◽  
Author(s):  
Kai Yao

Uncertain processes are used to model dynamic indeterminate systems associated with human uncertainty, and uncertain independent increment processes are a type of uncertain process with independent uncertain increments. This paper mainly verifies a basic property of the sample paths of uncertain independent increment processes, namely that uncertain independent increment processes defined on a continuous uncertainty space are contour processes, a type of uncertain process whose sample paths form a spectrum of skeleton paths. Based on this property, the extreme values and the time integral of an uncertain independent increment process are investigated, and their inverse uncertainty distributions are obtained.
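For flavour, the type of result obtained can be sketched as follows (our paraphrase in Liu-style uncertainty-theory notation; here Ψ_t⁻¹(α) is assumed to denote the inverse uncertainty distribution of X_t, and the precise regularity conditions are in the paper):

```latex
% Sketch only: for an independent increment process X_t on a continuous
% uncertainty space, with X_t having inverse uncertainty distribution
% \Psi_t^{-1}(\alpha), the contour-process property yields results of the form
\sup_{0 \le t \le s} X_t
  \;\sim\; \Upsilon_s^{-1}(\alpha) = \sup_{0 \le t \le s} \Psi_t^{-1}(\alpha),
\qquad
\int_0^s X_t \,\mathrm{d}t
  \;\sim\; \Lambda_s^{-1}(\alpha) = \int_0^s \Psi_t^{-1}(\alpha)\,\mathrm{d}t,
```

where "∼" is shorthand for "has inverse uncertainty distribution".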


Author(s):  
Ning-Cong Xiao ◽  
Libin Duan ◽  
Zhangchun Tang

Efficiently calculating the probability of failure and the reliability sensitivity of a structural system with dependent truncated random variables and multiple failure modes is challenging, mainly due to the complicated features and intersections of the multiple failure modes, as well as the correlated performance functions. In this article, a new surrogate-model-based reliability method is proposed for structural systems with dependent truncated random variables and multiple failure modes. Copula functions are used to model the correlation among the truncated random variables. A small set of uniformly distributed samples in the supported intervals is generated to cover the entire uncertainty space fully and properly. A surrogate model is constructed from the proposed training points using support vector machines to accurately approximate the relationships between the inputs and the system responses over almost the entire uncertainty space. Approaches to calculate the probability of failure and the reliability sensitivity of structural systems with truncated random variables and multiple failure modes based on the constructed surrogate model are derived. The accuracy and efficiency of the proposed method are demonstrated using two numerical examples.
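A minimal sketch of the surrogate idea (not the authors' code: a kernel ridge interpolator stands in for the SVM, the inputs are independent rather than copula-coupled, and the performance function is a toy):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between two row-sample sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_surrogate(X, y, gamma=1.0, reg=1e-6):
    """Kernel ridge surrogate (SVM stand-in): returns a callable
    approximating the performance function over the sampled space."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + reg * np.eye(len(X)), y)
    return lambda Q: rbf_kernel(Q, X, gamma) @ alpha

rng = np.random.default_rng(0)
# uniform training samples covering the truncated supports [-2, 2]^2
X = rng.uniform(-2.0, 2.0, size=(200, 2))
g = lambda x: 3.0 - x[:, 0] ** 2 - x[:, 1]      # toy performance function; g < 0 = failure
surrogate = fit_surrogate(X, g(X), gamma=0.5)

# probability of failure from cheap surrogate calls (independent inputs here;
# the paper couples them through copulas, omitted in this sketch)
Xmc = rng.uniform(-2.0, 2.0, size=(20000, 2))
pf = float(np.mean(surrogate(Xmc) < 0.0))
```

Reliability sensitivities would follow by differentiating the same surrogate-based failure probability with respect to distribution parameters.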


Author(s):  
Hany S. Abdel-Khalik ◽  
Dongli Huang ◽  
Ondrej Chvala ◽  
G. Ivan Maldonado

Uncertainty quantification is an indispensable analysis for nuclear reactor simulation, as it provides a rigorous approach by which the credibility of the predictions can be assessed. Focusing on the propagation of multi-group cross-sections, the major challenge lies in the enormous size of the uncertainty space. Earlier work has explored the use of the physics-guided coverage mapping (PCM) methodology to assess the quality of the assumptions typically employed to reduce the size of the uncertainty space. A reduced order modeling (ROM) approach has been further developed to identify the active degrees of freedom (DOFs) of the uncertainty space, comprising all the few-group cross-section parameters required in core-wide simulation. In the current work, a sensitivity study, based on the PCM and ROM results, is applied to identify a suitable compressed representation of the uncertainty space, rendering feasible the quantification and prioritization of the various sources of uncertainty. While the proposed developments are general to any reactor physics computational sequence, the approach is customized here to the TRITON-NESTLE computational sequence, simulating the BWR lattice model and the core model, which will serve as a demonstrative tool for the implementation of the algorithms.
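The ROM step of identifying active DOFs can be illustrated with a toy snapshot/SVD sketch (dimensions and the random linear "physics code" are purely hypothetical; a real study would probe the TRITON-NESTLE sequence instead):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 50 cross-section parameters, 100 model responses,
# but the responses truly depend on only a 5-dimensional active subspace.
n_params, n_active, n_out, n_samples = 50, 5, 100, 40
A = rng.normal(size=(n_active, n_params))    # hidden low-rank sensitivity
B = rng.normal(size=(n_out, n_active))
model = lambda dx: B @ (A @ dx)              # stand-in for the physics code

# Snapshot-based ROM: probe the code with random parameter perturbations
# and inspect the numerical rank of the response matrix.
X = rng.normal(size=(n_params, n_samples))
Y = np.column_stack([model(X[:, j]) for j in range(n_samples)])

s = np.linalg.svd(Y, compute_uv=False)
tol = 1e-10 * s[0]
active_dofs = int(np.sum(s > tol))           # effective DOFs of the uncertainty space
```

The count of singular values above the tolerance recovers the dimension of the active subspace, which is the compressed representation the sensitivity study then works with.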


2019 ◽  
Vol 1180 ◽  
pp. 012003
Author(s):  
E Aujero ◽  
M Frondoza ◽  
E De Lara-Tuprio ◽  
R Eden ◽  
T R Teng

2015 ◽  
Vol 06 (02) ◽  
pp. 1550008 ◽  
Author(s):  
CÉLINE GUIVARCH ◽  
STÉPHANIE MONJON ◽  
JULIE ROZENBERG ◽  
ADRIEN VOGT-SCHILB

Energy security improvement is often presented as a co-benefit of climate policies. This paper evaluates that claim. It investigates whether climate policy would improve energy security, while accounting for the difficulties entailed by the many-faceted nature of the concept and the large uncertainties in the determinants of future energy systems. A multi-dimension analysis grid is used to capture the energy security concept, and a database of scenarios allows us to explore the uncertainty space. The results, focusing on Europe, reveal that climate policy has no unequivocal effect across all dimensions of energy security. Moreover, time matters significantly: the impact of climate policies is mixed in the short term and broadly positive in the medium term. In the long term, there is a risk of degradation of energy security. Lastly, we examine the robustness of our results to uncertainties in the drivers of economic growth, the availability of fossil fuels, and the potential of low-carbon technologies, and find that they are sensitive mainly to fossil fuel availability, low-carbon technologies in the energy sector, and improvements in energy efficiency.


2021 ◽  
Author(s):  
Lea Beusch ◽  
Zebedee Nicholls ◽  
Lukas Gudmundsson ◽  
Mathias Hauser ◽  
Malte Meinshausen ◽  
...  

Abstract. Producing targeted climate information at the local scale, including major sources of climate change projection uncertainty for diverse emission scenarios, is essential to support climate change mitigation and adaptation efforts. Here, we present the first chain of computationally efficient Earth System Model (ESM) emulators that rapidly translates greenhouse gas emission pathways into spatially resolved time series of annual-mean temperature anomaly fields, accounting for both forced climate response and natural variability uncertainty at the local scale. By combining the global-mean, emissions-driven emulator MAGICC with the spatially resolved emulator MESMER, ESM-specific as well as constrained probabilistic emulated ensembles can be derived. This emulation chain can hence build on and extend large multi-ESM ensembles such as the ones produced within the 6th phase of the Coupled Model Intercomparison Project (CMIP6). The main extensions are threefold. (i) A more thorough sampling of the forced climate response and the natural variability uncertainty is possible, with millions of emulated realizations readily created. (ii) The same uncertainty space can be sampled for any emission pathway, which is not the case in CMIP6, where some of the most societally relevant strong mitigation scenarios have been run by only a small number of ESMs. (iii) Other lines of evidence to constrain future projections, including observational constraints, can be introduced, which helps to refine projected future ranges beyond the multi-ESM ensemble's estimates. In addition to presenting results from the coupled MAGICC-MESMER emulator chain, we carry out an extensive validation of MESMER, which is trained on and applied to multiple emission pathways for the first time in this study. The newly developed MAGICC-MESMER coupled emulator will allow unprecedented assessments of the implications of a multitude of emission pathways at the regional scale.
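The two-stage chain can be caricatured in a few lines (a toy stand-in, not MAGICC or MESMER: the cumulative-emissions coefficient, the spatial pattern, and the white-noise variability model are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

def global_emulator(emissions, tcre=0.0005):
    """Stage 1 (MAGICC-like stand-in): map an annual CO2 emission pathway
    (GtCO2/yr) to global-mean temperature anomaly via a toy linear
    cumulative-emissions response; the coefficient is illustrative."""
    return tcre * np.cumsum(emissions)

def local_emulator(t_glob, pattern, sigma, n_real, rng):
    """Stage 2 (MESMER-like stand-in): scale the global signal onto a grid
    with a fixed spatial pattern, then add independent local variability
    to produce many emulated realizations."""
    forced = np.outer(t_glob, pattern)                    # (years, cells)
    noise = rng.normal(0.0, sigma, size=(n_real,) + forced.shape)
    return forced + noise                                 # (real, years, cells)

emissions = np.full(80, 40.0)          # constant 40 GtCO2/yr for 80 years
pattern = np.array([1.4, 1.0, 0.6])    # e.g. amplified land vs muted ocean cells
fields = local_emulator(global_emulator(emissions), pattern, 0.2, 1000, rng)
```

The point of the chain is that stage 2 is cheap: once the forced pattern is set, millions of variability realizations can be drawn for any pathway fed through stage 1.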


2011 ◽  
Vol 11 (7) ◽  
pp. 20433-20485 ◽  
Author(s):  
L. A. Lee ◽  
K. S. Carslaw ◽  
K. Pringle ◽  
G. W. Mann ◽  
D. V. Spracklen

Abstract. Sensitivity analysis of atmospheric models is necessary to identify the processes that lead to uncertainty in model predictions, to help understand model diversity, and to prioritise research. Assessing the effect of parameter uncertainty in complex models is challenging and often limited by CPU constraints. Here we present a cost-effective application of variance-based sensitivity analysis to quantify the sensitivity of a 3-D global aerosol model to uncertain parameters. A Gaussian process emulator is used to estimate the model output across multi-dimensional parameter space using information from a small number of model runs at points chosen using a Latin hypercube space-filling design. Gaussian process emulation is a Bayesian approach that uses information from the model runs along with some prior assumptions about the model behaviour to predict model output everywhere in the uncertainty space. We use the Gaussian process emulator to calculate the percentage of expected output variance explained by uncertainty in global aerosol model parameters and their interactions. To demonstrate the technique, we show examples of cloud condensation nuclei (CCN) sensitivity to 8 model parameters in polluted and remote marine environments as a function of altitude. In the polluted environment, 95 % of the variance of CCN concentration is described by uncertainty in the 8 parameters (excluding their interaction effects) and is dominated by the uncertainty in the sulphur emissions, which explains 80 % of the variance. However, in the remote region parameter interaction effects become important, accounting for up to 40 % of the total variance. Some parameters are shown to have a negligible individual effect but a substantial interaction effect. Such sensitivities would not be detected in the commonly used single parameter perturbation experiments, which would therefore underpredict total uncertainty. Gaussian process emulation is shown to be an efficient and useful technique for quantifying parameter sensitivity in complex global atmospheric models.
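A compact sketch of this workflow (illustrative only: an RBF interpolator stands in for the full Gaussian process emulator, a brute-force conditional-variance estimate stands in for a proper Sobol analysis, and the "model" is a toy linear function rather than the aerosol model):

```python
import numpy as np

rng = np.random.default_rng(7)

def latin_hypercube(n, d, rng):
    """Latin hypercube design on [0,1]^d: one point per stratum per dimension."""
    strata = np.tile(np.arange(n), (d, 1))
    return (rng.permuted(strata, axis=1).T + rng.random((n, d))) / n

def fit_emulator(X, y, gamma=10.0, reg=1e-8):
    """RBF interpolator as a minimal stand-in for the GP emulator
    (posterior mean only; no predictive variance)."""
    k = lambda A, B: np.exp(-gamma * ((A[:, None] - B[None]) ** 2).sum(-1))
    alpha = np.linalg.solve(k(X, X) + reg * np.eye(len(X)), y)
    return lambda Q: k(Q, X) @ alpha

# Toy "model": parameter 0 dominates, parameter 2 is inert.
model = lambda X: 4.0 * X[:, 0] + X[:, 1]

X = latin_hypercube(120, 3, rng)             # small space-filling design
emu = fit_emulator(X, model(X))              # emulator from few "model runs"

def main_effect(emu, d, i, rng, n_outer=100, n_inner=500):
    """First-order variance Var_{x_i}(E[f | x_i]), estimated with cheap
    emulator evaluations instead of expensive model runs."""
    means = []
    for v in rng.random(n_outer):
        Q = rng.random((n_inner, d))
        Q[:, i] = v                          # fix parameter i, average the rest
        means.append(emu(Q).mean())
    return np.var(means)

total = np.var(emu(rng.random((20000, 3))))
fractions = [main_effect(emu, 3, i, rng) / total for i in range(3)]
```

The variance fractions correctly flag parameter 0 as dominant and parameter 2 as negligible, which is the kind of ranking the paper extracts for the 8 aerosol parameters.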

