Monte Carlo Techniques
Recently Published Documents

Total documents: 556 (five years: 72)
H-index: 43 (five years: 4)

2021, Vol 9 (12), pp. 1322
Author(s): Aikaterini P. Kyprioti, Ehsan Adeli, Alexandros A. Taflanidis, Joannes J. Westerink, Hendrik L. Tolman

During landfalling tropical storms, predictions of the expected storm surge are critical for guiding evacuation and emergency response/preparedness decisions at both regional and national levels. Forecast errors related to storm track, intensity, and size affect these predictions and thus should be explicitly accounted for. The Probabilistic tropical storm Surge (P-Surge) model is the established approach of the National Weather Service (NWS) to achieve this objective. Historical forecast errors are used to specify probability distribution functions for the different storm features, ultimately quantifying the uncertainty in the National Hurricane Center (NHC) advisories. Surge statistics are estimated from the predictions across a storm ensemble generated by sampling features from these probability distribution functions. P-Surge currently relies on a full factorial sampling scheme to create this storm ensemble, combining representative values of each storm feature. This work investigates an alternative formulation that can be viewed as a seamless extension of the current NHC framework: a quasi-Monte Carlo (QMC) sampling implementation whose ultimate goal is to reduce the computational burden, providing surge predictions with the same degree of statistical reliability from a smaller number of sample storms. The definition of forecast errors adopted here directly follows published NWS practices, and different uncertainty levels are considered in the examined case studies to offer a comprehensive validation. This validation, covering several historical storms, clearly demonstrates the advantages QMC can offer.
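To illustrate the kind of QMC ensemble generation the abstract describes, the sketch below draws a two-dimensional Halton sequence and maps it through inverse normal CDFs to produce storm-feature perturbations. The feature set, distributions, and scales (cross-track error, intensity error) are invented for illustration and are not the NWS/P-Surge error models.

```python
from statistics import NormalDist

def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def qmc_storm_ensemble(n):
    """Low-discrepancy ensemble over two hypothetical storm features:
    cross-track error ~ N(0, 50 km) and intensity error ~ N(0, 10 kt).
    (Illustrative distributions, not the published NWS error statistics.)"""
    nd = NormalDist()
    ensemble = []
    for i in range(1, n + 1):
        u1, u2 = halton(i, 2), halton(i, 3)  # bases 2 and 3 give a 2D Halton point
        ensemble.append((50.0 * nd.inv_cdf(u1), 10.0 * nd.inv_cdf(u2)))
    return ensemble

storms = qmc_storm_ensemble(100)
mean_track = sum(s[0] for s in storms) / len(storms)
```

Because the Halton points fill the unit square far more evenly than pseudo-random draws, ensemble statistics stabilize with fewer sample storms, which is the computational saving the paper targets.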


Metrologia, 2021
Author(s): Manuel Marschall, Gerd Wuebbeler, Clemens Elster

Abstract: Supplement 1 to the GUM (GUM-S1) extends the GUM uncertainty framework to nonlinear functions and non-Gaussian distributions. For this purpose, it employs a Monte Carlo method that yields a probability density function for the measurand. This Monte Carlo method has been applied successfully in numerous applications throughout metrology. However, considerable criticism has been raised against the type A uncertainty evaluation of GUM-S1. Most of this criticism could be addressed by including prior information about the measurand, which, however, is beyond the scope of GUM-S1. We propose an alternative Monte Carlo method that allows prior information about the measurand to be included. The proposed method is based on a Bayesian uncertainty evaluation and applies a simple rejection sampling approach using the Monte Carlo techniques of GUM-S1. The range of applicability of the approach is explored theoretically and in terms of examples. The results are promising, leading us to conclude that many metrological applications could benefit from this approach. Software support is provided to ease its implementation.
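The rejection idea can be sketched in a few lines: propagate an illustrative measurement model with GUM-S1-style Monte Carlo, then accept each draw with probability proportional to a prior density on the measurand. The model, noise levels, and prior below are invented for the sketch (an indicator prior, so the rejection step reduces to truncation); they are not taken from the paper.

```python
import random

random.seed(1)

def gums1_samples(n):
    """GUM-S1-style propagation for an illustrative model Y = X1 + X2,
    with X1 ~ N(1.0, 0.1) (type A) and X2 ~ U(-0.2, 0.2) (type B)."""
    return [random.gauss(1.0, 0.1) + random.uniform(-0.2, 0.2) for _ in range(n)]

def prior(y):
    """Hypothetical prior knowledge: the measurand cannot be below 1.0
    (flat above, zero below)."""
    return 1.0 if y >= 1.0 else 0.0

# Rejection step: keep each GUM-S1 draw with probability prior(y) / max_prior.
# For a non-flat prior the same scheme works, with max_prior its maximum.
max_prior = 1.0
posterior = [y for y in gums1_samples(20000) if random.random() < prior(y) / max_prior]
mean_y = sum(posterior) / len(posterior)
```

The accepted draws approximate the Bayesian posterior of the measurand, so the standard GUM-S1 machinery is reused unchanged and only the filtering step is added.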


2021, Vol 922 (1), pp. 40
Author(s): Fani Dosopoulou, Jenny E. Greene, Chung-Pei Ma

Abstract: The binding energy liberated by the coalescence of supermassive black hole (SMBH) binaries during galaxy mergers is thought to be responsible for the low-density cores often found in bright elliptical galaxies. We use high-resolution N-body and Monte Carlo techniques to perform single- and multistage galaxy merger simulations and systematically study the dependence of the central galaxy properties on the binary mass ratio, the slope of the initial density cusps, and the number of mergers experienced. We study both the amount of depleted stellar mass (or mass deficit), M_def, and the radial extent of the depleted region, r_b. We find that r_b ≃ r_SOI and that M_def varies in the range 0.5–4 M_•, where r_SOI is the influence radius of the remnant SMBH and M_• its mass. The coefficients in these relations depend weakly on the binary mass ratio and remain remarkably constant through subsequent mergers. We conclude that the core size and mass deficit do not scale linearly with the number of mergers, making it hard to infer merger histories from observations. On the other hand, we show that both M_def and r_b are sensitive to the morphology of the galaxy merger remnant, and that adopting spherical initial conditions, as done in early work, leads to misleading results. Our models reproduce the range of values for M_def found in most observational work, but span nearly an order-of-magnitude range around the true ejected stellar mass.


2021, pp. 77-89
Author(s): Åsa Carlsson Tedgren, Rowan M. Thomson, Guillaume Landry, Gabriel Fonseca, Brigitte Reniers, ...

2021, Vol 13 (18), pp. 10098
Author(s): César Berna-Escriche, Ángel Pérez-Navarro, Alberto Escrivá, Elías Hurtado, José Luis Muñoz-Cobo, ...

This study presents a new methodology, based on Monte Carlo techniques, for evaluating the reliability of a carbon-free electricity generation system based on renewable sources. It takes as inputs the variation of the electricity demand and the fluctuations in the renewable supply, and it determines the renewable capacity that must be installed to guarantee a specified supply reliability level. Additionally, seeking to reduce this required renewable capacity, the methodology quantifies the improvements obtained by incorporating nuclear power and electricity storage. The methodology is of general application and can be implemented in different contexts, such as different time horizons and different future energy scenarios, for developing, emerging, and developed countries alike. The only requirement is a sufficient database from which to make predictions for future scenarios of electricity generation–demand balances. As an example of practical implementation, the reliability of the Spanish electrical system in 2040 has been forecasted. When the fluctuations in solar and wind power contributions are considered, very high values of installed power from these renewable sources are needed to reach a high system reliability. These values decrease substantially if contributions from nuclear and storage technologies are included.
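The core of such a reliability assessment can be sketched as a Monte Carlo estimate of the loss-of-load probability: sample demand and renewable output fluctuations hour by hour and count shortfall hours. All figures below (demand statistics, capacity factors, firm capacity) are placeholders, not the Spain-2040 scenario data of the study.

```python
import random

random.seed(7)

def loss_of_load_probability(installed_mw, hours=50000):
    """Monte Carlo sketch: demand ~ N(30 GW, 5 GW), renewable capacity
    factor ~ U(0.05, 0.60), plus a fixed firm (nuclear + storage) block.
    Returns the fraction of sampled hours with a supply shortfall."""
    firm_mw = 10000.0
    shortfalls = 0
    for _ in range(hours):
        demand = random.gauss(30000.0, 5000.0)
        renewable = installed_mw * random.uniform(0.05, 0.60)
        if renewable + firm_mw < demand:
            shortfalls += 1
    return shortfalls / hours

# More installed renewable capacity should lower the loss-of-load probability.
lolp_low_capacity = loss_of_load_probability(40000)
lolp_high_capacity = loss_of_load_probability(120000)
```

Sweeping `installed_mw` until the estimated probability drops below a target level mirrors the paper's procedure of sizing the renewable system for a chosen reliability.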


2021, Vol 9 (3), pp. 287-292
Author(s): Nur-Mammadova Nigar

Purpose: This article examines the uncertain conditions of the oil industry in the Republic of Azerbaijan during and after the crisis period of 2015-2017 and during COVID-19 in 2020. Design/Methodology: Turmoil in the global economy and volatile oil prices affected the economies of oil-producing countries. This instability also affected the economy of the Republic of Azerbaijan and led to a decrease in its key economic indicators, which calls for specific methods to identify risks and estimate their value. Findings: The reasons for the sharp drop in demand for oil and oil products are analysed using Monte Carlo techniques, with the impact on the Republic of Azerbaijan as an example. Practical implications: Compiling the forecast with the chosen methodology, the author refined the 2018 price forecast for the Azerbaijani oil brand Azeri Light in an uncertain environment and, based on these calculations, made forecasts for the coming years.
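A common way to build such a Monte Carlo price forecast, sketched below, is to simulate many geometric Brownian motion paths and read off forecast quantiles. The abstract does not state the author's price model, and the starting price, drift, and volatility here are placeholders, not values calibrated to Azeri Light data.

```python
import random, math

random.seed(42)

def simulate_price_paths(s0=50.0, mu=0.02, sigma=0.35, years=1.0, n_paths=20000):
    """One-step geometric Brownian motion price fan:
    S_T = S_0 * exp((mu - sigma^2/2) * T + sigma * sqrt(T) * Z), Z ~ N(0, 1).
    Parameters are illustrative placeholders."""
    return [s0 * math.exp((mu - 0.5 * sigma**2) * years
                          + sigma * math.sqrt(years) * random.gauss(0.0, 1.0))
            for _ in range(n_paths)]

prices = sorted(simulate_price_paths())
p5 = prices[int(0.05 * len(prices))]   # pessimistic (5th percentile) scenario
p95 = prices[int(0.95 * len(prices))]  # optimistic (95th percentile) scenario
```

The spread between the low and high percentiles is the uncertainty band a forecaster would report alongside the central price estimate.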


2021, Vol 263 (2), pp. 4672-4682
Author(s): Mathias Hinz, Júlio Apolinário Cordioli, Luca Alimonti, Bryce Gardner

Using Statistical Energy Analysis (SEA) to characterize the power flow within a vibroacoustic system is a challenging task when the subsystems have irregular shapes and complex construction. Retrieving analytical solutions for the ordinary SEA parameters is nearly impractical without restrictive simplifications, and periodicity is usually not exploitable due to the lack of repetition patterns. A promising option for performing the power balance in such cases is to filter part of the information contained in a Finite Element Method (FEM) model of the system in order to convert it into an SEA model. In this paper, the Lorentzian frequency average and nonparametric random matrix theory are applied to randomize the dynamic stiffness matrices of the FEM components of a system of industrial application. The resulting direct-field dynamic stiffness matrices are employed together with the diffuse field reciprocity relationship as a general framework to determine the energetic content of each component. The results obtained with this procedure are evaluated against those from classical SEA and Monte Carlo techniques.


Entropy, 2021, Vol 23 (7), pp. 902
Author(s): Xianghao Hou, Jianbo Zhou, Yixin Yang, Long Yang, Gang Qiao

Accurate 3D passive tracking of an underwater uncooperative target is of great significance for exploiting sea resources and ensuring the safety of maritime areas. In this paper, a 3D passive underwater uncooperative target tracking problem in a time-varying non-Gaussian environment is studied. To overcome the low-observability drawback inherent in passive target tracking, a distributed system of passive underwater observation buoys is considered, and the optimal topology of the distributed measurement system is designed based on nonlinear system observability analysis and the Cramer–Rao lower bound (CRLB). Then, since the unknown underwater environment leads to time-varying non-Gaussian disturbances in both the target dynamics and the measurements, a robust optimal nonlinear estimator, the adaptive particle filter (APF), is proposed. Based on Bayesian posterior probability and Monte Carlo techniques, the proposed algorithm uses real-time optimal estimation to characterize the complex noise online and tackle the underwater uncooperative target tracking problem. Finally, the proposed algorithm is tested on simulated data, and comprehensive comparisons with detailed discussions are made to demonstrate its effectiveness.
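The Monte Carlo backbone of any particle filter, including the paper's APF, is the predict/weight/resample loop. The toy below runs a bootstrap particle filter on a 1D constant-position target with heavy-tailed (Laplace) measurement noise as a stand-in for the non-Gaussian disturbances; the 3D dynamics, buoy geometry, and adaptive noise estimation of the actual APF are not reproduced here.

```python
import random, math

random.seed(3)

def particle_filter_1d(measurements, n_particles=2000):
    """Bootstrap particle filter for a 1D (nearly) static target with
    Laplace measurement noise; returns the final posterior mean."""
    particles = [random.uniform(-10.0, 10.0) for _ in range(n_particles)]
    for z in measurements:
        # Predict: small random-walk process noise.
        particles = [p + random.gauss(0.0, 0.1) for p in particles]
        # Update: non-Gaussian (Laplace) likelihood weights.
        weights = [math.exp(-abs(z - p)) for p in particles]
        # Resample (multinomial) proportionally to the weights.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return sum(particles) / len(particles)

# Noisy observations of a target sitting at position 2.0.
obs = [2.0 + random.gauss(0.0, 0.3) for _ in range(30)]
estimate = particle_filter_1d(obs)
```

Replacing the fixed Laplace likelihood with an online noise-model estimate is, roughly, what makes the paper's filter "adaptive".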


2021, Vol 2021, pp. 1-11
Author(s): Hanan Haj Ahmad

This paper describes two prediction methods for predicting the non-observed (censored) units under progressively Type-II censored samples. The lifetimes under consideration follow a new two-parameter Pareto distribution. Furthermore, point and interval estimation of the unknown parameters of the new Pareto model is obtained, with maximum likelihood and Bayesian estimation methods considered for that purpose. Since the Bayes estimators cannot be expressed explicitly, Gibbs sampling and Markov chain Monte Carlo techniques are utilized for the Bayesian calculations. The posterior predictive density of the non-observed units is used to construct predictive intervals. A simulation study evaluates the performance of the estimators via mean square errors and biases and identifies the best prediction method for the censored observations under the progressive Type-II censoring scheme, for different sample sizes and different censoring schemes.
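As a simplified illustration of the MCMC step, the sketch below runs a random-walk Metropolis sampler for the shape parameter of a standard one-parameter Pareto with a complete (uncensored) simulated sample and a flat prior. The paper's setting (a new two-parameter Pareto, progressive Type-II censoring, Gibbs updates) is richer; this only shows the Monte Carlo machinery.

```python
import random, math

random.seed(5)

# Simulated complete sample from a standard Pareto with shape alpha = 2.0.
true_alpha, n = 2.0, 200
data = [random.paretovariate(true_alpha) for _ in range(n)]
T = sum(math.log(x) for x in data)

def log_post(alpha):
    """Log-posterior of the shape under a flat prior on alpha > 0:
    proportional to n*log(alpha) - alpha*T."""
    return -math.inf if alpha <= 0 else n * math.log(alpha) - alpha * T

# Random-walk Metropolis chain (a stand-in for the paper's Gibbs/MCMC step).
alpha, chain = 1.0, []
for _ in range(20000):
    proposal = alpha + random.gauss(0.0, 0.2)
    if random.random() < math.exp(min(0.0, log_post(proposal) - log_post(alpha))):
        alpha = proposal
    chain.append(alpha)

post_mean = sum(chain[5000:]) / len(chain[5000:])  # discard burn-in
```

Posterior draws like `chain` are exactly what feeds the posterior predictive density used for the predictive intervals: for each retained `alpha`, one simulates the unseen units and pools the results.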

