A simple method for implementing Monte Carlo tests

2019 ◽  
Vol 35 (3) ◽  
pp. 1373-1392 ◽  
Author(s):  
Dong Ding ◽  
Axel Gandy ◽  
Georg Hahn

Abstract We consider a statistical test whose p value can only be approximated using Monte Carlo simulations. We are interested in deciding whether the p value for an observed data set lies above or below a given threshold such as 5%. We want to ensure that the resampling risk, the probability of the (Monte Carlo) decision being different from the true decision, is uniformly bounded. This article introduces a simple open-ended method with this property, the confidence sequence method (CSM). We compare our approach to another algorithm, SIMCTEST, which also guarantees an (asymptotic) uniform bound on the resampling risk, as well as to other Monte Carlo procedures without a uniform bound. CSM is free of tuning parameters and conservative. It has the same theoretical guarantee as SIMCTEST and, in many settings, similar stopping boundaries. As it is much simpler than other methods, CSM is a useful method for practical applications.
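The abstract does not reproduce the algorithm, but CSM is built on Robbins' (1970) confidence sequence for a Bernoulli proportion, which suggests a sketch along the following lines. This is a minimal illustration, not the authors' reference implementation; `sample_exceeds` is a hypothetical user-supplied resampling oracle.

```python
import numpy as np
from scipy.stats import binom

def csm_decide(sample_exceeds, alpha=0.05, eps=1e-3, max_steps=100_000):
    """Decide whether the true p value lies above or below alpha.

    sample_exceeds: callable returning True when a fresh Monte Carlo
    replicate of the test statistic is at least as extreme as the
    observed one. eps bounds the resampling risk.
    """
    s = 0
    for n in range(1, max_steps + 1):
        s += int(sample_exceeds())
        # Robbins (1970): P(exists n : (n+1)*b(S_n; n, p) <= eps) <= eps,
        # so once alpha leaves the confidence sequence we may stop.
        if (n + 1) * binom.pmf(s, n, alpha) <= eps:
            return ("p > alpha" if s / n > alpha else "p <= alpha", n)
    return ("undecided", max_steps)

# Toy usage: the true p value is 0.02, so CSM should report "p <= alpha".
rng = np.random.default_rng(0)
print(csm_decide(lambda: rng.random() < 0.02))
```

The open-ended character of the method is visible in the loop: there is no fixed sample size, only a stopping boundary that the exceedance count must cross.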

2012 ◽  
Vol 39 (1) ◽  
pp. 40-47 ◽  
Author(s):  
Guillaume Guérin ◽  
Norbert Mercier

Abstract The determination of gamma dose rates is of primary importance in the field of luminescence dating. In situ measurements are usually performed by inserting dosimeters or a portable gamma spectrometer cell into the sediments. In this paper, Monte-Carlo simulations using the Geant4 toolkit allow the development of a new technique for in situ gamma dose rate evaluation: a spectrometer cell is placed on the surface of sediments under excavation and acquires successive spectra as the sediments are removed. The principle of this non-invasive technique is outlined and its potential discussed, especially for environments in which radioelements are heterogeneously distributed. For such cases, a simple method to reconstruct gamma dose rate values from surface measurements using an attenuator is discussed, and an estimation of errors is given for two simple cases. This technique appears to be applicable, but still needs experimental validation.
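A minimal sketch of the kind of layer-by-layer reconstruction the abstract alludes to, assuming simple exponential attenuation and laterally homogeneous layers. The attenuation coefficient, layer thickness, and linear model below are illustrative assumptions, not the paper's Geant4-based treatment.

```python
import numpy as np

# Recover per-layer gamma dose-rate contributions d[i] from successive
# surface measurements m[j] taken as layers are excavated, assuming
# each remaining layer is attenuated exponentially with depth.
mu, t = 0.08, 5.0              # cm^-1 and cm; placeholder values
n_layers = 6
A = np.zeros((n_layers, n_layers))
for j in range(n_layers):      # j-th measurement: layers 0..j-1 already removed
    for i in range(j, n_layers):
        A[j, i] = np.exp(-mu * t * (i - j))   # deeper layers attenuate more

d_true = np.array([1.0, 0.8, 1.5, 0.6, 0.9, 0.7])  # synthetic ground truth
m = A @ d_true                 # what the surface detector would record
d_hat = np.linalg.solve(A, m)  # upper-triangular system: exact recovery here
print(np.round(d_hat, 3))
```

In practice the measurements are noisy and the layers inhomogeneous, which is where the error estimation discussed in the paper comes in.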


2021 ◽  
Vol 75 (12) ◽  
Author(s):  
A. García-Abenza ◽  
A. I. Lozano ◽  
L. Álvarez ◽  
J. C. Oller ◽  
F. Blanco ◽  
...  

Abstract A self-consistent data set, with all the necessary inputs for Monte Carlo simulations of electron transport through gaseous tetrahydrofuran (THF) in the energy range 1–100 eV, has been critically compiled in this study. Accurate measurements of total electron scattering cross sections (TCSs) from THF have been obtained and used as reference values to validate the self-consistency of the proposed data set. Monte Carlo simulations of magnetically confined electron transport through a gas cell containing THF for different beam energies (3, 10 and 70 eV) and pressures (2.5 and 5.0 mTorr) have also been performed using a novel code developed in Madrid. To probe the accuracy of the proposed data set, the simulated results have been compared with the corresponding experimental data, the latter obtained with the same experimental configuration with which the TCSs were measured.
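As an illustration of how a compiled TCS enters such a transport simulation, the sketch below samples exponential free-flight lengths between collisions from a total cross section at a given gas pressure. The cross-section value and conditions are assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_free_path(sigma_m2, p_torr=5e-3, T=300.0):
    """Mean free path for an ideal gas at pressure p_torr and temperature T."""
    k_B = 1.380649e-23                  # Boltzmann constant, J/K
    n = (p_torr * 133.322) / (k_B * T)  # number density, m^-3
    return 1.0 / (n * sigma_m2)

def sample_path_lengths(sigma_m2, size=10_000):
    """Exponential free flights between collisions, as in MC transport."""
    lam = mean_free_path(sigma_m2)
    return -lam * np.log(rng.random(size))

# A TCS of order 1e-19 m^2 (hypothetical value for illustration).
paths = sample_path_lengths(1e-19)
print(paths.mean())  # close to the mean free path, ~0.06 m at 5 mTorr
```

The type of collision at each flight's end would then be chosen from the partial cross sections in the data set, which is why self-consistency of the compilation matters.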


1991 ◽  
Vol 46 (4) ◽  
pp. 357-362 ◽  
Author(s):  
Bernd M. Rode ◽  
Saiful M. Islam

Abstract Monte Carlo simulations for a Cu2+ ion in infinitely dilute aqueous solution were performed on the basis of a simple pair potential function, leading to a first-shell coordination number of 8, in contrast to experimental data. A simple method is therefore introduced that allows the direct construction of a pair potential containing the most relevant 3-body interactions, by means of a correction for the nearest-neighbour ligands in the ion's first hydration shell. This procedure leads to much improved results, without a significant increase in computational effort during potential construction and simulation.
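A schematic of the idea, with placeholder functional forms and constants (the original study fits ab initio potentials): a plain ion-ligand pair term, plus a correction applied only to ligands inside the first hydration shell.

```python
import numpy as np

def pair_energy(r, A=5e4, B=60.0):
    """Placeholder ion-water pair potential: repulsion plus attraction."""
    return A * np.exp(-3.0 * r) - B / r**4

def total_energy(r_ligands, r_shell=2.5, c3=8.0):
    """Pair energy plus a 3-body-style penalty for crowded first shells.

    r_shell and c3 are illustrative; the correction mimics the effect of
    the most relevant three-body interactions among first-shell ligands.
    """
    r = np.asarray(r_ligands, dtype=float)
    e = pair_energy(r).sum()
    n_shell = int((r < r_shell).sum())
    return e + c3 * n_shell * (n_shell - 1) / 2.0

# Six ligands in the first shell: the pairwise penalty disfavours
# over-coordination, pushing the simulation toward lower coordination.
print(total_energy([2.0, 2.1, 2.0, 2.2, 2.1, 2.0]))
```

The appeal noted in the abstract is that this remains a pair-potential-shaped evaluation, so the simulation cost barely changes.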


2019 ◽  
Vol 20 (22) ◽  
pp. 5648 ◽  
Author(s):  
Daniela Salado-Leza ◽  
Ali Traore ◽  
Erika Porcel ◽  
Diana Dragoe ◽  
Antonio Muñoz ◽  
...  

The use of nanoparticles, in combination with ionizing radiation, is considered a promising method to improve the performance of radiation therapies. In this work, we engineered mono- and bimetallic core-shell gold-platinum nanoparticles (NPs) grafted with poly(ethylene glycol) (PEG). Their radio-enhancing properties were investigated using plasmids as bio-nanomolecular probes and gamma radiation. We found that the presence of bimetallic Au:Pt-PEG NPs increased the induction of double-strand breaks, the signature of nanosize biodamage and the most difficult cell lesion to repair, by 90%. The radio-enhancement of Au:Pt-PEG NPs was found to be three times higher than that of Au-PEG NPs. This effect was scavenged by 80% in the presence of dimethyl sulfoxide, demonstrating the major role of hydroxyl radicals in the damage induction. Geant4-DNA Monte Carlo simulations were used to elucidate the physical processes involved in the radio-enhancement. We predicted enhancement factors of 40% and 45% for the induction of nanosize damage by mono- and bimetallic nanoparticles, respectively, which is attributed to secondary electron impact processes. This work contributes to a better understanding of the interplay between energy deposition and the induction of nanosize biomolecular damage, with Monte Carlo simulations serving as a simple method to guide the synthesis of new radio-enhancing agents.
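For concreteness, the arithmetic behind enhancement figures like the 40% and 45% above can be sketched as follows; the exact damage-scoring protocol in the Geant4-DNA study may differ.

```python
def enhancement_percent(yield_with_np, yield_without_np):
    """Relative increase (%) in simulated damage yield due to the NPs."""
    return 100.0 * (yield_with_np / yield_without_np - 1.0)

# Hypothetical yields: 1.4 nanosize-damage events per unit dose with NPs
# versus 1.0 without would give a 40% enhancement factor.
print(enhancement_percent(1.4, 1.0))  # -> 40.0
```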


1993 ◽  
Vol 77 (2) ◽  
pp. 377-378 ◽  
Author(s):  
John Paul Szalai

Kappa on a single item (Ksi) is proposed as a measure of interrater agreement when a single item or object is rated by multiple raters. A statistical test and Monte Carlo simulations are provided for testing the statistical significance of Ksi beyond chance agreement.
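A hedged illustration of a Monte Carlo significance test for single-item, multi-rater agreement. The pairwise-agreement index and the uniform null model below are assumptions made for the sketch, not necessarily Szalai's exact definition of Ksi.

```python
import numpy as np

rng = np.random.default_rng(1)

def pairwise_agreement(ratings):
    """Fraction of rater pairs assigning the same category to the item."""
    r = np.asarray(ratings)
    m = len(r)
    agree = sum(r[i] == r[j] for i in range(m) for j in range(i + 1, m))
    return agree / (m * (m - 1) / 2)

def mc_pvalue(ratings, n_categories, n_sim=10_000):
    """Monte Carlo p value: is observed agreement beyond chance?"""
    obs = pairwise_agreement(ratings)
    m = len(ratings)
    sims = np.array([
        pairwise_agreement(rng.integers(0, n_categories, m))
        for _ in range(n_sim)
    ])
    # +1 correction keeps the p value valid for a Monte Carlo test
    return (1 + (sims >= obs).sum()) / (n_sim + 1)

# Five raters, four categories: four of five agree on category 1.
print(mc_pvalue([1, 1, 1, 2, 1], n_categories=4))
```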


Author(s):  
Mazen Nassar ◽  
Ahmed Z. Afify ◽  
Mohammed Shakhatreh

This paper addresses the estimation of the unknown parameters of the alpha power exponential distribution (Mahdavi and Kundu, 2017) using nine frequentist estimation methods. We discuss the finite sample properties of the parameter estimates of the alpha power exponential distribution via Monte Carlo simulations. The potentiality of the distribution is analyzed by means of two real data sets from the fields of engineering and medicine. Finally, we use the maximum likelihood method to derive the estimates of the distribution parameters under competing risks data and analyze one real data set.
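A minimal sketch of one of the estimation methods (maximum likelihood) for the alpha power exponential distribution, whose density (Mahdavi and Kundu, 2017) is f(x) = (log a / (a - 1)) * lam * exp(-lam x) * a^(1 - exp(-lam x)) for x > 0, a > 0, a != 1. The synthetic data and optimizer settings are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(theta, x):
    """Negative log-likelihood of the APE distribution; theta = log(a, lam)."""
    a, lam = np.exp(theta)               # optimise on the log scale
    if abs(a - 1.0) < 1e-8:
        return np.inf                    # density undefined at a = 1
    z = 1.0 - np.exp(-lam * x)
    ll = (np.log(np.log(a) / (a - 1.0)) + np.log(lam)
          - lam * x + z * np.log(a))
    return -ll.sum()

rng = np.random.default_rng(2)
x = rng.exponential(scale=0.5, size=200)  # synthetic data for illustration

res = minimize(neg_loglik, x0=np.log([2.0, 1.0]), args=(x,),
               method='Nelder-Mead')
a_hat, lam_hat = np.exp(res.x)
print(a_hat, lam_hat)
```

The other eight frequentist methods studied in the paper (moments, percentiles, least squares, and so on) would replace `neg_loglik` with the corresponding objective.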


2007 ◽  
Vol 11 (2) ◽  
pp. 851-862 ◽  
Author(s):  
W. Wang ◽  
P. H. A. J. M. Van Gelder ◽  
J. K. Vrijling ◽  
X. Chen

Abstract. The modified rescaled adjusted range test (R/S test) of Lo (1991), the GPH test (Geweke and Porter-Hudak, 1983), and two approximate maximum likelihood estimation methods, i.e., Whittle's estimator (W-MLE) and the estimator implemented in S-Plus (S-MLE) based on the algorithm of Haslett and Raftery (1989), are evaluated through intensive Monte Carlo simulations for detecting the existence of long-memory. It is shown that it is difficult to find an appropriate lag q for Lo's test for different short-memory autoregressive (AR) and fractionally integrated autoregressive and moving average (ARFIMA) processes, which makes the use of Lo's test very tricky. In general, the GPH test outperforms Lo's test, but in cases where strong short-range dependence exists (e.g., AR(1) processes with φ=0.95 or even 0.99), the GPH test becomes useless, even for time series of large size. On the other hand, the estimates of d given by S-MLE and W-MLE seem to give a good indication of whether or not long-memory is present. The simulation results show that data size has a significant impact on the power of all four methods, because larger samples allow the asymptotic properties to be inspected better. Generally, the power of Lo's test and the GPH test increases with increasing data size, and the estimates of d with the GPH, S-MLE and W-MLE methods converge with increasing data size. If a sufficiently large data set is not available, one should be aware of the possible bias of the estimates. The four methods are applied to daily average discharge series recorded at 31 gauging stations with different drainage areas in eight river basins in Europe, Canada, and the USA to detect the existence of long-memory. The results show that the presence of long-memory in 29 daily series is confirmed by at least three methods, whereas the other two series are indicated to be long-memory processes by two methods. The intensity of long-memory in daily streamflow processes has only a very weak positive relationship with the scale of the watershed.
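A minimal sketch of the GPH log-periodogram regression, assuming the common bandwidth m ≈ √n; bandwidth choices and edge details vary across implementations, which is part of what the comparison above probes.

```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH (Geweke & Porter-Hudak, 1983) estimate of the memory parameter d.

    Regresses the log periodogram at the first m Fourier frequencies on
    log(4 sin^2(w/2)); d is minus the OLS slope.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(np.sqrt(n))                    # common bandwidth choice
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]    # skip the zero frequency
    I = (np.abs(dft) ** 2) / (2 * np.pi * n)   # periodogram
    X = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(X, np.log(I), 1)[0]
    return -slope

rng = np.random.default_rng(3)
print(gph_estimate(rng.standard_normal(2048)))  # white noise: d near 0
```

The sensitivity to short-range dependence noted in the abstract enters through the low-frequency periodogram ordinates, which an AR(1) component with φ near 1 contaminates.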


2021 ◽  
Vol 9 (11) ◽  
pp. 202-213
Author(s):  
J. Wanliss ◽  
R. Hernandez Arriaza ◽  
G. Wanliss ◽  
S. Gordon

Background and Objective: Higuchi's method of determining fractal dimension (HFD) occupies a valuable place in the study of a wide variety of physical signals. In comparison to other methods, it provides more rapid, accurate estimations for the entire range of possible fractal dimensions. However, a major difficulty in using the method is the correct choice of the tuning parameter (kmax) needed to compute the most accurate results. In the past, researchers have used various ad hoc methods to determine the appropriate kmax for their particular data. We provide a more objective method of determining, a priori, the best value of the tuning parameter for a data set of a given length. Methods: We create numerous simulations of fractional Brownian motion to perform Monte Carlo simulations of the distribution of the calculated HFD. Results: Experimental results show that HFD depends not only on kmax but also on the length of the time series, which enables the derivation of an expression for the appropriate kmax for an input time series of unknown fractal dimension. Conclusion: The Higuchi method should not be used indiscriminately without reference to the type of data whose fractal dimension is examined. Monte Carlo simulations with different fractional Brownian motions increase the confidence in the evaluation results.
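A standard implementation of Higuchi's estimator, making explicit the role of kmax; the normalisation follows Higuchi (1988). The fractional Brownian motion surrogate below is a plain Brownian motion (H = 0.5) for simplicity.

```python
import numpy as np

def higuchi_fd(x, kmax):
    """Higuchi fractal dimension of a 1-D signal.

    For each lag k <= kmax, average the normalised curve lengths of the
    k down-sampled sub-series; the FD is the slope of log L(k) vs log(1/k).
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    L = []
    for k in range(1, kmax + 1):
        Lk = 0.0
        for m in range(k):                      # k offset sub-series
            idx = np.arange(m, N, k)
            n_i = len(idx) - 1
            if n_i < 1:
                continue
            length = np.abs(np.diff(x[idx])).sum()
            Lk += length * (N - 1) / (n_i * k) / k   # Higuchi (1988) norm
        L.append(Lk / k)                        # mean over the k offsets
    ks = np.arange(1, kmax + 1)
    return np.polyfit(np.log(1.0 / ks), np.log(L), 1)[0]

rng = np.random.default_rng(4)
bm = rng.standard_normal(2000).cumsum()         # Brownian motion, FD ~ 1.5
print(higuchi_fd(bm, kmax=10))
```

Re-running this over many simulated paths and a grid of kmax values is exactly the kind of Monte Carlo study of the HFD distribution the paper describes.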


ISRN Ecology ◽  
2013 ◽  
Vol 2013 ◽  
pp. 1-9 ◽  
Author(s):  
Eric Marcon ◽  
Stéphane Traissac ◽  
Gabriel Lang

Ripley’s K function is the classical tool to characterize the spatial structure of point patterns. It is widely used in vegetation studies. Testing its values against a null hypothesis usually relies on Monte-Carlo simulations, since little is known about its distribution. We introduce a statistical test against complete spatial randomness (CSR). The test returns the p value to reject the null hypothesis of independence between point locations. It is more rigorous and faster than classical Monte-Carlo simulations. We show how to apply it to a tropical forest plot. The necessary R code is provided.
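For comparison, the classical Monte-Carlo CSR test that the analytical test above replaces can be sketched as follows. This uses a naive K estimator on the unit square without edge correction; the paper's own test and its accompanying R code differ.

```python
import numpy as np

rng = np.random.default_rng(5)

def ripley_k(points, r, area):
    """Naive Ripley's K at distance r (no edge correction)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-pairs
    return area * (d < r).sum() / (n * (n - 1))

def mc_csr_pvalue(points, r, side=1.0, n_sim=999):
    """Two-sided Monte-Carlo test of K(r) against CSR on a square plot."""
    area = side * side
    obs = ripley_k(points, r, area)
    sims = np.array([
        ripley_k(rng.random((len(points), 2)) * side, r, area)
        for _ in range(n_sim)
    ])
    p_hi = (1 + (sims >= obs).sum()) / (n_sim + 1)
    p_lo = (1 + (sims <= obs).sum()) / (n_sim + 1)
    return 2 * min(p_hi, p_lo)

pts = rng.random((50, 2))            # a CSR pattern for illustration
print(mc_csr_pvalue(pts, r=0.1))     # under CSR, K(0.1) is near pi * 0.01
```

The cost of the 999 pattern simulations per test is what makes the analytical alternative attractive for large plots.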

