Integration by Parts for Point Processes and Monte Carlo Estimation

2007 ◽  
Vol 44 (3) ◽  
pp. 806-823 ◽  
Author(s):  
Nicolas Privault ◽  
Xiao Wei

We develop an integration by parts technique for point processes, with application to the computation of sensitivities via Monte Carlo simulations in stochastic models with jumps. The method is applied to density estimation with respect to the Lebesgue measure via a modified kernel estimator which is less sensitive to variations of the bandwidth parameter than standard kernel estimators. This applies to random variables whose densities are not analytically known and requires the knowledge of the point process jump times.
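The paper's modified, jump-time-based estimator is not reproduced here; as background, the following sketch shows a plain Gaussian kernel density estimator and how its output shifts with the bandwidth parameter, which is exactly the sensitivity the authors set out to reduce. All function names and parameter values below are illustrative, not from the paper.

```python
import numpy as np

def kde_gaussian(x, samples, h):
    """Standard Gaussian kernel density estimate at points x with bandwidth h."""
    u = (x[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
samples = rng.standard_normal(2000)   # a known density, so the estimate can be checked
x = np.linspace(-3.0, 3.0, 61)
for h in (0.1, 0.3, 0.9):             # the estimate varies noticeably with h
    est = kde_gaussian(x, samples, h)
    print(f"h = {h}: estimate near 0 = {est[30]:.3f}")
```

For a standard normal sample the true density at 0 is about 0.399; sweeping `h` shows the standard estimator drifting with the bandwidth, which motivates estimators whose output depends less on that choice.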


2020 ◽  
Vol 19 (5) ◽  
pp. 393-405
Author(s):  
M. Lares ◽  
J. G. Funes ◽  
L. Gramajo

Abstract In this work we address the problem of estimating the probabilities of causal contacts between civilizations in the Galaxy. We make no assumptions regarding the origin and evolution of intelligent life; we simply assume a network of causally connected nodes. These nodes represent intelligent agents with the capacity to receive and emit electromagnetic signals. We present a three-parameter statistical Monte Carlo model of the network in a simplified sketch of the Galaxy. Our goal, using Monte Carlo simulations, is to explore the parameter space and analyse the probabilities of causal contacts. We find that the odds of making contact over decades of monitoring are low for most models, except for those of a galaxy densely populated with long-standing civilizations. We also find that the probability of causal contact increases more significantly with the lifetime of civilizations than with the number of active civilizations. We show that the maximum probability of making a contact occurs when a civilization discovers the required communication technology.
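The abstract's three-parameter model is not specified here, but the causal-contact idea can be sketched with a toy Monte Carlo: civilizations appear at random times and positions in a disk, stay active for a fixed lifetime, and a contact occurs when a signal emitted during one node's active window arrives while another node is still active. All parameter values and the geometry below are assumptions for illustration only.

```python
import math
import random

def contact_probability(n_civ, lifetime, radius_ly, t_max, trials=2000, seed=1):
    """Toy Monte Carlo: fraction of trials with at least one causal contact
    among n_civ civilizations in a disk of radius radius_ly (signals at 1 ly/yr).
    A contact i -> j occurs if a signal emitted during i's active window
    reaches j's position while j is still active."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        civs = []
        for _ in range(n_civ):
            r = radius_ly * math.sqrt(rng.random())   # uniform over the disk's area
            th = rng.uniform(0.0, 2.0 * math.pi)
            t0 = rng.uniform(0.0, t_max)              # awakening time
            civs.append((r * math.cos(th), r * math.sin(th), t0))
        found = False
        for i in range(n_civ):
            for j in range(n_civ):
                if i == j:
                    continue
                xi, yi, ti = civs[i]
                xj, yj, tj = civs[j]
                d = math.hypot(xi - xj, yi - yj)      # light travel time in years
                # i's signal is present at j during [ti + d, ti + lifetime + d];
                # contact if that interval overlaps j's window [tj, tj + lifetime]
                if ti + d < tj + lifetime and tj < ti + lifetime + d:
                    found = True
                    break
            if found:
                break
        if found:
            hits += 1
    return hits / trials

# longer-lived civilizations raise the contact odds much faster than the toy
# model's other knobs, echoing the abstract's lifetime-vs-number finding
print(contact_probability(10, 1_000, 5_000, 100_000))
print(contact_probability(10, 10_000, 5_000, 100_000))
```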


2010 ◽  
Vol 12 (01) ◽  
pp. 87-101
Author(s):  
OSAMA A. B. HASSAN

This article adapts the Monte Carlo method to the quantitative risk management of environmental pollution. In this context, the feasibility of stochastic models for quantitatively evaluating the risk of chemical pollution is first discussed and then linked to a case study in which Monte Carlo simulations are applied. The objective of the case study is to develop a Monte Carlo scheme for evaluating pollution in a lake environment. It is shown that the results are of interest because they define the risk margins that are important to the sustainability of the ecosystem in general, and human health in particular. Moreover, assessing environmental pollution with the help of the Monte Carlo method is feasible and serves the purpose of investigating and controlling environmental pollution over both the short and long term.
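The article's lake model is not given in the abstract, but the general Monte Carlo risk scheme it describes can be sketched as follows: draw uncertain inputs from assumed distributions, push them through a simple steady-state mass balance, and report the fraction of draws exceeding a limit. The mass-balance form, all distributions, and the threshold below are hypothetical stand-ins, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical steady-state lake mass balance: C = W / (Q + k * V)
W = rng.lognormal(mean=np.log(500.0), sigma=0.4, size=n)  # pollutant loading, kg/yr
Q = rng.uniform(4.0e6, 6.0e6, size=n)                     # outflow, m^3/yr
k = 0.2                                                   # first-order decay rate, 1/yr
V = 1.0e7                                                 # lake volume, m^3
C = W / (Q + k * V) * 1e6                                 # concentration, mg/m^3

threshold = 80.0                                          # assumed regulatory limit, mg/m^3
risk = float(np.mean(C > threshold))
print(f"P(C > {threshold} mg/m^3) ~ {risk:.3f}")
```

The exceedance probability is the "risk margin" of interest: it summarizes how often the uncertain loading and flow combine to push the lake past the limit.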


2021 ◽  
Vol 27 (1) ◽  
pp. 57-69
Author(s):  
Yasmina Ziane ◽  
Nabil Zougab ◽  
Smail Adjabi

Abstract In this paper, we consider procedures for deriving variable bandwidths in univariate kernel density estimation for nonnegative heavy-tailed (HT) data. These procedures use the Birnbaum–Saunders power-exponential (BS-PE) kernel estimator and a Bayesian approach that treats the bandwidths adaptively. We adopt an algorithm that subdivides the HT data set into two regions, a high-density region (HDR) and a low-density region (LDR), and we assign a bandwidth parameter to each region. The bandwidths are derived using a Markov chain Monte Carlo (MCMC) sampling algorithm. A series of simulation studies and real-data applications are carried out to evaluate the performance of the proposed procedure.
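Neither the BS-PE kernel nor the MCMC bandwidth sampler is reproduced here; the sketch below only illustrates the HDR/LDR idea: a pilot estimate splits the sample into high- and low-density points, and each group gets its own bandwidth. A Gaussian kernel stands in for the BS-PE kernel, and all bandwidth values are illustrative rather than MCMC-derived.

```python
import numpy as np

def two_region_kde(x, data, h_hdr, h_ldr, pilot_h=0.5, hdr_level=0.5):
    """Illustrative variable-bandwidth KDE: a pilot estimate splits the sample
    into a high-density region (HDR) and a low-density region (LDR), and each
    region's points receive their own bandwidth (Gaussian kernel as a stand-in
    for the paper's BS-PE kernel)."""
    # pilot density evaluated at each data point
    u = (data[:, None] - data[None, :]) / pilot_h
    pilot = np.exp(-0.5 * u**2).mean(axis=1) / (pilot_h * np.sqrt(2 * np.pi))
    cut = np.quantile(pilot, 1.0 - hdr_level)      # top-density points form the HDR
    h = np.where(pilot >= cut, h_hdr, h_ldr)       # per-point bandwidth
    v = (x[:, None] - data[None, :]) / h[None, :]
    kern = np.exp(-0.5 * v**2) / (h[None, :] * np.sqrt(2 * np.pi))
    return kern.mean(axis=1)

rng = np.random.default_rng(3)
data = rng.lognormal(0.0, 0.8, size=1000)          # nonnegative, heavy-tailed sample
x = np.linspace(0.01, 10.0, 200)
est = two_region_kde(x, data, h_hdr=0.15, h_ldr=0.8)
```

A symmetric Gaussian kernel leaks probability mass below zero near the boundary, which is precisely why asymmetric kernels such as the Birnbaum–Saunders family are preferred for nonnegative data.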


2009 ◽  
Vol 21 (10) ◽  
pp. 2894-2930 ◽  
Author(s):  
Yiwen Wang ◽  
António R. C. Paiva ◽  
José C. Príncipe ◽  
Justin C. Sanchez

Many decoding algorithms for brain-machine interfaces (BMIs) estimate hand movement from binned spike rates, which do not fully exploit the resolution contained in spike timing and may exclude rich neural dynamics from the modeling. More recently, an adaptive filtering method based on a Bayesian approach to reconstructing the neural state from the observed spike times has been proposed. However, it assumes and propagates a Gaussian distributed state posterior density, which in general is too restrictive. We have also proposed a sequential Monte Carlo estimation methodology to reconstruct the kinematic states directly from the multichannel spike trains. This letter presents a systematic test of this algorithm in a simulated neural spike train decoding experiment and then on BMI data. Compared to a point-process adaptive filtering algorithm with a linear observation model and a Gaussian approximation (the counterpart for point processes of the Kalman filter), our sequential Monte Carlo estimation methodology exploits a detailed encoding model (tuning function) derived for each neuron from training data. This added complexity translates into higher performance with real data. To deal with the intrinsic spike randomness in online modeling, several synthetic spike trains are generated from the intensity function estimated for each neuron and used as extra model inputs in an attempt to decrease the variance of the kinematic predictions. The sequential Monte Carlo estimation methodology augmented with this synthetic spike input provides improved reconstruction, which raises interesting questions and helps explain the overall modeling requirements better.
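The authors' multichannel decoder is not reproduced here; the sketch below shows the basic sequential Monte Carlo (bootstrap particle filter) loop the letter builds on, with a single hypothetical neuron: propagate particles through a kinematic model, weight them by the Poisson spike likelihood under an exponential tuning function, and resample. The tuning function, state dynamics, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical encoding model: one neuron, exponential tuning function
def rate(v):
    """Expected spike count per bin given kinematic state (velocity) v."""
    return np.exp(0.5 + 1.2 * v)

# Simulate a velocity trajectory and the observed spike counts
T = 200
v_true = np.zeros(T)
spikes = np.zeros(T, dtype=int)
for t in range(1, T):
    v_true[t] = 0.98 * v_true[t - 1] + 0.05 * rng.standard_normal()
    spikes[t] = rng.poisson(rate(v_true[t]))

# Bootstrap particle filter: propagate, weight by Poisson likelihood, resample
N = 500
particles = np.zeros(N)
est = np.zeros(T)
for t in range(1, T):
    particles = 0.98 * particles + 0.05 * rng.standard_normal(N)
    lam = rate(particles)
    w = np.exp(-lam) * lam ** spikes[t]             # Poisson likelihood (factorial term is constant in v)
    w /= w.sum()
    est[t] = np.dot(w, particles)                   # posterior-mean decode
    particles = rng.choice(particles, size=N, p=w)  # multinomial resampling

rmse = float(np.sqrt(np.mean((est - v_true) ** 2)))
print("decode RMSE:", rmse)
```

Because the particle cloud represents the full posterior, no Gaussian assumption is needed, which is the advantage over the point-process analogue of the Kalman filter described in the abstract.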


Author(s):  
Matthew T. Johnson ◽  
Ian M. Anderson ◽  
Jim Bentley ◽  
C. Barry Carter

Energy-dispersive X-ray spectrometry (EDS) performed at low (≤ 5 kV) accelerating voltages in the SEM has the potential to provide quantitative microanalytical information with a spatial resolution of ∼100 nm. In the present work, EDS analyses were performed on magnesium ferrite spinel [(MgxFe1−x)Fe2O4] dendrites embedded in a MgO matrix, as shown in Fig. 1. The spatial resolution of X-ray microanalysis at conventional accelerating voltages is insufficient for the quantitative analysis of these dendrites, which have widths of the order of a few hundred nanometers, without deconvolution of contributions from the MgO matrix. However, Monte Carlo simulations indicate that the interaction volume for MgFe2O4 is ∼150 nm at 3 kV accelerating voltage, which is therefore sufficient to analyze the dendrites without matrix contributions.

Single-crystal {001}-oriented MgO was reacted with hematite (Fe2O3) powder for 6 h at 1450°C in air and furnace cooled. The specimen was then cleaved to expose a clean cross-section suitable for microanalysis.
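The ∼150 nm figure in the abstract comes from full Monte Carlo electron-trajectory simulations, which are not reproduced here. As a rough cross-check, the closed-form Kanaya–Okayama range gives an interaction-depth estimate of the same order for MgFe2O4 at 3 kV; the compound-averaging scheme and the assumed density below are common approximations, not values from the paper.

```python
import math

def kanaya_okayama_range_um(E_keV, A, Z, rho):
    """Kanaya-Okayama electron range in micrometres:
    R = 0.0276 * A * E^1.67 / (Z^0.89 * rho),
    with A in g/mol, E in keV, rho in g/cm^3."""
    return 0.0276 * A * E_keV**1.67 / (Z**0.89 * rho)

# MgFe2O4: mass-fraction-weighted A and Z (a common approximation for compounds)
elements = [(24.305, 12, 1), (55.845, 26, 2), (15.999, 8, 4)]  # (A, Z, count) for Mg, Fe, O
M = sum(A * n for A, Z, n in elements)
A_eff = sum((A * n / M) * A for A, Z, n in elements)
Z_eff = sum((A * n / M) * Z for A, Z, n in elements)
rho = 4.5  # g/cm^3, approximate density of magnesium ferrite

R_nm = kanaya_okayama_range_um(3.0, A_eff, Z_eff, rho) * 1000.0
print(f"Kanaya-Okayama range at 3 kV: {R_nm:.0f} nm")
```

The closed-form estimate lands near 100 nm, consistent in order of magnitude with the ∼150 nm interaction volume from the Monte Carlo simulations and with dendrite widths of a few hundred nanometers.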


1979 ◽  
Vol 40 (C7) ◽  
pp. C7-63-C7-64
Author(s):  
A. J. Davies ◽  
J. Dutton ◽  
C. J. Evans ◽  
A. Goodings ◽  
P. K. Stewart
