simulated likelihood
Recently Published Documents

TOTAL DOCUMENTS: 70 (five years: 17)
H-INDEX: 17 (five years: 2)

2021 · Vol 0 (0)
Author(s): Maksat Jumamyradov, Murat K. Munkin

Abstract: This paper finds that the maximum simulated likelihood (MSL) estimator produces substantial biases when applied to the bivariate normal distribution. A specification of the random-parameter bivariate normal model is considered in which a direct comparison between the MSL and maximum likelihood (ML) estimators is feasible. The analysis shows that MSL produces biased results for the correlation parameter. The paper also finds that the MSL estimator is biased for the bivariate Poisson-lognormal model developed by Munkin and Trivedi (1999. "Simulated Maximum Likelihood Estimation of Multivariate Mixed-Poisson Regression Models, with Application." The Econometrics Journal 2: 29–48). A simulation study shows that MSL leads to serious inferential biases, which are especially large when the variance parameters in the true data-generating process are small. The MSL estimator also produces biases in the estimated marginal effects, conditional means, and probabilities of count outcomes.
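The simulation bias described here has a well-known mechanism: averaging draws gives an unbiased estimate of the likelihood itself, but taking the log of that average biases the log-likelihood downward (Jensen's inequality), and the bias grows as the number of draws shrinks. A minimal NumPy sketch, using a toy integrand rather than the paper's bivariate model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy likelihood contribution: L = E[exp(u)] for u ~ N(0, 1),
# which equals exp(0.5) exactly, so log L = 0.5.
true_log_L = 0.5

def mean_simulated_loglik(n_draws, n_rep=5000):
    """Average log of the simulated likelihood across many replications."""
    u = rng.standard_normal((n_rep, n_draws))
    L_hat = np.exp(u).mean(axis=1)   # unbiased for L itself...
    return float(np.log(L_hat).mean())  # ...but log(L_hat) is biased downward

bias_10 = mean_simulated_loglik(10) - true_log_L      # few draws: large bias
bias_1000 = mean_simulated_loglik(1000) - true_log_L  # many draws: small bias
```

Both biases are negative, and the small-draw bias is an order of magnitude larger, mirroring the paper's finding that MSL inference degrades when the simulated integral is noisy.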


Author(s): Peter G. Moffatt, Graciela Zevallos

Abstract: We consider a dictator game experiment in which dictators perform a sequence of giving tasks and taking tasks. The data are used to estimate the parameters of a Stone–Geary utility function over own payoff and the other's payoff. The econometric model incorporates zero observations (e.g. zero giving or zero taking) by applying the Kuhn–Tucker theorem and treating zeros as corner solutions in the dictator's constrained optimisation problem. The method of maximum simulated likelihood (MSL) is used for estimation. We find that selfishness is significantly lower in taking tasks than in giving tasks, and we attribute this difference to the "cold prickle of taking".
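The corner-solution treatment of zeros can be illustrated with a small sketch: under a log Stone–Geary utility the interior first-order condition gives a linear rule for the transfer, which is then clipped to the feasible budget set. This is an assumed functional form for illustration only, not the paper's exact specification:

```python
import numpy as np

def dictator_giving(m, beta, gamma_own, gamma_other):
    """Optimal transfer g maximising a log Stone-Geary utility
    beta*log(x - gamma_own) + (1 - beta)*log(g - gamma_other)
    subject to x + g = m, with zeros treated as corner solutions
    (the Kuhn-Tucker clipping of the interior optimum to [0, m])."""
    g_interior = beta * gamma_other + (1.0 - beta) * (m - gamma_own)
    return float(np.clip(g_interior, 0.0, m))
```

For example, an even split arises with `beta = 0.5` and zero subsistence parameters, while a sufficiently selfish dictator (`beta` near 1) hits the zero-giving corner.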


2021
Author(s): Prateek Bansal, Vahid Keshavarzzadeh, Angelo Guevara, Shanjun Li, Ricardo A. Daziano

Abstract: Maximum simulated likelihood estimation of mixed multinomial logit models requires evaluation of a multidimensional integral. Quasi-Monte Carlo (QMC) methods such as Halton sequences and modified Latin hypercube sampling are workhorse methods for integral approximation. Earlier studies explored the potential of sparse grid quadrature (SGQ), but SGQ suffers from negative weights. As an alternative to QMC and SGQ, we look into the recently developed designed quadrature (DQ) method. DQ requires fewer nodes to reach the same level of accuracy as QMC and SGQ, is as easy to implement, ensures positivity of weights, and can be created on any general polynomial space. We benchmark DQ against QMC in a Monte Carlo study and an empirical study. DQ outperforms QMC in all considered scenarios, is practice-ready, and has the potential to become the workhorse method for integral approximation.
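The Halton sequences mentioned here are deterministic low-discrepancy points; pushed through the normal inverse CDF, they replace pseudo-random draws in the mixed logit integral. A self-contained sketch with a hypothetical one-dimensional random coefficient (not any model from the paper):

```python
import math
from statistics import NormalDist

def halton(n, base=2):
    """First n points of the Halton (van der Corput) sequence in `base`."""
    points = []
    for i in range(1, n + 1):
        f, x, k = 1.0, 0.0, i
        while k > 0:
            f /= base
            x += f * (k % base)
            k //= base
        points.append(x)
    return points

# Hypothetical mixed logit choice probability with one random coefficient:
# P = E[1 / (1 + exp(-beta))], beta ~ N(0.5, 1). Map Halton points in (0, 1)
# to normal draws via the inverse CDF, then average the logit kernel.
inv_cdf = NormalDist(mu=0.5, sigma=1.0).inv_cdf
draws = [inv_cdf(h) for h in halton(1000)]
p_hat = sum(1.0 / (1.0 + math.exp(-b)) for b in draws) / len(draws)
```

Because the points fill the unit interval evenly, far fewer nodes are needed than with plain Monte Carlo for the same accuracy, which is the baseline DQ is benchmarked against.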


2021 · Vol 502 (3) · pp. 4093-4111
Author(s): Chun-Hao To, Elisabeth Krause, Eduardo Rozo, Hao-Yi Wu, Daniel Gruen, ...

ABSTRACT: We present a method of combining cluster abundances and large-scale two-point correlations, namely galaxy clustering, galaxy–cluster cross-correlations, cluster autocorrelations, and cluster lensing. This data vector yields cosmological constraints comparable to those of traditional analyses that rely on small-scale cluster lensing for mass calibration. We use cosmological survey simulations designed to resemble the Dark Energy Survey Year 1 (DES-Y1) data to validate the analytical covariance matrix and the parameter inferences. The posterior distribution from the analysis of simulations is statistically consistent with the absence of systematic biases detectable at the precision of the DES-Y1 experiment. We compare the χ² values in simulations to their expectation and find no significant difference. The robustness of our results against a variety of systematic effects is verified using a simulated likelihood analysis of DES-Y1-like data vectors. This work presents the first end-to-end validation of a cluster abundance cosmological analysis on galaxy-catalogue-level simulations.
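The χ² consistency test mentioned in the abstract rests on a simple fact: for Gaussian data vectors with a correctly modelled covariance, the expected χ² equals the length of the data vector. A toy NumPy sketch with hypothetical dimensions (nothing here is the paper's actual data vector):

```python
import numpy as np

rng = np.random.default_rng(1)

# Draw mock data vectors from a known covariance and check that the
# mean chi^2 matches its expectation, the number of data points.
dim = 20
cov = np.diag(rng.uniform(0.5, 2.0, size=dim))   # hypothetical covariance
inv_cov = np.linalg.inv(cov)
draws = rng.multivariate_normal(np.zeros(dim), cov, size=5000)
chi2 = np.einsum('ij,jk,ik->i', draws, inv_cov, draws)
mean_chi2 = float(chi2.mean())   # should be close to dim = 20
```

A significant excess of the measured χ² over this expectation would flag a mis-estimated covariance or a systematic bias, which is what the validation on simulations rules out.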


2020
Author(s): Sanghyeok Lee, Tue Gørgens

Summary In this paper, we consider estimation of dynamic models of recurrent events (event histories) in continuous time using censored data. We develop maximum simulated likelihood estimators where missing data are integrated out using Monte Carlo and importance sampling methods. We allow for random effects and integrate out this unobserved heterogeneity using a quadrature rule. In Monte Carlo experiments, we find that maximum simulated likelihood estimation is practically feasible and performs better than both listwise deletion and auxiliary modelling of initial conditions. In an empirical application, we study ischaemic heart disease events for male Maoris in New Zealand.
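Integrating out a normally distributed random effect with a quadrature rule, as this paper does, is typically done with Gauss–Hermite nodes and weights. A minimal sketch of the building block (a generic expectation, not the paper's event-history likelihood):

```python
import numpy as np

# Gauss-Hermite rule for E[f(u)] with u ~ N(0, 1): the physicists' rule
# integrates against exp(-x^2), so substitute u = sqrt(2) * x and divide
# by sqrt(pi) to recover the standard normal expectation.
nodes, weights = np.polynomial.hermite.hermgauss(10)

def gh_expectation(f):
    """Approximate E[f(u)] for standard normal u with a 10-node rule."""
    return float((weights * f(np.sqrt(2.0) * nodes)).sum() / np.sqrt(np.pi))
```

Ten nodes already integrate low-order polynomials exactly and smooth integrands such as `exp` to near machine precision, which is why quadrature over the random effect can be combined cheaply with simulation over the missing event histories.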


2020 · Vol 498 (3) · pp. 4060-4087
Author(s): M Gatti, C Chang, O Friedrich, B Jain, D Bacon, ...

ABSTRACT: We present a simulated cosmology analysis using the second and third moments of the weak lensing mass (convergence) maps. The second moment, or variance, of the convergence as a function of smoothing scale contains information similar to standard shear two-point statistics. The third moment, or skewness, contains additional non-Gaussian information. The analysis is geared towards the third year (Y3) data from the Dark Energy Survey (DES), but the methodology can be applied to other weak lensing data sets. We present the formalism for obtaining the convergence maps from the measured shear and for obtaining the second and third moments of these maps given partial sky coverage. We estimate the covariance matrix from a large suite of numerical simulations. We test our pipeline through a simulated likelihood analysis varying 5 cosmological parameters and 10 nuisance parameters, and identify the scales where systematic or modelling uncertainties are not expected to affect the cosmological analysis. Our simulated likelihood analysis shows that the combination of second and third moments provides a 1.5 per cent constraint on S8 ≡ σ8(Ωm/0.3)^0.5 for DES Year 3 data. This is 20 per cent better than an analysis using simulated DES Y3 shear two-point statistics, owing to the non-Gaussian information captured by the inclusion of higher-order statistics. This paper validates our methodology for constraining cosmology with DES Year 3 data, which will be presented in a subsequent paper.
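The statistics in question are central moments of a smoothed map: the second moment tracks the variance at each smoothing scale, and the third picks up any skewness. A toy sketch on a Gaussian random map, with crude block averaging standing in for the paper's smoothing kernels:

```python
import numpy as np

def block_smooth(kappa, scale):
    """Top-hat smoothing by block averaging; `scale` is the block side in
    pixels (a crude stand-in for the paper's smoothing filters)."""
    n = (kappa.shape[0] // scale) * scale
    blocks = kappa[:n, :n].reshape(n // scale, scale, n // scale, scale)
    return blocks.mean(axis=(1, 3))

def second_third_moments(kappa):
    """Second and third central moments of a convergence-like map."""
    d = kappa - kappa.mean()
    return float((d**2).mean()), float((d**3).mean())

rng = np.random.default_rng(3)
gaussian_map = rng.standard_normal((256, 256))   # hypothetical mock map
m2, m3 = second_third_moments(gaussian_map)                  # raw map
m2_s, m3_s = second_third_moments(block_smooth(gaussian_map, 4))  # smoothed
```

On a purely Gaussian map the third moment is consistent with zero, while smoothing suppresses the variance; a nonzero third moment at some scale is precisely the non-Gaussian signal the paper adds on top of two-point statistics.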


2020 · Vol 23 (05) · pp. 2050029
Author(s): Markus Michaelsen

In response to empirical evidence, we propose a continuous-time model for multivariate asset returns with a two-layered dependence structure. The price process is subject to multivariate information arrivals driving the market activity, modelled by nondecreasing pure-jump Lévy processes. A Lévy copula determines the jump dependence and allows for a generic multivariate information flow with a flexible structure. Conditional on the information flow, asset returns are jointly normal. Within this setup, we provide an estimation framework based on maximum simulated likelihood. We apply the novel multivariate models to equity data and obtain estimates that accord with economic intuition about the two-layered dependence structure.
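The "conditionally normal given the information flow" construction can be sketched in one dimension: a random, nonnegative activity variable scales the variance of an otherwise Gaussian return, and mixing over that activity fattens the tails. The gamma-distributed activity below is an assumed toy choice, not the paper's Lévy specification:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy one-asset version: information arrivals A are nonnegative and random
# (here gamma-distributed, a hypothetical stand-in for the pure-jump flow),
# and the return is N(0, A) conditional on A.
n = 100_000
activity = rng.gamma(shape=1.0, scale=1.0, size=n)
returns = np.sqrt(activity) * rng.standard_normal(n)

# Mixing over random activity produces excess kurtosis: the unconditional
# distribution is fat-tailed even though returns are conditionally normal.
d = returns - returns.mean()
kurtosis = float((d**4).mean() / (d**2).mean()**2)
```

A Gaussian would give a kurtosis of 3; the mixture's kurtosis is well above that, which is the kind of stylised fact the subordinated model is built to match.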

