Assessing the quadratic approximation to the log likelihood function in nonnormal linear models

Biometrika ◽  
1983 ◽  
Vol 70 (2) ◽  
pp. 367-372 ◽  
Author(s):  
SALOMON MINKIN


2021 ◽
Vol 2 (1) ◽  
Author(s):  
Anthony M. Orlando ◽  
Rahul Dhanda

It is interesting to note that the expected value of the log likelihood function is an entropy. This note shows that there is an exact relationship between the mixture log likelihood function (ln LM) and the sum of the mixing distribution entropy (HM) and the mixture density entropy (HD). The quantity ln LM is expressed exactly as a function of four Shannon entropies, each a unique measure of uncertainty. This method, known as mixtures of linear models (MLM), is a form of empirical Bayes which uses a non-informative uniform prior and generates both confidence intervals and p-values that clinicians and regulatory agencies can use to evaluate scientific evidence. An example based on allergic rhinitis symptom scores is given, showing how easy it is to assess the fit of the model and evaluate the results of the trial.
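The entropy identity this abstract builds on can be illustrated numerically: for samples drawn from a density p, the average log-likelihood estimates E[log p(X)], which equals the negative differential entropy -H(p). The following is a minimal Python sketch of that identity for a single normal density (scipy assumed); it is illustrative only and does not reproduce the paper's mixture-model decomposition.

```python
import numpy as np
from scipy.stats import norm

# Identity behind the note: E[log p(X)] = -H(p) when X ~ p,
# where H is the (differential) Shannon entropy.
rng = np.random.default_rng(0)
p = norm(loc=2.0, scale=1.5)

x = p.rvs(size=200_000, random_state=rng)
avg_loglik = p.logpdf(x).mean()   # Monte Carlo estimate of E[log p(X)]

print(avg_loglik)                 # close to the negative entropy below
print(-float(p.entropy()))        # closed-form -H for N(2, 1.5^2)
```

With 200,000 draws the Monte Carlo average matches the closed-form negative entropy to roughly two decimal places.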


Psych ◽  
2021 ◽  
Vol 3 (2) ◽  
pp. 197-232
Author(s):  
Yves Rosseel

This paper discusses maximum likelihood estimation for two-level structural equation models when data are missing at random at both levels. Building on existing literature, a computationally efficient expression is derived to evaluate the observed log-likelihood. Unlike previous work, the expression is valid for the special case where the model implied variance–covariance matrix at the between level is singular. Next, the log-likelihood function is translated to R code. A sequence of R scripts is presented, starting from a naive implementation and ending at the final implementation as found in the lavaan package. Along the way, various computational tips and tricks are given.
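The casewise observed log-likelihood described above can be sketched in a few lines. The code below is a hypothetical single-level illustration using numpy/scipy; it is not the lavaan implementation, omits the between-level structure, and skips the pattern-wise grouping that makes the real evaluation efficient. Each row contributes the normal log-density of its observed entries only, using the matching sub-vector of mu and sub-matrix of Sigma.

```python
import numpy as np
from scipy.stats import multivariate_normal

def observed_loglik(Y, mu, Sigma):
    """Casewise observed-data log-likelihood under MVN(mu, Sigma):
    each row is evaluated on its observed coordinates only."""
    total = 0.0
    for row in Y:
        obs = ~np.isnan(row)               # missingness pattern of this case
        if not obs.any():
            continue                       # fully missing row adds nothing
        total += multivariate_normal.logpdf(
            row[obs], mean=mu[obs], cov=Sigma[np.ix_(obs, obs)]
        )
    return total

rng = np.random.default_rng(1)
mu = np.zeros(3)
Sigma = np.array([[1.0, 0.4, 0.2],
                  [0.4, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])
Y = rng.multivariate_normal(mu, Sigma, size=50)
Y[rng.random(Y.shape) < 0.2] = np.nan      # make ~20% of entries missing
print(observed_loglik(Y, mu, Sigma))
```

A naive loop like this factors a sub-matrix of Sigma once per case; grouping cases by missingness pattern so each sub-matrix is factored only once per pattern is in the spirit of the computational savings the paper derives.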


2004 ◽  
Vol 28 (1) ◽  
pp. 77-94 ◽  
Author(s):  
Yawpo Yang ◽  
Jen-Ning Chang ◽  
Ji-Chyun Liu ◽  
Ching-Hwa Liu

Author(s):  
Muhamad Alias Md. Jedi ◽  
Robiah Adnan

TCLUST is a statistical clustering method based on a modification of the trimmed k-means clustering algorithm. It is called a "crisp" clustering approach because each observation is either eliminated or assigned to exactly one group. TCLUST strengthens the group assignment by placing constraints on the cluster scatter matrices; the emphasis in this paper is on restricting the eigenvalues λ of the scatter matrix. The constraints are imposed while maximizing the log-likelihood function of the spurious-outlier model. A review of different robust clustering approaches is presented as a comparison to the TCLUST method. This paper discusses the nature of the TCLUST algorithm, how to determine the number of clusters or groups properly, and how to measure the strength of a group assignment. Finally, the R package for TCLUST implements these types of scatter restriction, making the algorithm more flexible in the choice of the number of clusters and the trimming proportion.
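The trimming idea underlying TCLUST can be sketched as follows. This is a hypothetical, minimal trimmed k-means in Python (spherical clusters, no eigenvalue-ratio constraint on the scatter matrices), not the tclust R package; TCLUST proper generalises this scheme with per-cluster scatter matrices whose eigenvalues are constrained.

```python
import numpy as np

def trimmed_kmeans(X, k, alpha=0.1, n_iter=50, seed=0):
    """Minimal trimmed k-means: at each step the ceil(alpha*n) points
    farthest from their nearest centre are discarded ("trimmed") before
    the centres are updated. Assignment is crisp: a point is either
    trimmed or belongs to exactly one cluster."""
    rng = np.random.default_rng(seed)
    n = len(X)
    centres = X[rng.choice(n, size=k, replace=False)]
    keep_n = n - int(np.ceil(alpha * n))
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(1)                    # crisp assignment
        nearest = d2.min(1)
        kept = np.argsort(nearest)[:keep_n]      # trim the alpha worst fits
        for j in range(k):
            members = kept[labels[kept] == j]
            if len(members):
                centres[j] = X[members].mean(0)
    trimmed_labels = np.full(n, -1)              # -1 marks trimmed points
    trimmed_labels[kept] = labels[kept]
    return centres, trimmed_labels

# Two well-separated clusters plus a few gross outliers to be trimmed.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (40, 2)),
               rng.normal(5, 0.3, (40, 2)),
               rng.uniform(-20, 20, (8, 2))])
centres, labels = trimmed_kmeans(X, k=2, alpha=0.1)
```

With alpha = 0.1 and 88 points, exactly 9 observations are eliminated each pass, which is how the method buys robustness to the spurious outliers.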


Author(s):  
Tim Loossens ◽  
Kristof Meers ◽  
Niels Vanhasbroeck ◽  
Nil Anarat ◽  
Stijn Verdonck ◽  
...  

Computational modeling plays an important role in a gamut of research fields. In affect research, continuous-time stochastic models are becoming increasingly popular. Recently, a non-linear, continuous-time stochastic model called the Affective Ising Model (AIM) has been introduced for affect dynamics. The drawback of non-linear models like the AIM is that they generally come with serious computational challenges for parameter estimation and related statistical analyses. The likelihood function of the AIM does not have a closed-form expression; consequently, simulation-based or numerical methods have to be used to evaluate it. Additionally, the likelihood function can have multiple local optima, so a global optimization heuristic is required, and such heuristics generally require a large number of likelihood function evaluations. In this paper, a Julia software package is introduced that is dedicated to fitting the AIM. The package includes an implementation of a numerical algorithm for fast computation of the likelihood function, which can run on both graphics processing units (GPUs) and central processing units (CPUs). The numerical method introduced in this paper is compared to the more traditional Euler-Maruyama method for solving stochastic differential equations. Furthermore, the estimation software is tested by means of a recovery study, and estimation times are reported for benchmarks run on several computing devices (two different GPUs and three different CPUs). According to these results, a single parameter estimation can be obtained in less than thirty seconds using a mainstream NVIDIA GPU.
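The Euler-Maruyama baseline mentioned above can be sketched generically. The Python example below simulates a linear Ornstein-Uhlenbeck process rather than the AIM itself (whose drift is non-linear), so both the model and the parameter values are illustrative assumptions, not anything from the paper.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, T, n_steps, seed=0):
    """Simulate dX = drift(X) dt + diffusion(X) dW on [0, T] with the
    Euler-Maruyama scheme: X_{t+dt} = X_t + drift(X_t)*dt
    + diffusion(X_t)*sqrt(dt)*Z, Z ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        z = rng.standard_normal()
        x[i + 1] = x[i] + drift(x[i]) * dt + diffusion(x[i]) * np.sqrt(dt) * z
    return x

# Ornstein-Uhlenbeck process dX = theta*(mu - X) dt + sigma dW,
# a linear stand-in for the non-linear AIM dynamics.
theta, mu, sigma = 1.5, 0.0, 0.5
path = euler_maruyama(lambda x: theta * (mu - x),
                      lambda x: sigma,
                      x0=3.0, T=10.0, n_steps=10_000)
```

Simulation-based likelihood evaluation for such models typically requires many repetitions of exactly this kind of path simulation, which is why the paper's fast GPU/CPU numeric alternative matters for global optimization.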


1998 ◽  
Vol 70 (1) ◽  
pp. 61-71 ◽  
Author(s):  
Yawpo Yang ◽  
Ching-Hwa Liu ◽  
Ta-Wei Soong

2011 ◽  
Vol 19 (3) ◽  
pp. 657-663
Author(s):  
聂宏宾 NIE Hong-bin ◽  
侯晴宇 HOU Qing-yu ◽  
赵明 ZHAO Ming ◽  
张伟 ZHANG Wei
