A GLOBAL MAXIMUM LIKELIHOOD SUPER-QUARTET PHYLOGENY METHOD

Author(s):  
P. WANG ◽  
B. B. ZHOU ◽  
M. TARAWNEH ◽  
D. CHU ◽  
C. WANG ◽  
...  
PLoS ONE ◽  
2021 ◽  
Vol 16 (11) ◽  
pp. e0259111
Author(s):  
Frank Kwasniok

A comprehensive methodology for semiparametric probability density estimation is introduced and explored. The probability density is modelled by sequences of mostly regular or steep exponential families generated by flexible sets of basis functions, possibly including boundary terms. Parameters are estimated by global maximum likelihood without any roughness penalty. A statistically orthogonal formulation of the inference problem and a numerically stable and fast convex optimization algorithm for its solution are presented. Automatic model selection over the type and number of basis functions is performed with the Bayesian information criterion. The methodology can naturally be applied to densities supported on bounded, infinite or semi-infinite domains without boundary bias. Relationships to the truncated moment problem and the moment-constrained maximum entropy principle are discussed and a new theorem on the existence of solutions is contributed. The new technique compares very favourably to kernel density estimation, the diffusion estimator, finite mixture models and local likelihood density estimation across a diverse range of simulation and observation data sets. The semiparametric estimator combines a very small mean integrated squared error with a high degree of smoothness which allows for a robust and reliable detection of the modality of the probability density in terms of the number of modes and bumps.
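The core of the method, concave maximum likelihood over an exponential family with moment matching as the optimality condition, can be sketched in a few lines. The sketch below is a hypothetical minimal illustration on a fixed polynomial basis over [0, 1]; the paper's flexible basis sets, boundary terms and BIC model selection are omitted, and plain gradient ascent stands in for the authors' convex optimization algorithm.

```python
import math
import random

def fit_expfam_density(data, degree=2, grid_n=200, steps=4000, lr=3.0):
    """Maximum likelihood fit of p(x) = exp(a1*x + ... + ad*x^d) / Z on [0, 1].

    The log-likelihood is concave in the coefficients a, so plain gradient
    ascent reaches the global maximum; no roughness penalty is involved.
    """
    n = len(data)
    xs = [(i + 0.5) / grid_n for i in range(grid_n)]  # midpoint quadrature grid
    h = 1.0 / grid_n

    def basis(x):
        return [x ** (k + 1) for k in range(degree)]

    grid_b = [basis(x) for x in xs]
    # empirical means of the basis functions: the sufficient statistics
    emp = [sum(basis(x)[k] for x in data) / n for k in range(degree)]
    a = [0.0] * degree
    for _ in range(steps):
        w = [math.exp(sum(ak * bk for ak, bk in zip(a, b))) for b in grid_b]
        z = h * sum(w)  # normalising constant Z(a)
        # model means of the basis functions under the current fit
        mod = [h * sum(wi * b[k] for wi, b in zip(w, grid_b)) / z
               for k in range(degree)]
        for k in range(degree):
            a[k] += lr * (emp[k] - mod[k])  # gradient of log-likelihood / n

    def density(x):
        return math.exp(sum(ak * bk for ak, bk in zip(a, basis(x)))) / z
    return density

random.seed(1)
sample = [random.betavariate(2.0, 5.0) for _ in range(500)]
p = fit_expfam_density(sample)
```

At the maximum the model's basis-function means equal the empirical ones, which is exactly the moment-matching condition discussed in connection with the truncated moment problem and maximum entropy.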


2019 ◽  
Vol 29 (4) ◽  
pp. 1197-1211
Author(s):  
Brian H Willis ◽  
Mohammed Baragilly ◽  
Dyuti Coomar

A bivariate generalised linear mixed model is often used for meta-analysis of test accuracy studies. The model is complex and requires five parameters to be estimated. As there is no closed form for the likelihood function for the model, maximum likelihood estimates for the parameters have to be obtained numerically. Although generic functions have emerged which may estimate the parameters in these models, they remain opaque to many. From first principles we demonstrate how the maximum likelihood estimates for the parameters may be obtained using two methods based on Newton–Raphson iteration. The first uses the profile likelihood and the second uses the Observed Fisher Information. As convergence may depend on the proximity of the initial estimates to the global maximum, each algorithm includes a method for obtaining robust initial estimates. A simulation study was used to evaluate the algorithms and compare their performance with the generic generalised linear mixed model function glmer from the lme4 package in R before applying them to two meta-analyses from the literature. In general, the two algorithms had higher convergence rates and coverage probabilities than glmer. Based on its performance characteristics the method of profiling is recommended for fitting the bivariate generalised linear mixed model for meta-analysis.
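The two ingredients, a Newton–Raphson update and the observed Fisher information, are easiest to see in one dimension. The following hypothetical sketch applies them to a single proportion on the logit scale; the bivariate model uses the same update with a five-parameter score vector and information matrix.

```python
import math

def mle_logit_newton(k, n, theta=0.0, tol=1e-10, max_iter=50):
    """Newton-Raphson MLE of theta = logit(p) from k successes in n trials.

    Update: theta <- theta + score / information, where the observed Fisher
    information is minus the second derivative of the log-likelihood.
    """
    for _ in range(max_iter):
        p = 1.0 / (1.0 + math.exp(-theta))
        score = k - n * p            # d/dtheta of k*log(p) + (n-k)*log(1-p)
        info = n * p * (1.0 - p)     # observed Fisher information
        step = score / info
        theta += step
        if abs(step) < tol:
            break
    return theta
```

The inverse of the observed information at convergence also supplies the variance estimate for theta, which is how the second of the paper's two methods obtains standard errors.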


Entropy ◽  
2020 ◽  
Vol 22 (11) ◽  
pp. 1329
Author(s):  
Chanseok Park ◽  
Min Wang ◽  
Refah Mohammed Alotaibi ◽  
Hoda Rezk

A load-sharing system is defined as a parallel system whose load is redistributed to its surviving components as each component in the system fails. Our focus is on statistical inference for the parameters of the lifetime distribution of each component in the system. In this paper, we introduce a methodology that integrates the conventional procedure with the assumption that the load-sharing system is made up of fundamental hypothetical latent random variables. We then develop an expectation maximization algorithm for performing the maximum likelihood estimation of the system with Lindley-distributed component lifetimes. We adopt several standard simulation techniques to compare the performance of the proposed methodology with the Newton–Raphson-type algorithm for the maximum likelihood estimate of the parameter. Numerical results indicate that the proposed method is more effective by consistently reaching a global maximum.
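For the Newton–Raphson-type baseline on a plain (non-load-sharing) Lindley sample, the log-likelihood is globally concave and the score equation 2n/θ − n/(θ+1) − Σxᵢ = 0 even has a closed-form root, which makes a convenient correctness check. A hypothetical sketch (the paper's EM algorithm for the full load-sharing system is not reproduced here):

```python
import math

def lindley_mle_newton(x, theta=1.0, tol=1e-12, max_iter=100):
    """Newton-Raphson MLE of the Lindley rate theta from i.i.d. lifetimes x.

    Lindley density: f(t) = theta^2 / (theta + 1) * (1 + t) * exp(-theta * t).
    """
    n, s = len(x), sum(x)
    for _ in range(max_iter):
        score = 2 * n / theta - n / (theta + 1) - s
        hess = -2 * n / theta ** 2 + n / (theta + 1) ** 2  # always negative
        step = score / hess
        theta -= step
        if abs(step) < tol:
            break
    return theta

def lindley_mle_closed_form(x):
    """Positive root of the quadratic form of the same score equation."""
    m = sum(x) / len(x)
    return (-(m - 1) + math.sqrt((m - 1) ** 2 + 8 * m)) / (2 * m)
```

Because the second derivative is negative for every θ > 0, any stationary point Newton–Raphson reaches here is the global maximum; in the load-sharing setting that guarantee is lost, which is the motivation for the EM approach above.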


2010 ◽  
Vol 23 (7) ◽  
pp. 917-925 ◽  
Author(s):  
Cristiano Cervellera ◽  
Danilo Macciò ◽  
Marco Muselli

1991 ◽  
Vol 127 ◽  
pp. 335-338
Author(s):  
Kavan U. Ratnatunga ◽  
Wayne H. Warren

A model for the kinematic distribution function of our Galaxy can be used as an independent confirmation that a reference system is free of Earth motions and retains the true kinematics of the stars. Maximum likelihood can simultaneously estimate the parameters required to calibrate distances to the stars, represent the kinematic distribution function, and check on residual Earth rotations in the proper-motion system. The global maximum-likelihood analysis uses all available information: photometry, trigonometric parallax, proper motion, and line-of-sight velocity for a well-defined catalog of stars. Awaiting observations from HIPPARCOS, preliminary testing of the algorithm on available ground-based observations is discussed.


2019 ◽  
Vol 29 (11n12) ◽  
pp. 1835-1850
Author(s):  
Joaquim Assunção ◽  
Paulo Fernandes ◽  
Jean-Marc Vincent

We propose a simple, fast, deterministic pre-fitting approach which derives the Baum–Welch algorithm's initial values directly from the input data. The purpose of the pre-fitting is to improve the fitting time for a given Hidden Markov Model (HMM) while keeping the original Baum–Welch algorithm as the fitting procedure. The fitting time improves because parameters generated closer to the global maximum likelihood avoid the Baum–Welch algorithm's sensitivity to its initial values. Furthermore, by keeping the original Baum–Welch algorithm as the fitting procedure, we guarantee that all related methods continue to work properly. The pre-fitting generates the HMM parameters directly from time-series data, without any data transformation, using an [Formula: see text] operation.
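The paper's exact construction is not reproduced here, but the idea of deterministic, data-derived initial values can be sketched roughly: quantile-bin the series into K provisional states, then read an initial distribution, transition matrix and emission means straight off the binned sequence. All names below are hypothetical.

```python
def prefit_hmm(series, k):
    """Derive deterministic HMM initial values directly from a time series.

    States are provisional quantile bins; the initial distribution, transition
    matrix and emission means are empirical counts/averages over the binned
    sequence (one sort plus a single linear pass).
    """
    n = len(series)
    ranked = sorted(series)
    # quantile cut points splitting the data into k equally filled bins
    cuts = [ranked[(i * n) // k] for i in range(1, k)]

    def state(x):
        return sum(x >= c for c in cuts)

    labels = [state(x) for x in series]
    # empirical transition counts with add-one smoothing, row-normalised
    trans = [[1.0] * k for _ in range(k)]
    for a, b in zip(labels, labels[1:]):
        trans[a][b] += 1.0
    for row in trans:
        tot = sum(row)
        for j in range(k):
            row[j] /= tot
    # per-state emission means
    means, counts = [0.0] * k, [0] * k
    for x, s in zip(series, labels):
        means[s] += x
        counts[s] += 1
    means = [m / c if c else 0.0 for m, c in zip(means, counts)]
    init = [labels.count(s) / n for s in range(k)]
    return init, trans, means
```

Because every step is deterministic, the same series always yields the same starting point, so repeated Baum–Welch runs become reproducible rather than dependent on random restarts.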


2004 ◽  
Vol 16 (12) ◽  
pp. 2533-2561 ◽  
Author(s):  
Liam Paninski ◽  
Jonathan W. Pillow ◽  
Eero P. Simoncelli

We examine a cascade encoding model for neural response in which a linear filtering stage is followed by a noisy, leaky, integrate-and-fire spike generation mechanism. This model provides a biophysically more realistic alternative to models based on Poisson (memoryless) spike generation, and can effectively reproduce a variety of spiking behaviors seen in vivo. We describe the maximum likelihood estimator for the model parameters, given only extracellular spike train responses (not intracellular voltage data). Specifically, we prove that the log-likelihood function is concave and thus has an essentially unique global maximum that can be found using gradient ascent techniques. We develop an efficient algorithm for computing the maximum likelihood solution, demonstrate the effectiveness of the resulting estimator with numerical simulations, and discuss a method of testing the model's validity using time-rescaling and density evolution techniques.
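The leaky integrate-and-fire likelihood itself requires density-evolution machinery, but the central recipe, proving the log-likelihood concave and then running gradient ascent to the global maximum, can be demonstrated on the simpler Poisson (LNP) cascade with an exponential nonlinearity, whose log-likelihood is likewise concave. A hypothetical stand-in sketch, not the paper's estimator:

```python
import math
import random

def poisson_sample(lam):
    """Knuth's simple Poisson sampler (adequate for moderate rates)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def fit_lnp_filter(stimuli, counts, dim, steps=500, lr=0.05):
    """Gradient ascent on the concave Poisson (LNP) log-likelihood.

    Model: spike count y_t ~ Poisson(exp(w . x_t)).  The log-likelihood
    sum_t [y_t * (w . x_t) - exp(w . x_t)] is concave in w, so gradient
    ascent converges to its essentially unique global maximum.
    """
    w = [0.0] * dim
    for _ in range(steps):
        grad = [0.0] * dim
        for x, y in zip(stimuli, counts):
            rate = math.exp(sum(wi * xi for wi, xi in zip(w, x)))
            for i in range(dim):
                grad[i] += (y - rate) * x[i]  # d log-likelihood / d w_i
        for i in range(dim):
            w[i] += lr * grad[i] / len(stimuli)
    return w

# recover a known 2-tap filter from simulated spike counts
random.seed(7)
true_w = [1.0, -0.5]
stim = [[random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)] for _ in range(2000)]
spikes = [poisson_sample(math.exp(s[0] * true_w[0] + s[1] * true_w[1]))
          for s in stim]
w_hat = fit_lnp_filter(stim, spikes, 2)
```

The same logic carries over to the paper's setting: once concavity of the spike-train log-likelihood is established, any ascent method that follows the gradient recovers the filter without risk of spurious local maxima.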


2018 ◽  
Author(s):  
Michael D. Ward ◽  
John S. Ahlquist
