Detecting Lineage-Specific Shifts in Diversification: A Proper Likelihood Approach

Author(s):  
Giovanni Laudanno ◽  
Bart Haegeman ◽  
Daniel L Rabosky ◽  
Rampal S Etienne

Abstract The branching patterns of molecular phylogenies are generally assumed to contain information on rates of the underlying speciation and extinction processes. Simple birth–death models with constant, time-varying, or diversity-dependent rates have been invoked to explain these patterns. They have one assumption in common: all lineages have the same set of diversification rates at a given point in time. It seems likely, however, that there is variability in diversification rates across subclades in a phylogenetic tree. This has inspired the construction of models that allow multiple rate regimes across the phylogeny, with instantaneous shifts between these regimes. Several methods exist for calculating the likelihood of a phylogeny under a specified mapping of diversification regimes and for performing inference on the most likely diversification history that gave rise to a particular phylogenetic tree. Here, we show that the likelihood computation of these methods is not correct. We provide a new framework to compute the likelihood correctly and show, with simulations of a single shift, that the correct likelihood indeed leads to parameter estimates that are on average in much better agreement with the generating parameters than the incorrect likelihood. Moreover, we show that our corrected likelihood can be extended to multiple rate shifts in time-dependent and diversity-dependent models. We argue that identifying shifts in diversification rates is a nontrivial model selection exercise where one has to choose whether shifts in now-extinct lineages are taken into account or not. Hence, our framework also resolves the recent debate on such unobserved shifts. [Diversification; macroevolution; phylogeny; speciation]
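
To make the kind of process described above concrete, the sketch below simulates lineage counts under a birth–death process whose speciation rate shifts at a fixed time point. It is a generic Gillespie-style illustration with made-up rates, not the likelihood framework of the paper, and it tracks only the number of lineages rather than the full tree topology.

```python
# Minimal Gillespie-style simulation of a birth-death process in which the
# speciation rate shifts at a fixed time point. Illustrative only: it tracks
# lineage counts, not tree shape, and all parameter values are arbitrary.
# The rate change at t_shift is handled approximately: each waiting time is
# drawn with the rate in force at the current time.
import numpy as np

def simulate_birth_death(lambda_before, lambda_after, mu, t_shift, t_max, rng):
    """Return (times, counts) for one realisation starting from 2 lineages."""
    t, n = 0.0, 2
    times, counts = [t], [n]
    while t < t_max and n > 0:
        lam = lambda_before if t < t_shift else lambda_after
        total_rate = n * (lam + mu)
        t += rng.exponential(1.0 / total_rate)
        if t >= t_max:
            break
        # Speciation with probability lam / (lam + mu), otherwise extinction.
        n += 1 if rng.random() < lam / (lam + mu) else -1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

rng = np.random.default_rng(1)
times, counts = simulate_birth_death(0.3, 0.8, 0.1, t_shift=5.0, t_max=10.0, rng=rng)
print(counts[-1], "lineages at the end of the simulation")
```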

BMC Medicine ◽  
2021 ◽  
Vol 19 (1) ◽  
Author(s):  
Sahamoddin Khailaie ◽  
Tanmay Mitra ◽  
Arnab Bandyopadhyay ◽  
Marta Schips ◽  
Pietro Mascheroni ◽  
...  

Abstract
Background: SARS-CoV-2 has caused a worldwide pandemic and prompted non-pharmaceutical interventions (NPIs) to control the spread of the virus. As in many countries, the SARS-CoV-2 pandemic in Germany has led to a consecutive roll-out of different NPIs. As these NPIs have (largely unknown) adverse effects, targeting them precisely and monitoring their effectiveness are essential. We developed a compartmental infection dynamics model with specific features of SARS-CoV-2 that allows daily estimation of a time-varying reproduction number, and we have published this information openly since the beginning of April 2020. Here, we present the transmission dynamics in Germany over time to understand the effect of NPIs and to allow adaptive forecasts of the epidemic progression.
Methods: We used our model for a data-driven estimation of the evolution of the reproduction number for viral spread in Germany as well as in all of its federal states. Using parameter estimates from the literature and, alternatively, parameters derived from a fit to the initial phase of COVID-19 spread in different regions of Italy, the model was optimized to fit data from the Robert Koch Institute.
Results: The time-varying reproduction number (Rt) in Germany decreased to <1 in early April 2020, 2–3 weeks after the implementation of NPIs. Partial release of NPIs, both nationally and at the federal state level, correlated with moderate increases in Rt until August 2020. Implications of state-specific Rt for other states and for the national level are characterized. Retrospective evaluation of the model shows excellent agreement with the data and usage of inpatient facilities well within the healthcare limit. While short-term predictions may work for a few weeks, long-term projections are complicated by unpredictable structural changes.
Conclusions: The estimated fraction of the population immunized by August 2020 warns of a renewed outbreak upon release of measures. A low detection rate prolongs the delay in reaching a low case incidence upon release, showing the importance of an effective testing-and-quarantine strategy. We show that real-time monitoring of transmission dynamics is important for evaluating the extent of the outbreak, for short-term projections of the burden on the healthcare system, and for assessing the response to policy changes.
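
As a purely illustrative companion to the abstract, the sketch below runs a generic SEIR model with a time-varying transmission rate and tracks the corresponding effective reproduction number Rt = β(t)/γ · S/N. The model structure, the shape of β(t), and all parameter values are assumptions for illustration; this is not the authors' SARS-CoV-2-specific compartmental model.

```python
# Generic SEIR sketch with a time-varying transmission rate, not the authors'
# model. It illustrates how an effective reproduction number
# R_t = beta(t)/gamma * S/N can be tracked alongside the epidemic curve.
# Parameter values and the shape of beta(t) are arbitrary assumptions.
import numpy as np

def beta_t(t):
    # Transmission rate drops after NPIs are introduced on day 30 (assumed).
    return 0.6 if t < 30 else 0.15

def simulate_seir(N=83e6, E0=200, I0=100, sigma=1/5.5, gamma=1/7, days=180, dt=0.25):
    S, E, I, R = N - E0 - I0, E0, I0, 0.0
    t_grid, Rt = [], []
    for step in range(int(days / dt)):
        t = step * dt
        b = beta_t(t)
        new_inf = b * S * I / N * dt
        S, E, I, R = (S - new_inf,
                      E + new_inf - sigma * E * dt,
                      I + sigma * E * dt - gamma * I * dt,
                      R + gamma * I * dt)
        t_grid.append(t)
        Rt.append(b / gamma * S / N)   # effective reproduction number
    return np.array(t_grid), np.array(Rt)

t_grid, Rt = simulate_seir()
print("R_t on day 60: %.2f" % Rt[np.searchsorted(t_grid, 60)])
```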


Complexity ◽  
2018 ◽  
Vol 2018 ◽  
pp. 1-10 ◽  
Author(s):  
Min Zheng ◽  
Tangqing Yuan ◽  
Tao Huang

To guarantee the passivity of a class of conservative systems, this paper proposes a port-Hamiltonian framework combined with a new energy tank. A time-varying impedance controller is designed on the basis of this framework. The time-varying impedance control method extends conventional impedance control and overcomes the singularity problem present in the traditional form of the energy tank. The validity of the proposed controller is demonstrated by numerical examples. The simulation results show that the controller not only eliminates the singularity problem but also improves control performance.
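
The following one-degree-of-freedom sketch illustrates the general idea behind tank-based time-varying impedance control: the stiffness is allowed to vary only while an energy tank can supply the power the variation would inject, so passivity is preserved. It is a conceptual toy, not the port-Hamiltonian formulation or the new tank proposed in the paper; the gating rule, gains, and energy bookkeeping are assumptions.

```python
# Conceptual 1-DOF sketch of a time-varying impedance controller backed by an
# energy tank. The stiffness variation is enabled only while the tank holds
# enough energy to cover the power the variation would inject into the plant.
# Gains, dynamics, and the gating rule are illustrative assumptions.
import numpy as np

dt, m = 1e-3, 1.0                # time step [s] and mass [kg]
x, v = 0.1, 0.0                  # initial position error [m] and velocity [m/s]
k0, d = 100.0, 5.0               # nominal stiffness [N/m] and damping [Ns/m]
tank, tank_min = 1.0, 0.05       # tank energy [J] and lower bound

for step in range(5000):
    t = step * dt
    k_des = k0 + 80.0 * np.sin(2 * np.pi * 0.5 * t)   # desired time-varying stiffness
    dk = k_des - k0
    p_var = -dk * x * v          # power the variable part would deliver to the mass
    if tank <= tank_min and p_var > 0:
        dk, p_var = 0.0, 0.0     # tank empty: fall back to the passive nominal stiffness
    F = -(k0 + dk) * x - d * v
    tank += (d * v * v - p_var) * dt   # damping refills the tank, the variation drains it
    v += F / m * dt              # semi-implicit Euler integration of the mass dynamics
    x += v * dt

print(f"final position error: {x:.4f} m, tank energy: {tank:.3f} J")
```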


2018 ◽  
Vol 38 (8) ◽  
pp. 904-916 ◽  
Author(s):  
Aasthaa Bansal ◽  
Patrick J. Heagerty

Many medical decisions involve the use of dynamic information collected on individual patients toward predicting likely transitions in their future health status. If accurate predictions are developed, then a prognostic model can identify patients at greatest risk for future adverse events and may be used clinically to define populations appropriate for targeted intervention. In practice, a prognostic model is often used to guide decisions at multiple time points over the course of disease, and classification performance (i.e., sensitivity and specificity) for distinguishing high-risk v. low-risk individuals may vary over time as an individual’s disease status and prognostic information change. In this tutorial, we detail contemporary statistical methods that can characterize the time-varying accuracy of prognostic survival models when used for dynamic decision making. Although statistical methods for evaluating prognostic models with simple binary outcomes are well established, methods appropriate for survival outcomes are less well known and require time-dependent extensions of sensitivity and specificity to fully characterize longitudinal biomarkers or models. The methods we review are particularly important in that they allow for appropriate handling of censored outcomes commonly encountered with event time data. We highlight the importance of determining whether clinical interest is in predicting cumulative (or prevalent) cases over a fixed future time interval v. predicting incident cases over a range of follow-up times and whether patient information is static or updated over time. We discuss implementation of time-dependent receiver operating characteristic approaches using relevant R statistical software packages. The statistical summaries are illustrated using a liver prognostic model to guide transplantation in primary biliary cirrhosis.
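
The toy computation below illustrates the cumulative/dynamic definition of time-dependent sensitivity and specificity on synthetic data: cases are subjects with an event by the prediction horizon, controls are those still event-free at that horizon. It deliberately ignores censoring for clarity; the estimators reviewed in the tutorial (and the R packages it discusses) handle censored outcomes properly, for example with inverse-probability-of-censoring weighting.

```python
# Illustrative cumulative/dynamic sensitivity and specificity at fixed horizons.
# Cases: event by time t. Controls: still event-free at t. This toy version
# assumes fully observed event times; real time-dependent ROC estimators
# account for censoring (e.g. IPCW weighting). Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 500
marker = rng.normal(size=n)                           # prognostic score (higher = worse)
event_time = rng.exponential(scale=np.exp(-marker))   # higher marker -> earlier event

def cumulative_dynamic_sens_spec(marker, event_time, horizon, threshold):
    cases = event_time <= horizon        # cumulative cases by the horizon
    controls = ~cases                    # dynamic controls: event-free at the horizon
    sens = np.mean(marker[cases] > threshold)
    spec = np.mean(marker[controls] <= threshold)
    return sens, spec

for horizon in (0.5, 1.0, 2.0):
    sens, spec = cumulative_dynamic_sens_spec(marker, event_time, horizon, threshold=0.0)
    print(f"t={horizon}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```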


2017 ◽  
Author(s):  
Rebecca L. Koscik ◽  
Derek L. Norton ◽  
Samantha L. Allison ◽  
Erin M. Jonaitis ◽  
Lindsay R. Clark ◽  
...  

Objective: In this paper we apply Information-Theoretic (IT) model averaging to characterize a set of complex interactions in a longitudinal study of cognitive decline. Prior research has identified numerous genetic (including sex), education, health, and lifestyle factors that predict cognitive decline. Traditional model selection approaches (e.g., backward or stepwise selection) attempt to find the models that best fit the observed data; these techniques risk the interpretation that only the selected predictors are important. In reality, several models may fit similarly well but lead to different conclusions (e.g., about the size and significance of parameter estimates); inference from traditional model selection approaches can therefore lead to overly confident conclusions.
Method: Here we use longitudinal cognitive data from ~1550 late-middle-aged adults in the Wisconsin Registry for Alzheimer’s Prevention study to examine the effects of sex, the Apolipoprotein E (APOE) ɛ4 allele (non-modifiable factors), and literacy achievement (modifiable) on cognitive decline. For each outcome, we applied IT model averaging to a model set comprising combinations of interactions among sex, APOE, literacy, and age.
Results: For a list-learning test, model-averaged results showed better performance for women than for men, with faster decline among men; increased literacy was associated with better performance, particularly among men. APOE had less of an effect on cognitive performance in this age range (~40–70).
Conclusions: These results illustrate the utility of the IT approach and point to literacy as a potential modifier of decline. Whether the protective effect of literacy is due to educational attainment or to intrinsic verbal intellectual ability is the topic of ongoing work.
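
The snippet below sketches the information-theoretic model-averaging step on synthetic data: candidate models receive Akaike weights proportional to exp(−ΔAIC/2), and a coefficient shared by all models is averaged with those weights. The variables, the candidate model set, and the data are invented for illustration and do not reproduce the WRAP analysis.

```python
# Generic sketch of Akaike-weight model averaging on synthetic data.
# Each candidate model gets a weight proportional to exp(-delta_AIC / 2), and a
# coefficient shared by all models is averaged across models with those weights.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
age = rng.uniform(40, 70, n)
literacy = rng.normal(size=n)
sex = rng.integers(0, 2, n)
score = 50 - 0.1 * age + 1.5 * literacy + 0.5 * sex * literacy + rng.normal(scale=2, size=n)

# Candidate models: main effects only, and main effects plus a sex x literacy interaction.
designs = {
    "main effects": np.column_stack([age, literacy, sex]),
    "+ sex:literacy": np.column_stack([age, literacy, sex, sex * literacy]),
}
fits = {name: sm.OLS(score, sm.add_constant(X)).fit() for name, X in designs.items()}

aic = np.array([f.aic for f in fits.values()])
delta = aic - aic.min()
weights = np.exp(-delta / 2) / np.exp(-delta / 2).sum()   # Akaike weights

for (name, fit), w in zip(fits.items(), weights):
    print(f"{name}: AIC={fit.aic:.1f}, weight={w:.2f}")

# Model-averaged estimate of the literacy coefficient (index 2 after the constant and age).
lit_avg = sum(w * fit.params[2] for fit, w in zip(fits.values(), weights))
print(f"model-averaged literacy coefficient: {lit_avg:.2f}")
```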


2020 ◽  
Vol 17 (173) ◽  
pp. 20200886
Author(s):  
L. Mihaela Paun ◽  
Mitchel J. Colebank ◽  
Mette S. Olufsen ◽  
Nicholas A. Hill ◽  
Dirk Husmeier

This study uses Bayesian inference to quantify the uncertainty of model parameters and haemodynamic predictions in a one-dimensional pulmonary circulation model, based on an integration of mouse haemodynamic and micro-computed tomography imaging data. We emphasize an often neglected though important source of uncertainty: uncertainty in the mathematical model form, due to the discrepancy between the model and reality, and in the measurements, due to an incorrect noise model (jointly called ‘model mismatch’). We demonstrate that minimizing the mean squared error between the measured and the predicted data (the conventional method) in the presence of model mismatch leads to biased and overly confident parameter estimates and haemodynamic predictions. We show that our proposed method, which allows for model mismatch represented with Gaussian processes, corrects this bias. Additionally, we compare a linear and a nonlinear wall model, as well as models with different vessel stiffness relations. We use formal model selection analysis based on the Watanabe–Akaike information criterion to select the model that best predicts the pulmonary haemodynamics. The results show that the nonlinear pressure–area relationship with stiffness dependent on the unstressed radius best predicts the data measured in a control mouse.
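
As a small illustration of the model-selection criterion mentioned above, the sketch below computes WAIC from a matrix of pointwise log-likelihoods evaluated at posterior draws. The haemodynamic model, the Gaussian-process discrepancy, and the study data are not reproduced; the toy example uses a normal likelihood with made-up draws.

```python
# Minimal sketch of the Watanabe-Akaike information criterion (WAIC) computed
# from pointwise log-likelihoods (posterior draws x observations). This only
# illustrates the criterion itself, not the study's haemodynamic model.
import numpy as np
from scipy.special import logsumexp

def waic(log_lik):
    """log_lik: array of shape (n_posterior_draws, n_observations)."""
    n_draws = log_lik.shape[0]
    # Log pointwise predictive density: average the likelihood over posterior draws.
    lppd = np.sum(logsumexp(log_lik, axis=0) - np.log(n_draws))
    # Effective number of parameters: variance of the log-likelihood across draws.
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2 * (lppd - p_waic)

# Toy example: posterior draws for a normal mean, evaluated on fake data.
rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=50)
mu_draws = rng.normal(1.0, 0.2, size=1000)
log_lik = -0.5 * np.log(2 * np.pi) - 0.5 * (y[None, :] - mu_draws[:, None]) ** 2
print("WAIC: %.1f" % waic(log_lik))
```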


2016 ◽  
Author(s):  
Joram Soch ◽  
Achim Pascal Meyer ◽  
John-Dylan Haynes ◽  
Carsten Allefeld

Abstract In functional magnetic resonance imaging (fMRI), the model quality of general linear models (GLMs) for first-level analysis is rarely assessed. In recent work (Soch et al., 2016: “How to avoid mismodelling in GLM-based fMRI data analysis: cross-validated Bayesian model selection”, NeuroImage, vol. 141, pp. 469-489; DOI: 10.1016/j.neuroimage.2016.07.047), we introduced cross-validated Bayesian model selection (cvBMS) to infer the best model for a group of subjects and use it to guide second-level analysis. While this is the optimal approach given that the same GLM has to be used for all subjects, there is a much more efficient procedure when model selection only addresses nuisance variables and the regressors of interest are included in all candidate models. In this work, we propose cross-validated Bayesian model averaging (cvBMA) to improve parameter estimates for these regressors of interest by combining information from all models using their posterior probabilities. This is particularly useful as different models can lead to different conclusions regarding experimental effects, and the most complex model is not necessarily the best choice. We find that cvBMS can prevent failure to detect established effects and that cvBMA can be more sensitive to experimental effects than using the best model in each subject or the model that is best across the group of subjects.
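
The sketch below illustrates the model-averaging step in its simplest form: estimates of a regressor of interest shared by all candidate models are combined with posterior model probabilities obtained by normalizing exponentiated log model evidences. The numbers are invented, and the code is not the cvBMA implementation described in the paper.

```python
# Conceptual sketch of Bayesian model averaging for a regressor of interest that
# is shared by all candidate GLMs. Posterior model probabilities come from
# (approximate) log model evidences under a uniform model prior. The log
# evidences and estimates are made-up numbers, not the authors' implementation.
import numpy as np

# One entry per candidate nuisance model (e.g. different nuisance regressor sets).
log_evidence = np.array([-1520.3, -1518.7, -1519.9])   # assumed values
beta_of_interest = np.array([0.82, 0.74, 0.79])        # estimate of the shared regressor

# Posterior model probabilities: softmax of the log evidences.
w = np.exp(log_evidence - log_evidence.max())
posterior_prob = w / w.sum()

beta_bma = np.sum(posterior_prob * beta_of_interest)
print("posterior model probabilities:", np.round(posterior_prob, 3))
print("model-averaged estimate: %.3f" % beta_bma)
```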


2016 ◽  
Vol 23 (2) ◽  
pp. 448-459 ◽  
Author(s):  
Richard T. Melstrom

This article presents an exponential model of tourist expenditures estimated by a quasi-maximum likelihood (QML) technique. The advantage of this approach is that, unlike conventional OLS and Tobit estimators, it produces consistent parameter estimates under conditions of a corner solution at zero and heteroscedasticity. An application to sportfishing evaluates the role of socioeconomic demographics and species preferences on trip spending. The bias from an inappropriate estimator is illustrated by comparing the results from QML and OLS estimation, which shows that OLS significantly overstates the impact of trip duration on trip expenditures compared with the QML estimator. Both sets of estimates imply that trout and bass anglers spend significantly more on their fishing trips compared with other anglers.
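
The sketch below shows, on synthetic data, one common way to estimate an exponential conditional-mean model by quasi-maximum likelihood: a Poisson GLM with robust (sandwich) standard errors, contrasted with OLS on the level of spending. The covariates and data-generating process are assumptions for illustration and do not reproduce the article's survey data.

```python
# Exponential conditional-mean model estimated by quasi-maximum likelihood via a
# Poisson GLM with robust standard errors (Poisson pseudo-ML), which remains
# consistent even though spending is continuous, piles up at zero, and is
# heteroscedastic. Data and covariates are synthetic assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
days = rng.integers(1, 8, n)                  # trip duration
trout = rng.integers(0, 2, n)                 # targets trout (1) or not (0)
X = sm.add_constant(np.column_stack([days, trout]))

mu = np.exp(2.0 + 0.25 * days + 0.4 * trout)  # exponential mean function
spend = np.maximum(mu + rng.normal(scale=0.8 * mu), 0)   # heteroscedastic, floored at zero

qml = sm.GLM(spend, X, family=sm.families.Poisson()).fit(cov_type="HC1")
ols = sm.OLS(spend, X).fit(cov_type="HC1")
print("QML (exponential mean) coefficients:", np.round(qml.params, 3))
print("OLS (linear mean) coefficients:     ", np.round(ols.params, 3))
```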


2014 ◽  
Vol 625 ◽  
pp. 229-232 ◽  
Author(s):  
Abul Hassan Ali ◽  
Atif Muhammad Ashraf ◽  
Azmi Mohd Shariff ◽  
Saibal Ganguly

The paper presents the concept of cryogenic frost-growth kinetics during the separation of CO2 from natural gas using the Avrami nucleation model. The interface frost layer on the glass packing of the cryogenic bed is treated as the germ nuclei. The bed porosity is considered time-dependent, and an expression for the time-varying bed porosity is derived from the Avrami model. Experiments were conducted to validate the model, and the resulting simulation studies show good agreement with the experimental results.
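
Since the abstract does not give the derived expression, the sketch below shows one plausible illustrative form: the frost-covered fraction of the packing follows the Avrami equation X(t) = 1 − exp(−k·t^n), and the bed porosity is assumed to fall from its clean-bed value towards a frost-plugged minimum in proportion to X(t). The rate constant, exponent, and porosity bounds are made-up values.

```python
# Illustrative sketch, not the paper's derived expression: the frost-covered
# fraction follows the Avrami equation X(t) = 1 - exp(-k * t**n), and the bed
# porosity is assumed to decrease from its clean-bed value towards a
# frost-plugged minimum in proportion to X(t). All constants are assumed.
import numpy as np

k, n_avrami = 0.02, 1.8          # Avrami rate constant and exponent (assumed)
eps_clean, eps_min = 0.40, 0.12  # clean-bed and fully frosted porosity (assumed)

t = np.linspace(0, 120, 7)                      # time [min]
X = 1 - np.exp(-k * t ** n_avrami)              # transformed (frost-covered) fraction
porosity = eps_clean - (eps_clean - eps_min) * X

for ti, pi in zip(t, porosity):
    print(f"t = {ti:5.1f} min  porosity = {pi:.3f}")
```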

