Bayesian and parsimony approaches reconstruct informative trees from simulated morphological datasets

2019 ◽  
Vol 15 (2) ◽  
pp. 20180632 ◽  
Author(s):  
Martin R. Smith

Phylogenetic analysis aims to establish the true relationships between taxa. Different analytical methods, however, can reach different conclusions. In order to establish which approach best reconstructs true relationships, previous studies have simulated datasets from known tree topologies, and identified the method that reconstructs the generative tree most accurately. On this basis, researchers have argued that morphological datasets should be analysed by Bayesian approaches, which employ an explicit probabilistic model of evolution, rather than parsimony methods—with implied weights parsimony sometimes identified as particularly inaccurate. Accuracy alone, however, is an inadequate measure of a tree's utility: a fully unresolved tree is perfectly accurate, yet contains no phylogenetic information. The highly resolved trees recovered by implied weights parsimony in fact contain as much useful information as the more accurate, but less resolved, trees recovered by Bayesian methods. By collapsing poorly supported groups, this superior resolution can be traded for accuracy, resulting in trees as accurate as those obtained by a Bayesian approach. By contrast, equally weighted parsimony analysis produces trees that are less resolved and less accurate, leading to less reliable evolutionary conclusions.
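
The trade-off between resolution and accuracy described above can be made concrete with split (bipartition) statistics. Below is a minimal Python sketch, assuming each tree has already been summarised as a set of its non-trivial bipartitions (frozensets of taxon labels); the representation, support values, and threshold are illustrative and not the paper's implementation.

```python
def resolution(splits, n_taxa):
    # A fully resolved unrooted binary tree contains n_taxa - 3 non-trivial splits;
    # a completely unresolved star tree contains none.
    return len(splits) / (n_taxa - 3)

def split_precision(inferred, true):
    # Fraction of inferred splits that also occur in the generative tree.
    # An unresolved tree asserts nothing false, so it scores perfectly here
    # while carrying no phylogenetic information.
    if not inferred:
        return 1.0
    return len(inferred & true) / len(inferred)

def collapse_weak_splits(splits, support, threshold=0.5):
    # Trading resolution for accuracy: drop splits whose support falls below a cut-off.
    return {s for s in splits if support[s] >= threshold}
```
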

2020 ◽  
Author(s):  
Benedict King

Abstract The incorporation of stratigraphic data into phylogenetic analysis has a long history of debate but is not currently standard practice for paleontologists. Bayesian tip-dated (or morphological clock) phylogenetic methods have returned these arguments to the spotlight, but how tip dating affects the recovery of evolutionary relationships has yet to be fully explored. Here I show, through analysis of several data sets with multiple phylogenetic methods, that topologies produced by tip dating are outliers as compared to topologies produced by parsimony and undated Bayesian methods, which retrieve broadly similar trees. Unsurprisingly, trees recovered by tip dating have better fit to stratigraphy than trees recovered by other methods under both the Gap Excess Ratio (GER) and the Stratigraphic Completeness Index (SCI). This is because trees with better stratigraphic fit are assigned a higher likelihood by the fossilized birth-death tree model. However, the degree to which the tree model favors tree topologies with high stratigraphic fit metrics is modulated by the diversification dynamics of the group under investigation. In particular, when net diversification rate is low, the tree model favors trees with a higher GER compared to when net diversification rate is high. Differences in stratigraphic fit and tree topology between tip dating and other methods are concentrated in parts of the tree with weaker character signal, as shown by successive deletion of the most incomplete taxa from two data sets. These results show that tip dating incorporates stratigraphic data in an intuitive way, with good stratigraphic fit an expectation that can be overturned by strong evidence from character data. [fossilized birth-death; fossils; missing data; morphological clock; morphology; parsimony; phylogenetics.]
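
The two stratigraphic fit metrics used above have simple closed forms. The sketch below assumes the minimum implied gap (MIG) and per-node first-appearance dates have been computed elsewhere; the inputs are illustrative.

```python
def gap_excess_ratio(mig, mig_min, mig_max):
    # GER (Wills 1999): 1 when the topology implies the fewest possible ghost
    # lineages given the first-appearance dates, 0 when it implies the most.
    return (mig_max - mig) / (mig_max - mig_min)

def stratigraphic_consistency_index(node_pairs):
    # SCI (Huelsenbeck 1994): proportion of internal nodes whose clade does not
    # first appear earlier than its sister group. node_pairs holds
    # (first appearance of clade, first appearance of sister) in Ma, larger = older.
    consistent = sum(1 for clade_fa, sister_fa in node_pairs if clade_fa <= sister_fa)
    return consistent / len(node_pairs)
```
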


2018 ◽  
Vol 10 (10) ◽  
pp. 3671
Author(s):  
Jongseon Jeon ◽  
Suneung Ahn

This work proposed a reliability demonstration test (RDT) process that can be employed to determine whether a finite population should be accepted or rejected. Bayesian and non-Bayesian approaches were compared within the proposed RDT process, as were lot and sequential sampling. One-shot devices, such as bullets, fire extinguishers, and grenades, were used as test targets, with their functioning state expressible as a binary model. A hypergeometric distribution was adopted as the likelihood function for a finite population consisting of binary items. It was demonstrated that a beta-binomial distribution is the conjugate prior of the hypergeometric likelihood function. Under the Bayesian approach, the posterior beta-binomial distribution is used to decide on the acceptance or rejection of the population in the RDT. The proposed method could be used to select item providers in a supply chain who guarantee a predetermined reliability target and confidence level. Numerical examples show that a Bayesian approach with sequential sampling has the advantage of requiring only a small sample size to determine the acceptance of a finite population.
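
The beta-binomial/hypergeometric update can be checked numerically by brute-force enumeration. The sketch below is illustrative rather than the paper's exact procedure: it places a beta-binomial prior on the number of defective items in a finite lot, updates it with a hypergeometric likelihood for a sample drawn without replacement, and accepts the lot when the posterior meets a reliability target at a chosen confidence level (the parameter values are invented).

```python
from math import comb, exp, lgamma

def beta_binomial_pmf(k, n, a, b):
    # Prior probability that the lot of n items contains exactly k defectives.
    return comb(n, k) * exp(
        lgamma(k + a) + lgamma(n - k + b) + lgamma(a + b)
        - lgamma(a) - lgamma(b) - lgamma(n + a + b)
    )

def posterior_defectives(n, sample_size, failures, a, b):
    # Hypergeometric likelihood of observing `failures` in a sample drawn
    # without replacement from a lot containing k defectives in total.
    def likelihood(k):
        if failures > k or sample_size - failures > n - k:
            return 0.0
        return comb(k, failures) * comb(n - k, sample_size - failures) / comb(n, sample_size)

    unnormalised = [beta_binomial_pmf(k, n, a, b) * likelihood(k) for k in range(n + 1)]
    total = sum(unnormalised)
    return [p / total for p in unnormalised]

def accept_lot(posterior, max_defectives, confidence=0.9):
    # Accept when the posterior probability that the lot contains at most
    # `max_defectives` defective items reaches the required confidence level.
    return sum(posterior[: max_defectives + 1]) >= confidence

post = posterior_defectives(n=500, sample_size=30, failures=0, a=1.0, b=9.0)
print(accept_lot(post, max_defectives=10))
```

Under conjugacy the same posterior has a closed beta-binomial form; the enumeration above is only a numerical check of that claim.
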


2019 ◽  
Vol 11 (10) ◽  
pp. 2824-2849 ◽  
Author(s):  
Paweł Mackiewicz ◽  
Adam Dawid Urantówka ◽  
Aleksandra Kroczak ◽  
Dorota Mackiewicz

Abstract Mitochondrial genes are located on a single molecule, which implies that they should carry consistent phylogenetic information. Exploiting this property, we present a well-supported phylogeny based on mitochondrial genomes from almost 300 representatives of Passeriformes, the most species-rich and diversified order of Aves. The analyses resolved the phylogenetic position of the paraphyletic Basal and Transitional Oscines. Passerida was recovered as two groups, one containing Paroidea and Sylvioidea, and the other Passeroidea and Muscicapoidea. Analyses of mitogenomes revealed four types of rearrangements, including a duplicated control region (CR) with adjacent genes. Mapping the presence and absence of the duplication onto the phylogenetic tree revealed that the duplication was the ancestral state for passerines and was maintained in early diverging lineages. The duplication was then lost in some lineages and re-emerged independently at least four times under the most parsimonious scenario. In some lineages, two CR copies have been inherited from an ancient duplication and have diverged substantially, whereas in others, the second copy has become similar to the first through concerted evolution. The second CR copies accumulated more than twice as many substitutions as the first ones. Nevertheless, the second CRs were not completely eliminated and have been retained for a long time, which suggests that both regions can fulfil an important role in mitogenomes. Phylogenetic analyses based on CR sequences subject to this complex evolution can produce tree topologies inconsistent with the real evolutionary relationships between species. Passerines with two CRs showed a higher metabolic rate relative to their body mass.
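
Mapping the presence or absence of the CR duplication onto a tree under a "most parsimonious scenario" is an instance of small parsimony. Below is a minimal sketch using Fitch's algorithm on a toy rooted tree; the tree and the tip states are invented for illustration and are not the study's data.

```python
def fitch_changes(tree, tip_states):
    """Minimum number of state changes (Fitch small parsimony) for a binary
    character, e.g. presence (1) / absence (0) of a CR duplication, on a
    rooted binary tree given as nested tuples of tip names."""
    changes = 0

    def post_order(node):
        nonlocal changes
        if isinstance(node, str):                      # a tip
            return {tip_states[node]}
        left, right = (post_order(child) for child in node)
        if left & right:
            return left & right
        changes += 1                                   # a gain or loss is required here
        return left | right

    post_order(tree)
    return changes

tree = ((("A", "B"), "C"), ("D", "E"))                 # hypothetical topology
states = {"A": 1, "B": 1, "C": 0, "D": 1, "E": 0}      # hypothetical duplication states
print(fitch_changes(tree, states))                     # minimum number of gains/losses
```
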


2001 ◽  
Vol 17 (1) ◽  
pp. 114-122 ◽  
Author(s):  
Steven H. Sheingold

Decision making in health care has become increasingly reliant on information technology, evidence-based processes, and performance measurement. It is therefore critically important to make data and analyses more relevant to decision makers. Those who support Bayesian approaches contend that their analyses provide more relevant information for decision making than do classical or “frequentist” methods, and that a paradigm shift to the former is long overdue. While formal Bayesian analyses may eventually play an important role in decision making, there are several obstacles to overcome if these methods are to gain acceptance in an environment dominated by frequentist approaches. Supporters of Bayesian statistics must find more accommodating approaches to making their case, especially in finding ways to make these methods more transparent and accessible. Moreover, they must better understand the decision-making environment they hope to influence. This paper discusses these issues and provides some suggestions for overcoming some of these barriers to greater acceptance.


2007 ◽  
Vol 130 (1-2) ◽  
pp. 53-62 ◽  
Author(s):  
Hongyan Xia ◽  
Lihong Liu ◽  
Niklas Wahlberg ◽  
Claudia Baule ◽  
Sándor Belák

Author(s):  
Emanuel TSCHOPP ◽  
Paul UPCHURCH

ABSTRACT Specimen-level phylogenetic approaches are widely used in molecular biology for taxonomic and systematic purposes. However, they have been largely ignored in analyses based on morphological traits, where phylogeneticists mostly resort to species-level analyses. Recently, a number of specimen-level studies have been published in vertebrate palaeontology. These studies indicate that specimen-level phylogeny may be a very useful tool for systematic reassessments at low taxonomic levels. Herein, we review the challenges of working with individual organisms as operational taxonomic units in a palaeontological context, and propose guidelines for how best to perform a specimen-level phylogenetic analysis using the maximum parsimony criterion. Given that no single methodology appears to be perfectly suited to resolving relationships among individuals, and that different taxa probably require different approaches to assess their systematics, we advocate the use of a number of methodologies. In particular, we recommend the inclusion of as many specimens and characters as feasible, and the analysis of relationships using an extended implied weighting approach with different downweighting functions. Resulting polytomies should be explored using a posteriori pruning of unstable specimens, and conflicting tree topologies between different iterations of the analysis should be evaluated by a combination of support values such as jackknifing and symmetric resampling. Species delimitation should be consistent among the ingroup and based on a reproducible approach. Although time-consuming and methodologically challenging, specimen-level phylogenetic analysis is a highly useful tool for assessing intraspecific variability, and it provides the basis for a more informed and accurate creation of species-level operational taxonomic units in large-scale systematic studies. It also has the potential to inform us about past speciation processes, morphological trait evolution, and their potential intrinsic and extrinsic drivers in unprecedented detail.
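
Implied weighting scores each character by a concave function of its homoplasy, so characters requiring many extra steps are progressively down-weighted. Below is a minimal sketch of a Goloboff-style fit function, with the concavity constant k standing in for the "different downweighting functions" recommended above; the homoplasy profile is illustrative.

```python
def implied_weight_fit(extra_steps, k=3.0):
    # Fit for a single character: 1 when the character fits the tree perfectly,
    # approaching 0 as the required homoplasy (extra steps) grows.
    return k / (k + extra_steps)

def tree_fit(extra_steps_per_character, k=3.0):
    # Total fit to be maximised when comparing candidate topologies.
    return sum(implied_weight_fit(s, k) for s in extra_steps_per_character)

# The same homoplasy profile scored under a harsher and a gentler concavity.
profile = [0, 0, 1, 3, 7]
print(tree_fit(profile, k=3.0), tree_fit(profile, k=12.0))
```
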


2019 ◽  
pp. 1-9
Author(s):  
Ciara Nugent ◽  
Wentian Guo ◽  
Peter Müller ◽  
Yuan Ji

We review Bayesian and Bayesian decision theoretic approaches to subgroup analysis and applications to subgroup-based adaptive clinical trial designs. Subgroup analysis refers to inference about subpopulations with significantly distinct treatment effects. The discussion mainly focuses on inference for a benefiting subpopulation, that is, a characterization of a group of patients who benefit from the treatment under consideration more than the overall population. We introduce alternative approaches and demonstrate them with a small simulation study. Then, we turn to clinical trial designs. When the selection of the interesting subpopulation is carried out as the trial proceeds, the design becomes an adaptive clinical trial design, using subgroup analysis to inform the randomization and assignment of treatments to patients. We briefly review some related designs. There are a variety of approaches to Bayesian subgroup analysis. Practitioners should consider the type of subpopulations in which they are interested and choose their methods accordingly. We demonstrate how subgroup analysis can be carried out by different Bayesian methods and discuss how they identify slightly different subpopulations.
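
As a toy counterpart to the simulation study mentioned above, the sketch below shows one simple Bayesian subgroup analysis: independent Beta(1, 1) priors on binary response rates, with the posterior probability that treatment beats control computed per subgroup by Monte Carlo. The data, priors, and decision threshold are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def benefit_probabilities(counts, n_draws=20_000):
    # counts: {subgroup: (treatment successes, treatment n, control successes, control n)}
    probs = {}
    for name, (s_t, n_t, s_c, n_c) in counts.items():
        p_trt = rng.beta(1 + s_t, 1 + n_t - s_t, n_draws)   # posterior draws, treatment arm
        p_ctl = rng.beta(1 + s_c, 1 + n_c - s_c, n_draws)   # posterior draws, control arm
        probs[name] = float(np.mean(p_trt > p_ctl))
    return probs

probs = benefit_probabilities({"biomarker+": (18, 40, 10, 42),
                               "biomarker-": (12, 45, 11, 43)})
benefiting = {g for g, p in probs.items() if p > 0.95}       # candidate benefiting subpopulation
print(probs, benefiting)
```
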


1999 ◽  
Vol 64 (1) ◽  
pp. 55-70 ◽  
Author(s):  
Robert G. Aykroyd ◽  
David Lucy ◽  
A. Mark Pollard ◽  
Charlotte A. Roberts

It is generally assumed that life expectancy in antiquity was considerably shorter than it is now. In the limited number of cases where skeletal or dental age-at-death estimates have been made on adults for whom there are other reliable indications of age, there appears to be a clear systematic trend towards overestimating the age of young adults, and underestimating that of older individuals. We show that this might be a result of the use of regression-based techniques of analysis for converting age indicators into estimated ages. Whilst acknowledging the limitations of most age-at-death indicators in the higher age categories, we show that a Bayesian approach to converting age indicators into estimated age can reduce this trend of underestimation at the older end. We also show that such a Bayesian approach can always do better than regression-based methods in terms of giving a smaller average difference between predicted age and known age, and a smaller average 95-percent confidence interval width of the estimate. Given these observations, we suggest that Bayesian approaches to converting age indicators into age estimates deserve further investigation. In view of the generality and flexibility of the approach, we also suggest that similar algorithms may have a much wider application.
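
In its simplest discrete form, the Bayesian conversion of an age indicator into an age estimate is one application of Bayes' theorem over an age grid. The sketch below is a minimal illustration; the reference probabilities and prior are placeholders for values that would come from a documented reference sample of known-age individuals.

```python
import numpy as np

def posterior_age(observed_stage, p_stage_given_age, age_prior):
    # p_stage_given_age[stage] is P(indicator stage | age) along an age grid,
    # estimated from a reference sample; age_prior is P(age) on the same grid.
    # Returns P(age | observed stage).
    unnormalised = p_stage_given_age[observed_stage] * age_prior
    return unnormalised / unnormalised.sum()

ages = np.arange(15, 80)                                   # age grid in years
prior = np.ones_like(ages, dtype=float) / len(ages)        # uniform prior as a placeholder
# Placeholder reference data: an indicator stage that becomes more likely with age.
p_stage = {"late": 1 / (1 + np.exp(-(ages - 55) / 5.0))}
post = posterior_age("late", p_stage, prior)
print(ages[np.argmax(post)], float(np.sum(post * ages)))   # modal and mean age estimates
```

Unlike a regression of age on indicator, this conditions in the direction the reference data actually constrain (indicator given age), which is the source of the reduced bias at the older end of the range discussed above.
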


2014 ◽  
Vol 27 (19) ◽  
pp. 7270-7284 ◽  
Author(s):  
Nicholas Lewis

Abstract Insight is provided into the use of objective-Bayesian methods for estimating climate sensitivity by considering their relationship to transformations of variables in the context of a simple case considered in a previous study, and some misunderstandings about Bayesian inference are discussed. A simple model in which climate sensitivity (S) and effective ocean heat diffusivity (Kυ) are the only parameters varied is used, with twentieth-century warming attributable to greenhouse gases (AW) and effective ocean heat capacity (HC) being the only data-based observables. Probability density functions (PDFs) for AW and HC are readily derived that represent valid independent objective-Bayesian posterior PDFs, provided the error distribution assumptions involved in their construction are justified. Using them, a standard transformation of variables provides an objective joint posterior PDF for S and Kυ; integrating out Kυ gives a marginal PDF for S. Close parametric approximations to the PDFs for AW and HC are obtained, enabling derivation of likelihood functions and related noninformative priors that give rise to the objective posterior PDFs that were computed initially. Bayes’s theorem is applied to the derived AW and HC likelihood functions, demonstrating the effect of differing prior distributions on PDFs for S. Use of the noninformative Jeffreys prior produces an identical PDF to that derived using the transformation-of-variables approach. It is shown that similar inference for S to that based on these two alternative objective-Bayesian approaches is obtained using a profile likelihood method on the derived joint likelihood function for AW and HC.
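
The transformation-of-variables step described here can be illustrated numerically: given posterior PDFs for the observables and a parameter-to-observable map, the joint posterior for the parameters is the observable density evaluated at the mapped point multiplied by the absolute Jacobian determinant. The sketch below uses invented stand-in functions for the mappings and observable PDFs, not the paper's energy-balance relationships or data.

```python
import numpy as np

# Hypothetical smooth mappings from the parameters (S, Kv) to the observables (AW, HC).
def aw(s, kv):  return s / (1.0 + 0.3 * np.sqrt(kv))
def hc(s, kv):  return 10.0 * np.sqrt(kv) + 0.5 * s

# Assumed independent, roughly Gaussian posterior PDFs for the observables.
def p_aw(x):  return np.exp(-0.5 * ((x - 0.8) / 0.2) ** 2)
def p_hc(x):  return np.exp(-0.5 * ((x - 14.0) / 3.0) ** 2)

s_grid  = np.linspace(0.5, 10.0, 400)
kv_grid = np.linspace(0.1, 5.0, 400)
S, KV = np.meshgrid(s_grid, kv_grid, indexing="ij")

# Numerical Jacobian |d(AW, HC) / d(S, Kv)| of the parameter-to-observable map.
eps = 1e-4
d_aw_ds = (aw(S + eps, KV) - aw(S - eps, KV)) / (2 * eps)
d_aw_dk = (aw(S, KV + eps) - aw(S, KV - eps)) / (2 * eps)
d_hc_ds = (hc(S + eps, KV) - hc(S - eps, KV)) / (2 * eps)
d_hc_dk = (hc(S, KV + eps) - hc(S, KV - eps)) / (2 * eps)
jac = np.abs(d_aw_ds * d_hc_dk - d_aw_dk * d_hc_ds)

# Change of variables: joint posterior for (S, Kv), then marginalise out Kv.
joint = p_aw(aw(S, KV)) * p_hc(hc(S, KV)) * jac
marginal_s = np.trapz(joint, kv_grid, axis=1)
marginal_s /= np.trapz(marginal_s, s_grid)
```
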


2019 ◽  
Vol 35 (02) ◽  
pp. 321-338
Author(s):  
Bengt Autzen

Abstract: While Bayesian methods are widely used in economics and finance, the foundations of this approach remain controversial. In the contemporary statistical literature Bayesian Ockham’s razor refers to the observation that the Bayesian approach to scientific inference will automatically assign greater likelihood to a simpler hypothesis if the data are compatible with both a simpler and a more complex hypothesis. In this paper I will discuss a problem that results when Bayesian Ockham’s razor is applied to nested economic models. I will argue that previous responses to the problem found in the philosophical literature are unsatisfactory and develop a novel reply to the problem.
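
The Bayesian Ockham's razor effect for nested models can be seen in the simplest possible case: a point hypothesis nested inside a model with a free parameter. A minimal sketch follows; the data are invented, and the uniform prior on the free parameter is one convenient choice among many.

```python
from math import comb

def marginal_likelihood_simple(k, n, p0=0.5):
    # Point hypothesis: the success probability is fixed at p0.
    return comb(n, k) * p0**k * (1 - p0)**(n - k)

def marginal_likelihood_complex(k, n):
    # Nested, more flexible model: p is free with a uniform prior, so the marginal
    # likelihood averages the binomial likelihood over p, which integrates to 1/(n+1).
    return 1.0 / (n + 1)

k, n = 26, 50   # data compatible with both models (roughly a fair coin)
bayes_factor = marginal_likelihood_simple(k, n) / marginal_likelihood_complex(k, n)
print(f"Bayes factor in favour of the simpler model: {bayes_factor:.2f}")
```

Because the flexible model spreads its prior mass over parameter values the data rule out, its marginal likelihood is diluted, and the simpler nested hypothesis is automatically favoured when the data are compatible with both.
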

