Distributions of extinction times from fossil ages and tree topologies: the example of mid-Permian synapsid extinctions

PeerJ ◽  
2021 ◽  
Vol 9 ◽  
pp. e12577
Author(s):  
Gilles Didier ◽  
Michel Laurin

Given a phylogenetic tree that includes only extinct, or a mix of extinct and extant taxa, where at least some fossil data are available, we present a method to compute the distribution of the extinction time of a given set of taxa under the Fossilized-Birth-Death model. Our approach differs from the previous ones in that it takes into account (i) the possibility that the taxa or the clade considered may diversify before going extinct and (ii) the whole phylogenetic tree to estimate extinction times, whilst previous methods do not consider the diversification process and deal with each branch independently. Because of this, our method can estimate extinction times of lineages represented by a single fossil, provided that they belong to a clade that includes other fossil occurrences. We assess and compare our new approach with a standard previous one using simulated data. Results show that our method provides more accurate confidence intervals. This new approach is applied to the study of the extinction time of three Permo-Carboniferous synapsid taxa (Ophiacodontidae, Edaphosauridae, and Sphenacodontidae) that are thought to have disappeared toward the end of the Cisuralian (early Permian), or possibly shortly thereafter. The timing of extinctions of these three taxa and of their component lineages supports the idea that the biological crisis in the late Kungurian/early Roadian consisted of a progressive decline in biodiversity throughout the Kungurian.
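The per-branch "standard previous" approach that the authors compare against can be illustrated with the classical range-extension confidence interval (in the style of Strauss & Sadler), which uses only the number of fossil horizons on a single lineage and its observed stratigraphic range. This is a minimal sketch of that baseline, not the authors' Fossilized-Birth-Death method; the function name is mine.

```python
def range_extension(n, observed_range, confidence=0.95):
    """One-sided confidence extension on a lineage's true extinction
    time, assuming its n fossil horizons are uniformly distributed over
    the true range (classical Strauss & Sadler-style estimator).
    Returns how far beyond the last occurrence the true endpoint may
    lie, in the same units as observed_range."""
    if n < 2:
        raise ValueError("need at least two fossil horizons")
    alpha = (1.0 - confidence) ** (-1.0 / (n - 1)) - 1.0
    return alpha * observed_range

# e.g. 5 horizons spanning 10 Myr:
ext = range_extension(5, 10.0)  # extension in Myr beyond the last occurrence
```

Note how the interval shrinks as more fossil horizons are known; a lineage with a single fossil gives no interval at all under this scheme, which is exactly the limitation the tree-wide method above addresses.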

2021 ◽  
Author(s):  
Gilles Didier ◽  
Michel Laurin

Given a phylogenetic tree of extinct and extant taxa with fossils where the only temporal information stands in the fossil ages, we devise a method to compute the distribution of the extinction time of a given set of taxa under the Fossilized-Birth-Death model. Our approach differs from the previous ones in that it takes into account the possibility that the taxa or the clade considered may diversify before going extinct, whilst previous methods just rely on the fossil recovery rate to estimate confidence intervals. We assess and compare our new approach with a standard previous one using simulated data. Results show that our method provides more accurate confidence intervals. This new approach is applied to the study of the extinction time of three Permo-Carboniferous synapsid taxa (Ophiacodontidae, Edaphosauridae, and Sphenacodontidae) that are thought to have disappeared toward the end of the Cisuralian, or possibly shortly thereafter. The timing of extinctions of these three taxa and of their component lineages supports the idea that a biological crisis occurred in the late Kungurian/early Roadian.


2021 ◽  
Vol 13 (11) ◽  
pp. 2069
Author(s):  
M. V. Alba-Fernández ◽  
F. J. Ariza-López ◽  
M. D. Jiménez-Gamero

The usefulness of the parameters (e.g., slope, aspect) derived from a Digital Elevation Model (DEM) is limited by its accuracy. In this paper, a thematic-like quality control (class-based) of aspect and slope classes is proposed. A product can be compared against a reference dataset, which provides the quality requirements to be achieved, by comparing the product proportions of each class with those of the reference set. If a distance between the product proportions and the reference proportions is smaller than a small enough positive tolerance, which is fixed by the user, it will be considered that the degree of similarity between the product and the reference set is acceptable, and hence that its quality meets the requirements. A formal statistical procedure, based on a hypothesis test, is developed and its performance is analyzed using simulated data. It uses the Hellinger distance between the proportions. The application to the slope and aspect is illustrated using data derived from a 2×2 m DEM (reference) and 5×5 m DEM in Allo (province of Navarra, Spain).
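The distance at the core of the procedure is easy to state in code. This is a minimal sketch of the Hellinger distance between two class-proportion vectors and a crude tolerance check; the paper's actual procedure is a formal hypothesis test built on this distance, and the function names here are mine.

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete proportion vectors
    (same class order in both); ranges from 0 (identical) to 1."""
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))

def meets_requirement(product_props, reference_props, tolerance):
    """Crude point check: accept the product if its class proportions
    lie within `tolerance` of the reference proportions. The published
    method wraps this in a proper statistical test rather than a
    plug-in comparison."""
    return hellinger(product_props, reference_props) <= tolerance
```

For slope/aspect quality control, `p` and `q` would be the per-class proportions (e.g. slope bins) computed from the product DEM and the reference DEM respectively.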


2020 ◽  
Author(s):  
Zeqi Yao ◽  
Kehui Liu ◽  
Shanjun Deng ◽  
Xionglei He

Abstract Conventional coalescent inferences of population history make the critical assumption that the population under examination is panmictic. However, most populations are structured. This complicates the prevailing coalescent analyses and sometimes leads to inaccurate estimates. To develop a coalescent method unhampered by population structure, we perform two analyses. First, we demonstrate that the coalescent probability of two randomly sampled alleles from the immediate preceding generation (one generation back) is independent of population structure. Second, motivated by this finding, we propose a new coalescent method: i-coalescent analysis. i-coalescent analysis computes the instantaneous coalescent rate (iCR) by using a phylogenetic tree of sampled alleles. Using simulated data, we broadly demonstrate the capability of i-coalescent analysis to accurately reconstruct population size dynamics of highly structured populations, although we find this method often requires larger sample sizes for structured populations than for panmictic populations. Overall, our results indicate i-coalescent analysis to be a useful tool, especially for the inference of population histories with intractable structure such as the developmental history of cell populations in the organs of complex organisms.
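Estimating a coalescent rate from the event times of a tree can be illustrated with the classic skyline-style estimator, a simpler relative of the iCR described above: each inter-coalescent interval with k extant lineages contributes one coalescence among k(k−1)/2 lineage pairs. This sketch is only meant to show that flavor of computation; it is not the authors' i-coalescent implementation, and the function name is mine.

```python
def skyline_rates(coal_times, n_samples):
    """Piecewise per-pair coalescent-rate estimates from the ordered
    coalescent event times of a tree with n_samples tips. In each
    interval, one coalescence occurs among k*(k-1)/2 pairs over
    duration dt, giving a rate estimate of 1/(pairs * dt)."""
    rates = []
    k = n_samples          # lineages at the tips, decreasing by 1 per event
    prev = 0.0
    for t in sorted(coal_times):
        dt = t - prev
        pairs = k * (k - 1) / 2.0
        rates.append(1.0 / (pairs * dt))
        k -= 1
        prev = t
    return rates
```

Under the standard coalescent, each rate estimate is inversely proportional to the effective population size during that interval, which is what makes such estimators usable for reconstructing size dynamics.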


2019 ◽  
Author(s):  
Qiqing Tao ◽  
Koichiro Tamura ◽  
Beatriz Mello ◽  
Sudhir Kumar

Abstract Confidence intervals (CIs) depict the statistical uncertainty surrounding evolutionary divergence time estimates. They capture variance contributed by the finite number of sequences and sites used in the alignment, deviations of evolutionary rates from a strict molecular clock in a phylogeny, and uncertainty associated with clock calibrations. Reliable tests of biological hypotheses demand reliable CIs. However, current non-Bayesian methods may produce unreliable CIs because they do not incorporate rate variation among lineages and interactions among clock calibrations properly. Here, we present a new analytical method to calculate CIs of divergence times estimated using the RelTime method, along with an approach to utilize multiple calibration uncertainty densities in these analyses. Empirical data analyses showed that the new methods produce CIs that overlap with Bayesian highest posterior density (HPD) intervals. In the analysis of computer-simulated data, we found that RelTime CIs show excellent average coverage probabilities, i.e., the true time is contained within the CIs with a 95% probability. These developments will encourage broader use of the computationally efficient RelTime approach in molecular dating analyses and biological hypothesis testing.


1998 ◽  
Vol 09 (01) ◽  
pp. 71-85 ◽  
Author(s):  
A. Bevilacqua ◽  
D. Bollini ◽  
R. Campanini ◽  
N. Lanconelli ◽  
M. Galli

This study investigates the possibility of using an Artificial Neural Network (ANN) for reconstructing Positron Emission Tomography (PET) images. The network is trained with simulated data which include physical effects such as attenuation and scattering. Once the training ends, the weights of the network are held constant. The network is able to reconstruct every type of source distribution contained inside the area mapped during the learning. The reconstruction of a simulated brain phantom in a noiseless case shows an improvement if compared with Filtered Back-Projection reconstruction (FBP). In noisy cases there is still an improvement, even if we do not compensate for noise fluctuations. These results show that it is possible to reconstruct PET images using ANNs. Initially we used a Dec Alpha; then, due to the high data parallelism of this reconstruction problem, we ported the learning on a Quadrics (SIMD) machine, suited for the realization of a small medical dedicated system. These results encourage us to continue in further studies that will make possible reconstruction of images of bigger dimension than those used in the present work (32 × 32 pixels).


2016 ◽  
Vol 29 (6) ◽  
pp. 1977-1998 ◽  
Author(s):  
Alexis Hannart

Abstract The present paper introduces and illustrates methodological developments intended for so-called optimal fingerprinting methods, which are of frequent use in detection and attribution studies. These methods used to involve three independent steps: preliminary reduction of the dimension of the data, estimation of the covariance associated to internal climate variability, and, finally, linear regression inference with associated uncertainty assessment. It is argued that such a compartmentalized treatment presents several issues; an integrated method is thus introduced to address them. The suggested approach is based on a single-piece statistical model that represents both linear regression and control runs. The unknown covariance is treated as a nuisance parameter that is eliminated by integration. This allows for the introduction of regularization assumptions. Point estimates and confidence intervals follow from the integrated likelihood. Further, it is shown that preliminary dimension reduction is not required for implementability and that computational issues associated to using the raw, high-dimensional, spatiotemporal data can be resolved quite easily. Results on simulated data show improved performance compared to existing methods w.r.t. both estimation error and accuracy of confidence intervals and also highlight the need for further improvements regarding the latter. The method is illustrated on twentieth-century precipitation and surface temperature, suggesting a potentially high informational benefit of using the raw, nondimension-reduced data in detection and attribution (D&A), provided model error is appropriately built into the inference.


2020 ◽  
Author(s):  
Benedict King

Abstract The incorporation of stratigraphic data into phylogenetic analysis has a long history of debate but is not currently standard practice for paleontologists. Bayesian tip-dated (or morphological clock) phylogenetic methods have returned these arguments to the spotlight, but how tip dating affects the recovery of evolutionary relationships has yet to be fully explored. Here I show, through analysis of several data sets with multiple phylogenetic methods, that topologies produced by tip dating are outliers as compared to topologies produced by parsimony and undated Bayesian methods, which retrieve broadly similar trees. Unsurprisingly, trees recovered by tip dating have better fit to stratigraphy than trees recovered by other methods under both the Gap Excess Ratio (GER) and the Stratigraphic Completeness Index (SCI). This is because trees with better stratigraphic fit are assigned a higher likelihood by the fossilized birth-death tree model. However, the degree to which the tree model favors tree topologies with high stratigraphic fit metrics is modulated by the diversification dynamics of the group under investigation. In particular, when net diversification rate is low, the tree model favors trees with a higher GER compared to when net diversification rate is high. Differences in stratigraphic fit and tree topology between tip dating and other methods are concentrated in parts of the tree with weaker character signal, as shown by successive deletion of the most incomplete taxa from two data sets. These results show that tip dating incorporates stratigraphic data in an intuitive way, with good stratigraphic fit an expectation that can be overturned by strong evidence from character data. [fossilized birth-death; fossils; missing data; morphological clock; morphology; parsimony; phylogenetics.]
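The Gap Excess Ratio used above has a simple closed form (Wills 1999): it compares the total ghost range implied by a given tree (MIG) with the minimum and maximum ghost-range totals possible for the same taxa. This sketch assumes those three quantities have already been computed from a tree and its taxa's first-appearance dates; the function name is mine.

```python
def gap_excess_ratio(mig, g_min, g_max):
    """Gap Excess Ratio (GER) of a tree. mig is the Minimum Implied
    Gap (total ghost range) of the tree being scored; g_min and g_max
    are the smallest and largest ghost-range totals achievable by any
    tree of the same taxa. GER = 1 means perfect stratigraphic fit,
    0 means the worst possible fit."""
    return (g_max - mig) / (g_max - g_min)
```

Under tip dating, the fossilized birth-death tree model assigns higher likelihood to trees whose implied gaps are small, which is why the recovered topologies score higher on this metric.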


Paleobiology ◽  
2010 ◽  
Vol 36 (1) ◽  
pp. 16-31 ◽  
Author(s):  
Matthew M. Hedman

This paper presents a method for constraining the age of a clade with the ages of the earliest fossil specimens in that clade's outgroups. Given a sufficiently deep, robust, well-resolved, and stratigraphically consistent cladogram, this method can yield useful age constraints even in the absence of specific information about the fossil preservation and recovery rates of individual taxa. The algorithm is applied to simulated data sets to demonstrate that this method can yield robust constraints of clade ages if there are sufficient fossil outgroups available and if there is a finite chance that additional outgroups may be discovered in the future. Finally, the technique is applied to actual fossil data to explore the origin of modern placental mammals. Using data from recently published cladograms, this method indicates that if all Mesozoic eutherians are regarded as outgroups of Placentalia, then the last common ancestor of modern placental mammals and their Cenozoic allies lived between 65 and 88–98 million years ago, depending on the assumed cladogram and the number of outgroups included in the analysis.
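The outgroup idea can be caricatured with a loose Monte Carlo sketch: working from the most distant outgroup inward, each divergence is drawn uniformly between that outgroup's oldest fossil age and the previously drawn, older divergence, under a hard upper bound. This is NOT Hedman's published algorithm, only an illustration of how successive outgroup fossils squeeze the focal clade's age; all names and the example ages are mine.

```python
import random

def sample_clade_age(outgroup_fads, t_max, n_draws=10000, seed=1):
    """Monte Carlo sketch of outgroup-based age constraints.
    outgroup_fads: oldest fossil ages (Ma) of successively closer
    outgroups, ordered from most distant inward. t_max: hard upper
    bound on any divergence. Returns sampled ages for the node
    subtending the focal clade."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        older = t_max
        for fad in outgroup_fads:
            # each divergence must postdate the previous one and
            # predate (or equal) its outgroup's oldest fossil
            if fad < older:
                older = rng.uniform(fad, older)
        draws.append(older)
    return draws

# hypothetical FADs for three outgroups, hard bound 130 Ma:
ages = sample_clade_age([98.0, 90.0, 85.0], t_max=130.0)
```

Quantiles of `ages` would then play the role of the age constraints; with more outgroups the distribution tightens toward the innermost fossil age, mirroring the paper's finding that more fossil outgroups yield more robust constraints.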


2013 ◽  
Vol 50 (01) ◽  
pp. 114-126 ◽  
Author(s):  
Hanjun Zhang ◽  
Yixia Zhu

We consider a birth–death process {X(t), t ≥ 0} on the positive integers for which the origin is an absorbing state, with birth coefficients λ_n, n ≥ 0, and death coefficients μ_n, n ≥ 0. If we define A = Σ_{n≥1} 1/(λ_n π_n) and S = Σ_{n≥1} (1/(λ_n π_n)) Σ_{i≥n+1} π_i, where {π_n, n ≥ 1} are the potential coefficients, it is a well-known fact (see van Doorn (1991)) that if A = ∞ and S < ∞, then λ_C > 0 and there is precisely one quasistationary distribution, namely {a_j(λ_C)}, where λ_C is the decay parameter of {X(t), t ≥ 0} in C = {1, 2, ...} and a_j(x) ≡ μ_1^{-1} π_j x Q_j(x), j = 1, 2, .... In this paper we prove that there is a unique quasistationary distribution that attracts all initial distributions supported in C if and only if the birth–death process {X(t), t ≥ 0} satisfies both A = ∞ and S < ∞. That is, for any probability measure M = {m_i, i = 1, 2, ...}, we have lim_{t→∞} ℙ_M(X(t) = j | T > t) = a_j(λ_C), j = 1, 2, ..., where T = inf{t ≥ 0 : X(t) = 0} is the extinction time of {X(t), t ≥ 0}, if and only if A = ∞ and S < ∞.
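A quasistationary distribution of this kind can be approximated numerically: truncate the generator to the states 1..n_max of C and take the left eigenvector of its dominant (largest, i.e. least negative) eigenvalue, whose magnitude approximates the decay parameter λ_C. This is a standard truncation sketch, not the paper's analytical machinery; accuracy depends on n_max, and the rates below are an arbitrary linear example.

```python
import numpy as np

def quasistationary(birth, death, n_max):
    """Approximate the quasistationary distribution of a birth-death
    process absorbed at 0 by truncating its generator to states
    1..n_max and normalizing the left eigenvector of the dominant
    eigenvalue. Row i of Q corresponds to state i+1; the death rate
    out of state 1 leads to the absorbing state, so it appears only
    on the diagonal."""
    Q = np.zeros((n_max, n_max))
    for i in range(n_max):
        lam, mu = birth(i + 1), death(i + 1)
        if i + 1 < n_max:
            Q[i, i + 1] = lam
        if i > 0:
            Q[i, i - 1] = mu
        Q[i, i] = -(lam + mu)
    vals, vecs = np.linalg.eig(Q.T)       # left eigenvectors of Q
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    v = np.abs(v)                          # dominant eigenvector is one-signed
    return v / v.sum()

# linear rates with deaths dominating births (an example where the
# process is absorbed almost surely):
qsd = quasistationary(lambda n: 0.5 * n, lambda n: 1.0 * n, 200)
```

Conditioning on survival (T > t) and letting t grow drives the state distribution toward exactly this eigenvector, which is the attraction property the paper characterizes.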


1983 ◽  
Vol 40 (12) ◽  
pp. 2153-2169 ◽  
Author(s):  
Jon Schnute

This paper presents a new approach to the use of removal data in estimating the size of a population of fish or other animals. The theory admits a variety of assumptions on how catchability varies among fishings including the assumption of constant catchability, which underlies most previous work. The methods here hinge on maximum likelihood estimation, and they can be used both to decide objectively if the data justify rejecting constant catchability and to determine confidence intervals for the parameters. The work includes a new method of assigning confidence to the population estimate and points out problems with methods currently available in the literature, even in the case of constant catchability. The theory is applied both to data in historical literature and to more recent data from streams in New Brunswick, Canada. These examples demonstrate that the assumption of constant catchability can frequently lead to serious errors in data interpretation. In some cases, the conclusion that the population size is well known may be blatantly false, and reasonable estimates may be impossible without further data.
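The constant-catchability baseline that this paper generalizes has a well-known closed form in the two-pass case (the classical Zippin/Seber removal estimator): with catches c1 and c2 on successive passes, N̂ = c1²/(c1 − c2) and p̂ = 1 − c2/c1. A minimal sketch of that baseline (the function name is mine):

```python
def two_pass_removal(c1, c2):
    """Closed-form population size and catchability estimates from two
    removal passes under constant catchability. Fails when the second
    catch does not show depletion (c2 >= c1), in which case the
    constant-catchability model gives no finite estimate."""
    if c2 >= c1:
        raise ValueError("no depletion between passes: model fails")
    n_hat = c1 ** 2 / (c1 - c2)
    p_hat = 1.0 - c2 / c1
    return n_hat, p_hat

n_hat, p_hat = two_pass_removal(60, 30)  # -> (120.0, 0.5)
```

The failure branch above is a toy version of the paper's point: when catchability varies among fishings, estimates built on the constant-catchability assumption can be badly wrong or undefined, and a fuller likelihood treatment is needed.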

