Likelihood ratios for the infinite alleles model

1994 ◽  
Vol 31 (3) ◽  
pp. 595-605 ◽  
Author(s):  
Paul Joyce

The stationary distribution for the population frequencies under an infinite alleles model is described as a random sequence (x1, x2, …) such that Σ xi = 1. Likelihood ratio theory is developed for random samples drawn from such populations. As a result of the theory, it is shown that any parameter distinguishing an infinite alleles model with selection from the neutral infinite alleles model cannot be consistently estimated based on gene frequencies at a single locus. Furthermore, the likelihood ratio (neutral versus selection) converges to a non-trivial random variable under both hypotheses. This shows that if one wishes to test a completely specified infinite alleles model with selection against neutrality, the test will not obtain power 1 in the limit.
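The single-locus likelihood under the neutral model referred to here is given by the Ewens sampling formula. As a minimal illustration (not the paper's method, and the value of θ below is arbitrary), the probability of an allele-count configuration can be computed directly:

```python
from math import factorial

def esf_prob(counts, theta):
    """Ewens sampling formula: probability of an allele-count
    configuration under the neutral infinite alleles model.
    counts[j-1] = number of allele types observed exactly j times."""
    n = sum(j * a for j, a in enumerate(counts, start=1))
    rising = 1.0
    for k in range(n):          # theta * (theta+1) * ... * (theta+n-1)
        rising *= theta + k
    p = factorial(n) / rising
    for j, a in enumerate(counts, start=1):
        p *= (theta / j) ** a / factorial(a)
    return p

# e.g. a sample of 4 genes split into one pair and two singletons
print(esf_prob((2, 1, 0, 0), theta=1.0))  # → 0.25
```

Summing `esf_prob` over all configurations of a fixed sample size gives 1, which makes it usable as the neutral likelihood in a ratio against a selective alternative.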


Entropy ◽  
2019 ◽  
Vol 21 (1) ◽  
pp. 63 ◽  
Author(s):  
Michel Broniatowski ◽  
Jana Jurečková ◽  
Ashok Moses ◽  
Emilie Miranda

This paper focuses on test procedures under corrupted data. We assume that the observations Zi are mismeasured due to the presence of measurement errors. Thus, instead of Zi for i = 1, …, n, we observe Xi = Zi + δVi, with an unknown parameter δ and an unobservable random variable Vi. It is assumed that the random variables Zi are i.i.d., as are the Xi and the Vi. The test procedure aims at deciding between two simple hypotheses pertaining to the density of the variable Zi, namely f0 and g0. In this setting, the density of the Vi is supposed to be known. The proposed procedure aggregates likelihood ratios for a collection of values of δ. A new definition of least-favorable hypotheses for the aggregate family of tests is presented, and a relation with the Kullback-Leibler divergence between the families {fδ} and {gδ} is established. Finite-sample lower bounds for the power of these tests are obtained, both through analytical inequalities and through simulation under the least-favorable hypotheses. Since no optimality holds for the aggregation of likelihood ratio tests, a similar procedure is proposed in which the individual likelihood ratios are replaced by divergence-based test statistics. It is shown and discussed that the resulting aggregated test may perform better than the aggregated likelihood ratio procedure.
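The aggregation idea can be sketched in a toy setting invented for illustration (the densities, contamination law, δ grid, and the choice of averaging as the aggregation rule are all assumptions, not the paper's): take f0 = N(0,1), g0 = N(1,1), and V ~ N(0,1), so each Xi is normal with variance inflated to 1 + δ²:

```python
import numpy as np

def norm_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def aggregated_llr(x, deltas, mu0=0.0, mu1=1.0):
    """Aggregate (here: average) the per-delta log-likelihood ratios.
    Under Z ~ N(mu, 1) and V ~ N(0, 1), X = Z + delta*V is
    N(mu, 1 + delta**2), so each per-delta LR has a closed form."""
    stats = []
    for d in deltas:
        var = 1.0 + d ** 2
        stats.append(np.sum(np.log(norm_pdf(x, mu1, var) / norm_pdf(x, mu0, var))))
    return float(np.mean(stats))

rng = np.random.default_rng(0)
deltas = [0.0, 0.5, 1.0]
true_delta = 0.5
# samples generated under g0 (H1) and f0 (H0), both contaminated
x_h1 = rng.normal(1.0, 1.0, 200) + true_delta * rng.normal(0.0, 1.0, 200)
x_h0 = rng.normal(0.0, 1.0, 200) + true_delta * rng.normal(0.0, 1.0, 200)
```

The statistic is positive under g0 and negative under f0 even though the true δ need not be on the grid, which is the point of aggregating over a collection of δ values.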


1997 ◽  
Vol 11 (3) ◽  
pp. 395-402 ◽  
Author(s):  
Jorge Navarro ◽  
Felix Belzunce ◽  
Jose M. Ruiz

The purpose of this paper is to study definitions and characterizations of orders based on reliability measures related to the doubly truncated random variable X[x, y] = (X | x ≤ X ≤ y). The relationship between these orderings and various existing orderings of life distributions is discussed. Moreover, we give two new characterizations of the likelihood ratio order based on double truncation. These new orders complete a general diagram of orders defined from truncation.
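One standard reading of the likelihood ratio order X ≤lr Y is that the density ratio g(t)/f(t) is nondecreasing in t. A small numerical sketch (an illustration on a grid, not a proof, with the two exponential laws chosen arbitrarily) checks this condition:

```python
import numpy as np

def lr_ordered(f, g, grid, tol=1e-12):
    """Check numerically (on a grid; a sketch, not a proof) whether
    X <=_lr Y, i.e. whether the density ratio g(t)/f(t) is nondecreasing."""
    ratio = g(grid) / f(grid)
    return bool(np.all(np.diff(ratio) >= -tol))

grid = np.linspace(0.01, 10.0, 500)
f = lambda t: 2.0 * np.exp(-2.0 * t)   # density of X ~ Exp(rate 2)
g = lambda t: np.exp(-t)               # density of Y ~ Exp(rate 1)

print(lr_ordered(f, g, grid))  # True: Exp(rate 2) <=_lr Exp(rate 1)
```

Here g(t)/f(t) = e^t / 2 is increasing, so the order holds in one direction and fails in the other.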


Universe ◽  
2021 ◽  
Vol 7 (6) ◽  
pp. 174
Author(s):  
Karl Wette

The likelihood ratio for a continuous gravitational wave signal is viewed geometrically as a function of the orientation of two vectors: one representing the optimal signal-to-noise ratio, and the other representing the maximised likelihood ratio or F-statistic. Analytic marginalisation over the angle between the vectors yields a marginalised likelihood ratio, which is a function of the F-statistic. Further analytic marginalisation over the optimal signal-to-noise ratio is explored using different choices of prior. Monte-Carlo simulations show that the marginalised likelihood ratios have detection power identical to that of the F-statistic. This approach demonstrates a route to viewing the F-statistic in a Bayesian context, while retaining the advantages of its efficient computation.


Genetics ◽  
2000 ◽  
Vol 155 (2) ◽  
pp. 499-508 ◽  
Author(s):  
Bruce Rannala ◽  
Wei-Gang Qiu ◽  
Daniel E Dykhuizen

Abstract Recent breakthroughs in molecular technology, most significantly the polymerase chain reaction (PCR) and in situ hybridization, have allowed the detection of genetic variation in bacterial communities without prior cultivation. These methods, however, often produce data in the form of the presence or absence of alleles or genotypes rather than allele counts. Using relative allele frequencies from presence-absence data as estimates of population allele frequencies tends to underestimate the frequencies of common alleles and overestimate those of rare ones, potentially biasing the results of a test of neutrality in favor of balancing selection. In this study, a maximum-likelihood estimator (MLE) of bacterial allele frequencies designed for use with presence-absence data is derived using an explicit stochastic model of the host infection (or bacterial sampling) process. The performance of the MLE is evaluated using computer simulation and a method is presented for evaluating the fit of estimated allele frequencies to the neutral infinite alleles model (IAM). The methods are applied to estimate allele frequencies at two outer surface protein loci (ospA and ospC) of the Lyme disease spirochete, Borrelia burgdorferi, infecting local populations of deer ticks (Ixodes scapularis) and to test the fit to a neutral IAM.
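The bias the abstract describes can be seen in a deliberately simplified model (my toy setup, not the paper's full infection model): if each of n hosts harbours k independently sampled bacteria and an allele is scored "present" when at least one of the k carries it, then P(present) = 1 − (1 − p)^k, and the MLE of p inverts that relation:

```python
def allele_freq_mle(m, n, k):
    """MLE of an allele frequency p from presence-absence data under a
    deliberately simplified model (not the paper's full infection model):
    each of n hosts harbours k independently sampled bacteria, and the
    allele is scored 'present' if at least one of the k carries it,
    so P(present) = 1 - (1 - p)**k.  m = number of hosts scored present."""
    return 1.0 - (1.0 - m / n) ** (1.0 / k)

# allele detected in 832 of 1000 hosts, with k = 5 bacteria per host
print(allele_freq_mle(832, 1000, 5))  # ≈ 0.30, well below the naive 0.832
```

The gap between the raw presence fraction (0.832) and the MLE (≈0.30) illustrates how presence-absence frequencies distort allele frequencies when used uncorrected.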


2020 ◽  
Vol 20 (1) ◽  
Author(s):  
Kjell Torén ◽  
Linus Schiöler ◽  
Jonas Brisman ◽  
Andrei Malinovschi ◽  
Anna-Carin Olin ◽  
...  

Abstract Background There is low diagnostic accuracy of the proxy restrictive spirometric pattern (RSP) for identifying true pulmonary restriction. This knowledge is based on patients referred for spirometry and total lung volume determination by plethysmography, the single-breath nitrogen washout technique, or gas dilution, together with selected controls. There is, however, a lack of data from general populations analyzing whether RSP is a valid proxy for true pulmonary restriction. We have validated RSP against true pulmonary restriction in a general population with access to both measurements of total lung capacity (TLC) and spirometry. Methods The data came from the Swedish CArdioPulmonary bioImage Study (SCAPIS Pilot), a general population-based study comprising 983 adults aged 50–64. All subjects answered a respiratory questionnaire. Forced expiratory volume in 1 s (FEV1) and forced vital capacity (FVC) were obtained before and after bronchodilation. TLC and residual volume (RV) were recorded using a body plethysmograph. All lung function values are expressed as percent predicted (% predicted) or in relation to lower limits of normal (LLN). True pulmonary restriction was defined as TLC < LLN5, i.e., a Z score < −1.645 (the fifth percentile). RSP was defined as FEV1/FVC ≥ LLN and FVC < LLN after bronchodilation. Sensitivity, specificity, and positive and negative likelihood ratios were calculated, together with 95% confidence intervals (CIs). Results The prevalence of true pulmonary restriction was 5.4%, and the prevalence of RSP was 3.4%. The sensitivity of RSP for identifying true pulmonary restriction was 0.34 (0.20–0.46), the corresponding specificity was 0.98 (0.97–0.99), the positive likelihood ratio was 21.1 (11.3–39.4), and the negative likelihood ratio was 0.67 (0.55–0.81). Conclusions RSP has low accuracy for identifying true pulmonary restriction. The results support previous observations that RSP is useful for ruling out true pulmonary restriction.
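The reported metrics follow from a 2×2 diagnostic table in the usual way. A sketch (the cell counts below are reconstructed approximately from the reported prevalences and are illustrative only, not the study's actual table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and the positive/negative likelihood
    ratios from a 2x2 diagnostic table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens / (1.0 - spec), (1.0 - sens) / spec

# counts reconstructed approximately from 983 subjects, 5.4% with true
# restriction and 3.4% RSP-positive; illustrative, not the paper's table
sens, spec, lr_pos, lr_neg = diagnostic_metrics(tp=18, fp=15, fn=35, tn=915)
print(round(sens, 2), round(spec, 2), round(lr_pos, 1), round(lr_neg, 2))
# → 0.34 0.98 21.1 0.67
```

Note how a very high LR+ can coexist with an LR− close to 1: a positive RSP strongly raises the odds of restriction, but a negative RSP barely lowers them.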


2019 ◽  
Vol 36 (4) ◽  
pp. 253-8
Author(s):  
Ysel Cabrera ◽  
Melissa Martínez ◽  
Almendra Cabello ◽  
Samuel Pereyra ◽  
...  

Objective: To evaluate alanine aminotransferase (ALT) as a marker for the diagnosis of metabolic syndrome (MS) and cardiovascular risk (CVR) in children with exogenous obesity. Materials and methods: Cross-sectional study. Children with exogenous obesity aged 2 to 14 years, seen at the Pediatric Endocrinology Unit of Hospital Nacional Cayetano Heredia between 2014 and 2018, were included. Non-alcoholic liver disease (NAFLD) was defined using two ALT cut-off points: >22.1 U/L and >44 U/L in girls, and >25.8 U/L and >50 U/L in boys. MS was defined according to the American Academy of Pediatrics, and CVR as TG/HDL-C ≥ 3.5. The chi-squared test was applied, with p < 0.05 considered significant. Sensitivity (S), specificity (Sp), and likelihood ratios (LR) were estimated. Results: 347 obese children were included (54.7% boys). The frequency of NAFLD was 23.1%. The sensitivity and specificity for the diagnosis of MS with ALT >22.1 U/L and >25.8 U/L were 79.4% and 37.6%, respectively, and with ALT >44 U/L and >50 U/L they were 28.6% and 83.3%. ALT at the higher cut-off point combined with TG/HDL-C ≥ 3.5 showed a specificity of 96.9% and a positive likelihood ratio (LR+) of 6.7. Conclusion: ALT with a cut-off point of >44 U/L in girls and >50 U/L in boys is a useful biochemical marker for identifying MS and cardiovascular risk in children with exogenous obesity from the first years of life.


1978 ◽  
Vol 110 (12) ◽  
pp. 1241-1246 ◽  
Author(s):  
R. H. Gooding ◽  
B. M. Rolseth

Abstract The digestive section of the midgut of Glossina morsitans morsitans Westwood contains a phosphatase with a pH optimum of approximately 9.2 and low substrate specificity; the enzyme was classified as an alkaline phosphatase (E.C. 3.1.3.1). Polyacrylamide gel (6%) electrophoresis (at pH 8.9) of the digestive portion of the midguts of adult G. morsitans morsitans revealed three alkaline phosphatase phenotypes. Midgut phosphatase was postulated to be under the control of a single locus (designated alkph) with two alleles. Gene frequencies were in Hardy-Weinberg equilibrium in two laboratory populations, while a third, highly inbred population had only one phenotype. Phenotype frequencies were not significantly different among females of various ages from the Edmonton colony. Breeding experiments provided direct evidence for single-locus control of midgut phosphatase.
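A Hardy-Weinberg check of the kind reported here is typically a Pearson goodness-of-fit test on genotype counts. A minimal sketch (the counts below are invented for illustration; the paper's data are not reproduced):

```python
def hwe_chisq(n_aa, n_ab, n_bb):
    """Pearson chi-square statistic (1 d.f.) for Hardy-Weinberg
    equilibrium at a two-allele locus, from observed genotype counts."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)          # frequency of allele a
    q = 1.0 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    observed = (n_aa, n_ab, n_bb)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# counts exactly in Hardy-Weinberg proportions give a statistic of 0
print(hwe_chisq(25, 50, 25))  # → 0.0
```

Values above the 1-d.f. critical value 3.84 would indicate departure from equilibrium at the 5% level.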


Author(s):  
Shunichi Ishihara

This study is one of the first likelihood-ratio-based forensic text comparison studies in forensic authorship analysis. The likelihood-ratio-based evaluation of scientific evidence has begun to be adopted in many disciplines of forensic evidence comparison, such as DNA, handwriting, fingerprints, footwear, voice recording, etc., and it is widely accepted that this is the way to ensure maximum accountability and transparency of the process. Due to its convenience and low cost, the short message service (SMS) has been a very popular medium of communication for quite some time. Unfortunately, however, SMS messages are sometimes used for reprehensible purposes, e.g., communication between drug dealers and buyers, or in illicit acts such as extortion, fraud, scams, hoaxes, and false reports of terrorist threats. In this study, the author performs a likelihood-ratio-based forensic text comparison of SMS messages focusing on lexical features. The likelihood ratios (LRs) are calculated with Aitken and Lucy's (2004) multivariate kernel density procedure and are calibrated. The validity of the system is assessed based on the magnitude of the LRs using the log-likelihood-ratio cost (Cllr). The strength of the derived LRs is presented graphically in Tippett plots. The results of the current study are compared with those of previous studies.
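The Cllr validity metric mentioned above has a simple closed form over two sets of system outputs: LRs from same-source pairs and LRs from different-source pairs. A sketch of the standard definition (the example LR values are invented):

```python
from math import log2

def cllr(same_source_lrs, diff_source_lrs):
    """Log-likelihood-ratio cost (Cllr): penalises same-source LRs
    below 1 and different-source LRs above 1.  A system that always
    outputs LR = 1 scores exactly 1.0; a well-calibrated, highly
    discriminating system scores close to 0."""
    ss = sum(log2(1.0 + 1.0 / lr) for lr in same_source_lrs) / len(same_source_lrs)
    ds = sum(log2(1.0 + lr) for lr in diff_source_lrs) / len(diff_source_lrs)
    return 0.5 * (ss + ds)

print(cllr([1.0, 1.0], [1.0, 1.0]))  # → 1.0 (uninformative system)
```

Because it weights LRs by their magnitude rather than just counting misclassifications, Cllr rewards calibration as well as discrimination, which is why it is preferred for validating forensic LR systems.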


1989 ◽  
Vol 19 (1) ◽  
pp. 71-90 ◽  
Author(s):  
François Dufresne ◽  
Hans U. Gerber

Abstract The first method, essentially due to Goovaerts and De Vylder, uses the connection between the probability of ruin and the maximal aggregate loss random variable, and the fact that the latter has a compound geometric distribution. For the second method, the claim amount distribution is supposed to be a combination of exponential or translated exponential distributions. Then the probability of ruin can be calculated in a transparent fashion; the main problem is to determine the nontrivial roots of the equation that defines the adjustment coefficient. For the third method one observes that the probability of ruin is related to the stationary distribution of a certain associated process; thus it can be determined by a single simulation of the latter. For the second and third methods the assumption of only proper (positive) claims is not needed.
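The first two ideas can be sketched in the simplest classical case, exponential claims with mean μ and premium loading θ, where the ruin probability has a closed form and the maximal aggregate loss is compound geometric with exponential ladder heights (a textbook Cramér-Lundberg sketch with illustrative parameters, not the paper's general procedures):

```python
import numpy as np

def ruin_prob_exponential(u, mu, theta):
    """Closed-form ruin probability for Exp(mean=mu) claims with
    premium loading theta (the transparent case of the second method)."""
    return np.exp(-theta * u / (mu * (1.0 + theta))) / (1.0 + theta)

def ruin_prob_simulated(u, mu, theta, n_sims=50_000, seed=0):
    """Monte-Carlo estimate via the maximal aggregate loss L, which is
    compound geometric (first method): the number of ladder epochs N has
    P(N = k) = (theta/(1+theta)) * (1/(1+theta))**k, and for exponential
    claims each ladder height is again Exp(mean=mu)."""
    rng = np.random.default_rng(seed)
    # rng.geometric counts trials to first success, so subtract 1
    n = rng.geometric(theta / (1.0 + theta), n_sims) - 1
    losses = np.array([rng.exponential(mu, k).sum() for k in n])
    return float(np.mean(losses > u))
```

With u = 2, μ = 1, θ = 0.25 the closed form gives ψ(2) ≈ 0.536, and the compound-geometric simulation agrees to Monte-Carlo accuracy, illustrating why the two routes to the ruin probability are interchangeable here.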

