Low concordance of short-term and long-term selection responses in experimental Drosophila populations

2019
Author(s):  
Anna Maria Langmüller ◽  
Christian Schlötterer

Abstract: Experimental evolution is becoming a popular approach to study the genomic selection response of evolving populations. Computer simulation studies suggest that the accuracy of the inferred selection signature increases with the duration of the experiment. Since some assumptions of these simulations may be violated in practice, it is important to scrutinize the influence of experimental duration with real data. Here, we use a highly replicated Evolve and Resequence study in Drosophila simulans to compare the selection targets inferred at different time points. At each time point, approximately the same number of SNPs deviates from neutral expectations, but only 10% of the selected haplotype blocks identified from the full data set can be detected after 20 generations. The haplotype blocks that are already detectable after 20 generations differ from the others in being strongly selected at the beginning of the experiment and in displaying a more parallel selection response. Consistent with previous computer simulations, our results demonstrate that only Evolve and Resequence experiments with a sufficient number of generations can characterize complex adaptive architectures.
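
To make the testing idea concrete, here is a minimal sketch of one common Evolve and Resequence analysis step: a Cochran-Mantel-Haenszel test of allele-frequency change across replicate populations, with one 2x2 table (allele counts at the start vs. a later generation) per replicate. This is a generic illustration with hypothetical counts, not the authors' actual pipeline.

```python
# Sketch: CMH test of allele-frequency change across E&R replicates.
import numpy as np
from statsmodels.stats.contingency_tables import StratifiedTable

def cmh_pvalue(counts_t0, counts_tn):
    """counts_* : (n_replicates, 2) arrays of [ref, alt] allele counts."""
    tables = [np.array([[t0[0], t0[1]],
                        [tn[0], tn[1]]])
              for t0, tn in zip(counts_t0, counts_tn)]
    return StratifiedTable(tables).test_null_odds().pvalue

# Five replicates, hypothetical counts for one SNP.
rng = np.random.default_rng(1)
t0 = rng.multinomial(100, [0.5, 0.5], size=5)   # generation 0
tn = rng.multinomial(100, [0.3, 0.7], size=5)   # generation 20
print(f"CMH p-value: {cmh_pvalue(t0, tn):.3g}")
```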

1999
Vol 32 (3)
pp. 554-562
Author(s):  
Steffi Arzt ◽  
John W. Campbell ◽  
Marjorie M. Harding ◽  
Q. Hao ◽  
John R. Helliwell

A new program in the Daresbury Laue software suite has been developed for the scaling and normalization of Laue intensity data, to yield fully corrected structure amplitudes. Previously available routines have been improved, and additional options for refinement, control and statistical diagnostic output have been provided. A new feature, namely a wavelength- and position-dependent absorption correction that models a two-dimensional surface derived from the Laue data alone, is discussed in detail; it is tested on simulated and real data, and the improvement in data quality is demonstrated. The wavelength normalization function is now able, when sufficiently redundant experimental data are available, to model fine details such as the features arising from the modification of the incident intensity spectrum by a platinum mirror in the beamline optics. A full data set for tetragonal lysozyme is processed with the new program, and extensive statistical output is given.
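
The idea of a wavelength- and position-dependent correction surface can be illustrated with a simple two-dimensional polynomial fit to empirical scale factors, as sketched below. This is purely illustrative: the program's own parametrization and refinement machinery are not reproduced here, and all variable names and data are hypothetical.

```python
# Sketch: fit a smooth 2-D surface scale ~ f(wavelength, position)
# by linear least squares, as a stand-in for an absorption correction.
import numpy as np

def fit_surface(lam, pos, scale, deg=2):
    """Least-squares fit of scale ~ poly(lam, pos) up to total degree deg."""
    terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1 - i)]
    A = np.column_stack([lam**i * pos**j for i, j in terms])
    coef, *_ = np.linalg.lstsq(A, scale, rcond=None)
    return terms, coef

def eval_surface(terms, coef, lam, pos):
    return sum(c * lam**i * pos**j for (i, j), c in zip(terms, coef))

# Hypothetical data: wavelength (Angstrom), detector radius (mm), scale.
rng = np.random.default_rng(0)
lam = rng.uniform(0.5, 2.0, 500)
pos = rng.uniform(0.0, 100.0, 500)
true = 1.0 + 0.3 * lam - 0.002 * pos + 0.05 * lam**2
scale = true + rng.normal(0, 0.01, lam.size)

terms, coef = fit_surface(lam, pos, scale)
resid = scale - eval_surface(terms, coef, lam, pos)
print(f"rms residual after surface fit: {resid.std():.4f}")
```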


2019
Vol XVI (2)
pp. 1-11
Author(s):  
Farrukh Jamal ◽  
Hesham Mohammed Reyad ◽  
Soha Othman Ahmed ◽  
Muhammad Akbar Ali Shah ◽  
Emrah Altun

A new three-parameter continuous model called the exponentiated half-logistic Lomax distribution is introduced in this paper. Basic mathematical properties of the proposed model are investigated, including raw and incomplete moments, skewness, kurtosis, generating functions, Rényi entropy, the Lorenz, Bonferroni and Zenga curves, probability weighted moments, the stress-strength model, order statistics, and record statistics. The model parameters are estimated by the maximum likelihood criterion, and the behaviour of these estimates is examined in a simulation study. The applicability of the new model is illustrated by applying it to a real data set.
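
The fit-then-simulate workflow described above can be sketched in a few lines: numerically maximize a log-likelihood, then assess the estimator by Monte Carlo mean squared error. Since the three-parameter EHL-Lomax density is not reproduced here, the plain two-parameter Lomax baseline stands in; the structure of the study is the same.

```python
# Sketch: ML fitting plus a small Monte Carlo MSE study,
# with the Lomax distribution as a stand-in model.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lomax

def neg_loglik(params, x):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return -lomax.logpdf(x, shape, scale=scale).sum()

def fit_ml(x):
    res = minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
    return res.x

# Simulation study: MSE of the ML estimate of the shape parameter.
rng = np.random.default_rng(42)
true_shape, true_scale, n, reps = 2.0, 1.5, 200, 300
est = np.array([fit_ml(lomax.rvs(true_shape, scale=true_scale,
                                 size=n, random_state=rng))
                for _ in range(reps)])
mse = ((est[:, 0] - true_shape) ** 2).mean()
print(f"MSE(shape): {mse:.4f}")
```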


Author(s):  
Parisa Torkaman

The generalized inverted exponential distribution has been introduced as a lifetime model with good statistical properties. In this paper, estimation of its probability density function and cumulative distribution function is considered with five different estimation methods: uniformly minimum variance unbiased (UMVU), maximum likelihood (ML), least squares (LS), weighted least squares (WLS), and percentile (PC) estimators. The performance of these estimation procedures is compared by numerical simulation on the basis of mean squared error (MSE). The simulation studies show that the UMVU estimator performs better than the others, that the ML and UMVU estimators are almost equivalent for sufficiently large samples, and that both are more efficient than the LS, WLS, and PC estimators. Finally, a real data set is analyzed for illustration.
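
A minimal sketch of such a comparison follows, for two of the methods named above (ML and least squares based on the empirical CDF), assuming the usual parametrization of the generalized inverted exponential CDF, F(x) = 1 - (1 - exp(-lam/x))^alpha. Illustrative only; sample sizes and parameter values are hypothetical.

```python
# Sketch: Monte Carlo MSE comparison of ML and LS estimators
# for the generalized inverted exponential distribution.
import numpy as np
from scipy.optimize import minimize

def gie_cdf(x, alpha, lam):
    return 1.0 - (1.0 - np.exp(-lam / x)) ** alpha

def gie_rvs(alpha, lam, size, rng):
    u = rng.uniform(size=size)                     # inverse-CDF sampling
    return -lam / np.log1p(-(1.0 - u) ** (1.0 / alpha))

def neg_loglik(p, x):
    a, l = p
    if a <= 0 or l <= 0:
        return np.inf
    return -np.sum(np.log(a * l) - 2 * np.log(x) - l / x
                   + (a - 1) * np.log1p(-np.exp(-l / x)))

def ls_objective(p, x):
    a, l = p
    if a <= 0 or l <= 0:
        return np.inf
    xs = np.sort(x)
    pi = np.arange(1, x.size + 1) / (x.size + 1)   # plotting positions
    return np.sum((gie_cdf(xs, a, l) - pi) ** 2)

rng = np.random.default_rng(7)
alpha, lam, n, reps = 1.5, 2.0, 100, 200
ml_err, ls_err = [], []
for _ in range(reps):
    x = gie_rvs(alpha, lam, n, rng)
    ml = minimize(neg_loglik, [1, 1], args=(x,), method="Nelder-Mead").x
    ls = minimize(ls_objective, [1, 1], args=(x,), method="Nelder-Mead").x
    ml_err.append((ml[0] - alpha) ** 2)
    ls_err.append((ls[0] - alpha) ** 2)
print(f"MSE(alpha)  ML: {np.mean(ml_err):.4f}  LS: {np.mean(ls_err):.4f}")
```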


2019
Vol 14 (2)
pp. 148-156
Author(s):  
Nighat Noureen ◽  
Sahar Fazal ◽  
Muhammad Abdul Qadir ◽  
Muhammad Tanvir Afzal

Background: Specific combinations of histone modifications (HMs), which contribute to the histone code hypothesis, lead to various biological functions. HM combinations have been used in various studies to divide the genome into different regions, classified as chromatin states. Mostly Hidden Markov Model (HMM)-based techniques have been employed for this purpose, applied to data from Next Generation Sequencing (NGS) platforms. Chromatin states based on histone-modification combinatorics are annotated by mapping them to functional regions of the genome, and the number of states predicted by HMM tools has so far been justified only biologically. Objective: The present study aimed to provide a computational scheme for identifying the number of hidden states underlying the data under consideration. Methods: We propose a computational scheme, HCVS, based on hierarchical clustering and a visualization strategy. Results: We tested the proposed scheme on a real data set of nine cell types comprising nine chromatin marks. The approach successfully identified the state numbers for various possibilities, and the results show good correlation with one of the existing models. Conclusion: The HCVS model not only helps in deciding the optimal number of states for a particular data set but also justifies the results biologically, thereby linking the computational and biological aspects.
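
The core idea of clustering genomic bins by their histone-mark combinations and scanning candidate state numbers can be sketched as below. This is a generic illustration on simulated binary data, not the authors' actual HCVS scheme or selection criterion.

```python
# Sketch: hierarchically cluster genomic bins by binarized histone-mark
# patterns and scan candidate numbers of chromatin states.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(3)
# Hypothetical input: 300 genomic bins x 9 chromatin marks (binarized).
bins = rng.integers(0, 2, size=(300, 9))

dist = pdist(bins, metric="jaccard")     # dissimilarity of mark patterns
Z = linkage(dist, method="average")      # hierarchical clustering

# Cut the dendrogram at each candidate state number and report the
# mean within-state dissimilarity as a simple quality indicator.
D = squareform(dist)
for k in range(2, 11):
    labels = fcluster(Z, t=k, criterion="maxclust")
    within = np.mean([D[np.ix_(labels == c, labels == c)].mean()
                      for c in np.unique(labels)])
    print(f"states = {k:2d}  mean within-state dissimilarity = {within:.3f}")
```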


Author(s):  
Peter R. Monge ◽  
Noshir Contractor

To date, most network research suffers from one or more of five major problems. First, it tends to be atheoretical, ignoring the various social theories that carry network implications. Second, it explores single levels of analysis rather than the multiple levels of which most networks are composed. Third, network analysis has made little use of the insights from contemporary complex systems analysis and computer simulations. Fourth, it typically uses descriptive rather than inferential statistics, robbing it of the ability to make claims about the larger universe of networks. Finally, almost all the research is static and cross-sectional rather than dynamic. Theories of Communication Networks presents solutions to all five problems. The authors develop a multitheoretical model that relates different social science theories to different network properties. This model is multilevel, providing a network decomposition that applies the various social theories to all network levels: individuals, dyads, triads, groups, and the entire network. The book then establishes a model from the perspective of complex adaptive systems and demonstrates how to use Blanche, an agent-based network computer simulation environment, to generate and test network theories and hypotheses. It presents recent developments in network statistical analysis, the p* family, which provides a basis for valid multilevel statistical inferences regarding networks. Finally, it shows how to relate communication networks to other networks, providing the basis, in conjunction with computer simulations, for studying the emergence of dynamic organizational networks.
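
Models in the p* (exponential random graph) family are, in practice, often fitted by maximum pseudolikelihood: each directed tie is logistic-regressed on the change in the network statistics that its presence causes. The sketch below shows this with two statistics (edge count and reciprocity) on a random graph; it is a generic illustration of the fitting idea, not the book's Blanche environment or its statistical software.

```python
# Sketch: maximum pseudolikelihood fitting of a two-parameter
# p* / ERGM-style model (edges + reciprocity) for a directed network.
import numpy as np
from scipy.optimize import minimize

def pseudolikelihood_fit(y):
    """y: n x n binary adjacency matrix (no self-ties)."""
    n = y.shape[0]
    ties, feats = [], []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            ties.append(y[i, j])
            feats.append([1.0, y[j, i]])  # change stats: edge, reciprocity
    t, X = np.array(ties), np.array(feats)

    def nll(theta):  # negative logistic log-likelihood
        eta = X @ theta
        return np.sum(np.logaddexp(0.0, eta) - t * eta)

    return minimize(nll, np.zeros(2), method="BFGS").x

rng = np.random.default_rng(11)
y = (rng.uniform(size=(30, 30)) < 0.1).astype(float)
np.fill_diagonal(y, 0)
theta_edge, theta_recip = pseudolikelihood_fit(y)
print(f"edge: {theta_edge:.2f}  reciprocity: {theta_recip:.2f}")
```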


2021
Vol 13 (9)
pp. 1703
Author(s):  
He Yan ◽  
Chao Chen ◽  
Guodong Jin ◽  
Jindong Zhang ◽  
Xudong Wang ◽  
...  

The traditional method of constant false-alarm rate detection is based on an assumed statistical model of the echo. Against a background of sea clutter and other interference, its target recognition accuracy is low and its false-alarm rate is high. Computer vision techniques are therefore widely discussed as a way to improve detection performance. However, the majority of studies have focused on synthetic aperture radar because of its high resolution; for defense radar, with its low resolution, the detection performance is not satisfactory. To this end, we propose a novel target detection method for coastal defense radar based on the faster region-based convolutional neural network (Faster R-CNN). The main processing steps are as follows: (1) Faster R-CNN is selected as the sea-surface target detector because of its high target detection accuracy; (2) the network is modified to suit the sparsity and small target size characteristic of the data set; and (3) soft non-maximum suppression is exploited to eliminate possibly overlapping detection boxes. Furthermore, detailed comparative experiments on a real coastal defense radar data set are performed. The mean average precision of the proposed method is improved by 10.86% compared with that of the original Faster R-CNN.
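
The soft non-maximum suppression step in (3) can be sketched compactly: instead of discarding boxes that overlap a higher-scoring detection, their scores are decayed (here with the Gaussian variant) and only boxes falling below a threshold are dropped. The parameter values below are hypothetical, not those used in the paper.

```python
# Sketch: Gaussian soft-NMS over detection boxes in (x1, y1, x2, y2) form.
import numpy as np

def iou(box, boxes):
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = lambda b: (b[..., 2] - b[..., 0]) * (b[..., 3] - b[..., 1])
    return inter / (area(box) + area(boxes) - inter)

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    boxes, scores = boxes.astype(float), scores.astype(float).copy()
    keep, idx = [], np.arange(len(scores))
    while idx.size:
        best = idx[np.argmax(scores[idx])]
        keep.append(best)
        idx = idx[idx != best]
        ious = iou(boxes[best], boxes[idx])
        scores[idx] *= np.exp(-(ious ** 2) / sigma)  # decay, don't discard
        idx = idx[scores[idx] > score_thresh]
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [20, 20, 30, 30]])
scores = np.array([0.9, 0.8, 0.7])
print(soft_nms(boxes, scores))  # the overlapping box is down-weighted, kept
```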


2021
Vol 1978 (1)
pp. 012047
Author(s):  
Xiaona Sheng ◽  
Yuqiu Ma ◽  
Jiabin Zhou ◽  
Jingjing Zhou

2021
pp. 1-11
Author(s):  
Velichka Traneva ◽  
Stoyan Tranev

Analysis of variance (ANOVA), developed by Fisher, is an important method in data analysis. There are situations in which the data are imprecise. To analyze such data, this paper introduces, for the first time, an intuitionistic fuzzy two-factor ANOVA (2-D IFANOVA) without replication, as an extension of the classical ANOVA and the one-way IFANOVA, for the case where the data are intuitionistic fuzzy rather than real numbers. The proposed approach employs the apparatus of intuitionistic fuzzy sets (IFSs) and index matrices (IMs). The paper also analyzes a unique data set of daily ticket sales over one year in a multiplex of Cinema City Bulgaria, part of the Cineworld PLC Group, applying both the classical two-factor ANOVA and the proposed 2-D IFANOVA to study the influence of the "season" and "ticket price" factors. A comparative analysis of the results obtained from ANOVA and 2-D IFANOVA on this real data set is also presented.
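
For reference, the classical two-factor ANOVA without replication that the paper extends can be sketched as follows, with one observation per season x price-band cell. The numbers are hypothetical; the intuitionistic fuzzy extension itself is not reproduced here.

```python
# Sketch: two-factor ANOVA without replication (one obs per cell).
import numpy as np
from scipy.stats import f as f_dist

def two_way_anova_no_rep(y):
    """y: a x b matrix, one observation per factor-level combination."""
    a, b = y.shape
    grand = y.mean()
    ss_rows = b * ((y.mean(axis=1) - grand) ** 2).sum()
    ss_cols = a * ((y.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((y - grand) ** 2).sum() - ss_rows - ss_cols
    df_r, df_c, df_e = a - 1, b - 1, (a - 1) * (b - 1)
    f_r = (ss_rows / df_r) / (ss_err / df_e)
    f_c = (ss_cols / df_c) / (ss_err / df_e)
    return (f_r, f_dist.sf(f_r, df_r, df_e)), (f_c, f_dist.sf(f_c, df_c, df_e))

# Rows: 4 seasons; columns: 3 hypothetical ticket-price bands.
sales = np.array([[120.0,  95.0, 70.0],
                  [150.0, 110.0, 80.0],
                  [135.0, 100.0, 75.0],
                  [180.0, 140.0, 95.0]])
(row_F, row_p), (col_F, col_p) = two_way_anova_no_rep(sales)
print(f"season: F={row_F:.2f} p={row_p:.3g}   price: F={col_F:.2f} p={col_p:.3g}")
```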


Genetics
1998
Vol 149 (3)
pp. 1547-1555
Author(s):  
Wouter Coppieters ◽  
Alexandre Kvasz ◽  
Frédéric Farnir ◽  
Juan-Jose Arranz ◽  
Bernard Grisart ◽  
...  

Abstract: We describe the development of a multipoint nonparametric quantitative trait loci mapping method, based on the Wilcoxon rank-sum test, applicable to outbred half-sib pedigrees. The method was evaluated on a simulated data set and its efficiency compared with regression-based interval mapping. The rank-based approach is slightly inferior to regression when the residual variance is homoscedastic normal; however, in three of the four other scenarios envisaged, i.e., residual variance heteroscedastic normal, homoscedastic skewed, and homoscedastic positively kurtosed, the rank-based approach outperforms regression. Both methods were applied to a real data set to analyze the effect of bovine chromosome 6 on milk yield and composition, using a 125-cM map comprising 15 microsatellites and a granddaughter design counting 1158 Holstein-Friesian sires.
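
The core idea at a single marker can be sketched simply: within one half-sib family, offspring are split by which of the sire's marker alleles they inherited, and a Wilcoxon rank-sum test asks whether the trait distributions of the two groups differ. The paper's multipoint machinery is not reproduced here, and the data below are simulated.

```python
# Sketch: single-marker rank-sum test within one half-sib family.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(5)
n = 400                                        # offspring in one family
allele = rng.integers(0, 2, size=n)            # inherited paternal allele
qtl_effect = 0.4 * allele                      # hypothetical linked QTL
trait = qtl_effect + rng.normal(0, 1, size=n)  # milk-yield-like phenotype

stat, p = ranksums(trait[allele == 0], trait[allele == 1])
print(f"rank-sum statistic: {stat:.2f}, p-value: {p:.3g}")
```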


Entropy
2021
Vol 23 (8)
pp. 934
Author(s):  
Yuxuan Zhang ◽  
Kaiwei Liu ◽  
Wenhao Gui

To improve the statistical efficiency of estimators in life-testing experiments, generalized Type-I hybrid censoring has lately been implemented, guaranteeing that experiments terminate only after a certain number of failures have occurred. Given the wide application of bathtub-shaped distributions in engineering and the recently introduced generalized Type-I hybrid censoring scheme, and since no existing work combines this type of censoring with a bathtub-shaped distribution, we consider parameter inference under generalized Type-I hybrid censoring. First, estimates of the unknown scale parameter and the reliability function are obtained by the Bayesian method, based on LINEX and squared error loss functions with a conjugate gamma prior. E-Bayesian estimates under different prior distributions and loss functions are compared. Additionally, Bayesian and E-Bayesian estimation with two unknown parameters is introduced. Furthermore, to verify the robustness of the above estimates, a Monte Carlo simulation study is conducted. Finally, the practical application of the discussed inference is illustrated by analyzing a real data set.
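
The two loss functions can be contrasted in a minimal setting: an exponential lifetime model with a conjugate gamma prior on the rate. Under squared error loss the Bayes estimate is the posterior mean; under LINEX loss it is -(1/c) * ln E[exp(-c * rate)], which has a closed form for a gamma posterior. This sketch uses a complete (uncensored) sample and hypothetical numbers; the paper's bathtub-shaped model and the generalized hybrid-censored likelihood are not reproduced here.

```python
# Sketch: Bayes estimates of an exponential rate under SEL and LINEX loss.
import numpy as np

def bayes_estimates(failure_times, a0, b0, c):
    """Gamma(a0, rate=b0) prior; c is the LINEX asymmetry parameter."""
    # Under generalized hybrid censoring, d and the total time on test
    # would come from the censored sample instead of a complete one.
    d, total = len(failure_times), np.sum(failure_times)
    a_post, b_post = a0 + d, b0 + total            # conjugate update
    sel = a_post / b_post                           # posterior mean (SEL)
    linex = (a_post / c) * np.log(1.0 + c / b_post) # -(1/c) ln E[e^{-c*rate}]
    return sel, linex

rng = np.random.default_rng(9)
data = rng.exponential(scale=1 / 0.8, size=30)      # true rate 0.8
sel, linex = bayes_estimates(data, a0=2.0, b0=2.0, c=1.0)
print(f"SEL estimate: {sel:.3f}   LINEX estimate: {linex:.3f}")
```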

