Clear: Composition of Likelihoods for Evolve And Resequence Experiments

2016 ◽  
Author(s):  
Arya Iranmehr ◽  
Ali Akbari ◽  
Christian Schlötterer ◽  
Vineet Bafna

Abstract The advent of next-generation sequencing technologies has made whole-genome and whole-population sampling possible, even for eukaryotes with large genomes. With this development, experimental evolution studies can be designed to observe molecular evolution "in action" via Evolve-and-Resequence (E&R) experiments. Among other applications, E&R studies can be used to locate the genes and variants responsible for genetic adaptation. Existing literature on time-series data analysis often assumes large population sizes, accurate allele frequency estimates, and wide time spans. These assumptions do not hold in many E&R studies. In this article, we propose a method, Composition of Likelihoods for Evolve-And-Resequence experiments (Clear), to identify signatures of selection in small-population E&R experiments. Clear takes whole-genome sequencing of pools of individuals (pool-seq) as input and properly addresses the heterogeneous ascertainment bias resulting from uneven coverage. Clear also provides unbiased estimates of model parameters, including population size, selection strength, and dominance, while being computationally efficient. Extensive simulations show that Clear achieves higher power in detecting and localizing selection over a wide range of parameters, and is robust to variation in coverage. We applied the Clear statistic to multiple E&R experiments, including data from a study of D. melanogaster adaptation to alternating temperatures and a study of outcrossing yeast populations, and identified multiple regions under selection with genome-wide significance.
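The modelling idea described here, a hidden Wright-Fisher allele-frequency path observed through binomially sampled pool-seq reads, can be sketched in a few lines. This is a minimal illustration under assumed parameters (a tiny population, diploid selection with dominance h), not the authors' Clear implementation; all function names are invented for the example.

```python
import numpy as np
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability mass function."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def wf_matrix(N, s=0.0, h=0.5):
    """(2N+1)x(2N+1) Wright-Fisher transition matrix for a diploid
    population of size N with selection coefficient s and dominance h."""
    f = np.arange(2 * N + 1) / (2 * N)
    num = f**2 * (1 + s) + f * (1 - f) * (1 + s * h)
    den = f**2 * (1 + s) + 2 * f * (1 - f) * (1 + s * h) + (1 - f) ** 2
    fp = num / den  # post-selection allele frequency
    return np.array([[binom_pmf(j, 2 * N, p) for j in range(2 * N + 1)] for p in fp])

def pool_seq_loglik(reads, coverage, times, N, p0_idx, s=0.0, h=0.5):
    """Log-likelihood of derived-read counts at the sampled generations,
    marginalising over the hidden frequency path (forward algorithm).
    Reads are modelled as Binomial(coverage, frequency), which is how
    uneven sequencing depth enters the likelihood."""
    T = wf_matrix(N, s, h)
    f = np.arange(2 * N + 1) / (2 * N)
    state = np.zeros(2 * N + 1)
    state[p0_idx] = 1.0  # known starting frequency
    ll, prev_t = 0.0, 0
    for d, c, t in zip(reads, coverage, times):
        state = state @ np.linalg.matrix_power(T, t - prev_t)
        emit = np.array([binom_pmf(d, c, fi) for fi in f])
        norm = state @ emit
        ll += np.log(norm)
        state = state * emit / norm  # posterior over hidden frequencies
        prev_t = t
    return ll
```

With data in which the derived allele rises sharply, the likelihood favors a positive selection coefficient over a negative one, which is the signal a scan over loci would exploit.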

2014 ◽  
Author(s):  
Jonathan Terhorst ◽  
Yun S. Song

Genomic time series data generated by evolve-and-resequence (E&R) experiments offer a powerful window into the mechanisms that drive evolution. However, standard population genetic inference procedures do not account for sampling serially over time, and new methods are needed to make full use of modern experimental evolution data. To address this problem, we develop a Gaussian process approximation to the multi-locus Wright-Fisher process with selection over a time course of tens of generations. The mean and covariance structure of the Gaussian process are obtained by computing the corresponding moments in discrete-time Wright-Fisher models conditioned on the presence of a linked selected site. This enables our method to account for the effects of linkage and selection, both along the genome and across sampled time points, in an approximate but principled manner. Using simulated data, we demonstrate the power of our method to correctly detect, locate and estimate the fitness of a selected allele from among several linked sites. We also study how this power changes for different values of selection strength, initial haplotypic diversity, population size, sampling frequency, experimental duration, number of replicates, and sequencing coverage depth. In addition to providing quantitative estimates of selection parameters from experimental evolution data, our model can be used by practitioners to design E&R experiments with requisite power. Finally, we explore how our likelihood-based approach can be used to infer other model parameters, including effective population size and recombination rate, and discuss extensions to more complex models.
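The first two moments that such a Gaussian process approximation matches have a closed form in the neutral single-locus case. A minimal sketch, assuming a diploid population of size N (2N gene copies) and ignoring linkage and selection, which the paper's full model handles:

```python
import numpy as np

def wf_neutral_moments(p0, N, t):
    """Mean and variance of the allele frequency after t generations of
    neutral Wright-Fisher drift, the moments a Gaussian process matches."""
    lam = 1.0 - 1.0 / (2 * N)
    mean = p0                                # drift is unbiased
    var = p0 * (1 - p0) * (1 - lam**t)       # closed form of the variance recursion
    return mean, var

def wf_neutral_cov(p0, N, times):
    """Covariance matrix across sampled generations: the neutral frequency
    is a martingale, so Cov(p_s, p_t) = Var(p_min(s,t))."""
    v = [wf_neutral_moments(p0, N, t)[1] for t in times]
    k = len(times)
    return np.array([[v[min(i, j)] for j in range(k)] for i in range(k)])
```

The closed form follows from iterating the one-generation recursion V(t+1) = p0(1-p0)/(2N) + (1 - 1/(2N)) V(t).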


Author(s):  
Edward J. Oughton

Space weather is a collective term for different solar or space phenomena that can detrimentally affect technology. However, current understanding of space weather hazards is still relatively embryonic in comparison to terrestrial natural hazards such as hurricanes, earthquakes, or tsunamis. Indeed, certain types of space weather such as large Coronal Mass Ejections (CMEs) are an archetypal example of a low-probability, high-severity hazard. Few major events, short time-series data, and the lack of consensus regarding the potential impacts on critical infrastructure have hampered the economic impact assessment of space weather. Yet, space weather has the potential to disrupt a wide range of Critical National Infrastructure (CNI) systems including electricity transmission, satellite communications and positioning, aviation, and rail transportation. In the early 21st century, there has been growing interest in these potential economic and societal impacts. Estimates range from millions of dollars of equipment damage from the Quebec 1989 event, to some analysts asserting that losses will be in the billions of dollars in the wider economy from potential future disaster scenarios. Hence, the origin and development of the socioeconomic evaluation of space weather is tracked, from 1989 to 2017, and future research directions for the field are articulated. Since 1989, many economic analyses of space weather hazards have often completely overlooked the physical impacts on infrastructure assets and the topology of different infrastructure networks. Moreover, too many studies have relied on qualitative assumptions about the vulnerability of CNI. By modeling both the vulnerability of critical infrastructure and the socioeconomic impacts of failure, the total potential impacts of space weather can be estimated, providing vital information for decision makers in government and industry.
Efforts on this subject have historically been relatively piecemeal, which has led to little exploration of model sensitivities, particularly in relation to different assumption sets about infrastructure failure and restoration. Improvements may be expedited in this research area by open-sourcing model code, increasing the existing level of data sharing, and improving multidisciplinary research collaborations between scientists, engineers, and economists.


Author(s):  
Frank Dobbin ◽  
Alexandra Kalev

Corporations have implemented a wide range of equal opportunity and diversity programs since the 1960s. This chapter reviews studies of the origins of these programs, surveys that assess the popularity of different programs, and research on the effects of programs on the workforce. Human resources managers championed several waves of innovations: corporate equal opportunity policies and recruitment and training programs in the 1960s; bureaucratic hiring and promotion policies and grievance mechanisms in the 1970s; diversity training, networking, and mentoring programs in the 1980s; and work/family and sexual harassment programs in the 1990s and beyond. It was those managers who designed equal opportunity and diversity programs, not lawyers, judges, or government bureaucrats; thus, corporate take-up of the programs remains very uneven. Statistical analyses of time-series data on the effects of corporate diversity measures reveal several patterns. Initiatives designed to quash managerial bias, through diversity training, diversity performance evaluations, and bureaucratic rules, have been broadly ineffective. By contrast, innovations designed to engage managers in promoting workforce integration—mentoring programs, diversity taskforces, and full-time diversity staffers—have led to increases in diversity in the most difficult job to integrate: management. The research has clear implications for corporate and public policy.


2020 ◽  
Vol 109 (11) ◽  
pp. 2029-2061
Author(s):  
Zahraa S. Abdallah ◽  
Mohamed Medhat Gaber

Abstract Time series classification (TSC) is a challenging task that has attracted many researchers in recent years. One main challenge in TSC is the diversity of domains that time series data come from; thus, there is no "one model that fits all" in TSC. Some algorithms are very accurate in classifying a specific type of time series when the whole series is considered, while others only target the existence/non-existence of specific patterns/shapelets. Yet other techniques focus on the frequency of occurrence of discriminating patterns/features. This paper presents a new classification technique that addresses the inherent diversity problem in TSC using a nature-inspired method. The technique is stimulated by how flies look at the world through "compound eyes" that are made up of thousands of lenses, called ommatidia. Each ommatidium is an eye with its own lens, and thousands of them together create a broad field of vision. The developed technique similarly uses different lenses and representations to look at the time series, and then combines them for broader visibility. These lenses have been created through hyper-parameterisation of symbolic representations (Piecewise Aggregate and Fourier approximations). The algorithm builds a random forest for each lens, then performs soft dynamic voting for classifying new instances using the most confident eyes, i.e., forests. We evaluate the new technique, coined Co-eye, using the recently released extended version of the UCR archive, containing more than 100 datasets across a wide range of domains. The results show the benefits of bringing together different perspectives, reflected in the accuracy and robustness of Co-eye in comparison to other state-of-the-art techniques.
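A toy version of two of the ingredients, a Piecewise Aggregate Approximation "lens" and confidence-based soft voting, might look as follows. This is a simplified reading of the abstract, not the published Co-eye code: the voting rule here (take the single most confident forest) is an illustrative assumption, and training the per-lens random forests is omitted.

```python
import numpy as np

def paa(series, n_segments):
    """Piecewise Aggregate Approximation: the series is reduced to the
    mean of each of n_segments (near-)equal-length segments."""
    series = np.asarray(series, dtype=float)
    return np.array([seg.mean() for seg in np.array_split(series, n_segments)])

def most_confident_vote(prob_per_lens):
    """Soft voting in the spirit of Co-eye: each 'lens' (a classifier
    trained on one representation) outputs class probabilities, and the
    prediction comes from the lens most confident about this instance."""
    probs = np.asarray(prob_per_lens)        # shape (n_lenses, n_classes)
    best_lens = probs.max(axis=1).argmax()   # lens with the highest peak probability
    return int(probs[best_lens].argmax())
```

In the full method, each lens would be a random forest trained on one hyper-parameterisation of a symbolic representation, and its `predict_proba` output would feed the voting step.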


Information ◽  
2019 ◽  
Vol 10 (12) ◽  
pp. 390 ◽  
Author(s):  
Ahmad Hassanat ◽  
Khalid Almohammadi ◽  
Esra’a Alkafaween ◽  
Eman Abunawas ◽  
Awni Hammouri ◽  
...  

A genetic algorithm (GA) is an artificial intelligence search method that uses the process of evolution and natural selection and falls under the umbrella of evolutionary computing. It is an efficient tool for solving optimization problems. The interplay among GA parameters is vital for a successful GA search. Such parameters include the mutation and crossover rates, in addition to the population size, all of which are important issues in GA design. However, each GA operator has a special and different influence, and the impact of these operators depends on their probabilities; it is difficult to predefine specific ratios for each parameter, particularly for the mutation and crossover operators. This paper reviews various methods for choosing mutation and crossover ratios in GAs. Next, we define new deterministic control approaches for crossover and mutation rates, namely Dynamic Decreasing of High Mutation ratio/Dynamic Increasing of Low Crossover ratio (DHM/ILC) and Dynamic Increasing of Low Mutation/Dynamic Decreasing of High Crossover (ILM/DHC). The dynamic nature of the proposed methods allows the ratios of both crossover and mutation operators to change linearly during the search progress: DHM/ILC starts with a 100% ratio for mutation and 0% for crossover, the mutation and crossover ratios then decrease and increase, respectively, and by the end of the search the ratios are 0% for mutation and 100% for crossover. ILM/DHC works the same way but in reverse. The proposed approach was compared with two predefined parameter-tuning methods, namely fifty-fifty crossover/mutation ratios, and the most common approach of static ratios, such as a 0.03 mutation rate and a 0.9 crossover rate. The experiments were conducted on ten Traveling Salesman Problems (TSP).
The experiments showed the effectiveness of the proposed DHM/ILC when dealing with small population sizes, while the proposed ILM/DHC was found to be more effective with large population sizes. In fact, both proposed dynamic methods outperformed the predefined methods in most cases tested.
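The linear schedules described above are straightforward to write down. A minimal sketch, with the function names invented for the example:

```python
def dhm_ilc_rates(generation, max_generations):
    """DHM/ILC schedule: the mutation ratio starts at 100% and decreases
    linearly to 0%, while the crossover ratio rises linearly from 0% to 100%."""
    progress = generation / max_generations
    mutation_rate = 1.0 - progress
    crossover_rate = progress
    return mutation_rate, crossover_rate

def ilm_dhc_rates(generation, max_generations):
    """ILM/DHC is the mirror image: mutation rises from 0% to 100% while
    crossover falls from 100% to 0%."""
    m, c = dhm_ilc_rates(generation, max_generations)
    return c, m
```

Inside a GA loop, these rates would be recomputed each generation and used as the per-generation probabilities of applying the mutation and crossover operators.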


2020 ◽  
Vol 12 (17) ◽  
pp. 2843
Author(s):  
Meijiao Zhong ◽  
Xinjian Shan ◽  
Xuemin Zhang ◽  
Chunyan Qu ◽  
Xiao Guo ◽  
...  

Taking the 2017 Mw6.5 Jiuzhaigou earthquake as a case study, ionospheric disturbances (i.e., total electron content, TEC) and thermal infrared (TIR) anomalies were simultaneously investigated. The characteristics of the brightness blackbody temperature (TBB), medium-wave infrared brightness (MIB), and outgoing longwave radiation (OLR) were extracted and compared with the characteristics of ionospheric TEC. We observed different relationships among the three types of TIR radiation under seismic and aseismic conditions. A wide range of positive TEC anomalies occurred south of the epicenter. The area to the south of the Huarong mountain fracture, which contained the maximum TEC anomaly amplitudes, overlapped one of the regions with notable TIR anomalies. For the first time, we observed three stages of increasing TIR radiation, with ionospheric TEC anomalies appearing after each stage. There was also a high spatial correspondence between both the TIR and TEC anomalies and the regional geological structure. Together with the time series data, these results suggest that the genesis of the TEC anomalies might be related to increasing TIR.


2013 ◽  
Vol 347-350 ◽  
pp. 3331-3335
Author(s):  
Qian Ru Wang ◽  
Xi Wei Chen ◽  
Da Shi Luo ◽  
Yu Feng Wei ◽  
Li Ya Jin ◽  
...  

Grey system theory has been widely used to forecast economic data, which are often highly nonlinear, irregular, and non-stationary. Many models based on grey system theory can adapt to various economic time series data. However, some of these models did not consider the impact of the model parameters, or considered only a simple change of the model parameters for the prediction. In this paper, we propose a PSO-based GM(1,1) model that uses optimized parameters in order to improve forecasting accuracy. The experiment shows that the PSO-based GM(1,1) achieves much better forecasting accuracy than other widely used grey models on actual chaotic economic data.
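A plain GM(1,1) model, the base model that PSO would then tune, can be sketched as follows. The fitting step here uses ordinary least squares; the PSO parameter optimization itself is omitted, and the function names are invented for the example.

```python
import numpy as np

def gm11_fit(x0):
    """Fit a standard GM(1,1) grey model to a positive series x0.
    Returns the develop coefficient a and the grey input b."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                    # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])         # mean sequence of consecutive x1 values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]                            # grey differential equation: x0(k) + a*z1(k) = b
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]
    return a, b

def gm11_predict(x0, a, b, n):
    """Predict the first n values (including reconstruction of the input)
    via the time-response function, then inverse AGO by differencing."""
    k = np.arange(n)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])
```

On near-exponential series, which is where grey models are at home, the reconstruction is close to exact; a PSO layer on top would search over model variants or parameters to minimize the forecasting error.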


2007 ◽  
Vol 9 (1) ◽  
pp. 30-41 ◽  
Author(s):  
Nikhil S. Padhye ◽  
Sandra K. Hanneman

The application of cosinor models to long time series requires special attention. With increasing length of the time series, the presence of noise and of cycle-to-cycle drifts in rhythm parameters leads to rapid deterioration of cosinor models. The sensitivity of amplitude and model fit to the data length is demonstrated for body temperature data from ambulatory menstrually cycling and menopausal women and from ambulatory male swine. It follows that amplitude comparisons between studies cannot be made without consideration of the data length. Cosinor analysis may be carried out on serial sections of the series for improved model fit and for tracking changes in rhythm parameters. Noise and drift reduction can also be achieved by folding the series onto a single cycle, which leads to substantial gains in the model fit but lowers the amplitude. Central values of model parameters are changed negligibly by consideration of the autoregressive nature of the residuals.
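The single-component cosinor fit underlying this kind of analysis is a linear least-squares problem once the cosine is expanded into in-phase and quadrature terms. A minimal sketch, with an invented function name, assuming a known period:

```python
import numpy as np

def fit_cosinor(t, y, period):
    """Single-component cosinor fit Y(t) = M + A*cos(2*pi*t/period + phi),
    solved as linear least squares on the equivalent form
    Y(t) = M + beta*cos(w*t) + gamma*sin(w*t)."""
    t = np.asarray(t, dtype=float)
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    M, beta, gamma = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)[0]
    A = np.hypot(beta, gamma)        # amplitude (MESOR is M)
    phi = np.arctan2(-gamma, beta)   # acrophase
    return M, A, phi
```

Serial-section cosinor analysis, as described above, amounts to applying this fit to successive windows of the series so that drifting rhythm parameters can be tracked.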


2018 ◽  
Vol 2018 ◽  
pp. 1-9 ◽  
Author(s):  
Amjad B. Khalil ◽  
Neelamegam Sivakumar ◽  
Muhammad Arslan ◽  
Hamna Saleem ◽  
Sami Qarawi

Brevibacillus borstelensis AK1 is a thermophile that grows between 45°C and 70°C. The present study is an extended genome report of B. borstelensis AK1, along with its morphological characterization. The strain was isolated from a hot spring in Saudi Arabia (southeast of the city of Gazan). We observed that strain AK1 is a rod-shaped, motile, and strictly aerobic bacterium. Whole-genome sequencing resulted in 29 contigs with a total length of 5,155,092 bp. In total, 3,946 protein-coding genes and 139 RNA genes were identified. Comparison with previously submitted B. borstelensis strains illustrates that strain AK1 has a small genome size but a high GC content. The strain possesses putative genes for the degradation of a wide range of substrates, including polyethylene (plastic) and long-chain hydrocarbons. These genomic features may be useful for future environmental/biotechnological applications.


2019 ◽  
Vol 10 (3) ◽  
pp. 640 ◽  
Author(s):  
Abdinur Ali MOHAMED ◽  
Ahmed Ibrahim NAGEYE

The purpose of this study was to examine the relationship between environmental degradation, resource scarcity, and civil conflict in Somalia. Environmental degradation tends to increase the number of disputes emerging from contests over scarce resources; consequently, it makes society so contentious that it becomes inclined toward armed conflict. In this study we investigated five variables, in which civil conflict was the dependent variable and population growth, land degradation, water resources, and climate change were the explanatory variables. Time series data for 1990-2015, drawn from various sources, were employed. Ordinary Least Squares (OLS) regression was used to estimate the model parameters. The Augmented Dickey-Fuller test was used to examine the stationarity of the data, and Johansen cointegration was used to detect the long-run relationship between the study variables. The study found that a one million increase in the rural population raises the likelihood of civil conflict by about 1.04%. The loss of each hectare of arable land causes the likelihood of civil conflict to increase by about 1.5%. A rise of one cubic kilometer of fresh water decreases the likelihood of civil conflict by about 4.49%. The rise in temperature was found to be insignificant, with no contribution to civil conflict in Somalia.

