A scalable analytical approach from bacterial genomes to epidemiology

2021 ◽  
Author(s):  
Xavier Didelot ◽  
Julian Parkhill

Recent years have seen a remarkable increase in the practicality of sequencing whole genomes from large numbers of bacterial isolates. The availability of this data has huge potential to deliver new insights into the evolution and epidemiology of bacterial pathogens, but the scalability of the analytical methodology has been lagging behind that of the sequencing technology. Here we present a step-by-step approach for such large-scale genomic epidemiology analyses, from bacterial genomes to epidemiological interpretations. A central component of this approach is the dated phylogeny, which is a phylogenetic tree with branch lengths measured in units of time. The construction of dated phylogenies from bacterial genomic data needs to account for the disruptive effect of recombination on phylogenetic relationships, and we describe how this can be achieved. Dated phylogenies can then be used to perform fine-scale or large-scale epidemiological analyses, depending on the proportion of cases for which genomes are available. A key feature of this approach is computational scalability, and in particular the ability to process hundreds or thousands of genomes within a matter of hours. This is a clear advantage of the step-by-step approach described here. We discuss other advantages and disadvantages of the approach, as well as potential improvements and avenues for future research.
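A common first step toward the dated phylogenies described above is root-to-tip regression under a strict molecular clock: root-to-tip genetic distance is regressed on sampling date, the slope estimates the substitution rate, and the x-intercept dates the root. The sketch below illustrates the idea with made-up numbers; it is not the authors' method, and real analyses would work from an actual (recombination-corrected) tree.

```python
import numpy as np

# Illustrative root-to-tip regression under a strict molecular clock.
# Sampling dates (decimal years) and root-to-tip distances (subst./site)
# are invented for demonstration purposes.
sampling_dates = np.array([2012.5, 2014.1, 2015.8, 2017.3, 2019.0])
root_to_tip = np.array([0.0021, 0.0029, 0.0038, 0.0046, 0.0054])

rate, intercept = np.polyfit(sampling_dates, root_to_tip, 1)
root_date = -intercept / rate  # date at which the fitted distance is zero

print(f"clock rate: {rate:.2e} substitutions/site/year")
print(f"estimated root date: {root_date:.1f}")
```

A positive, roughly constant slope is also a quick sanity check that the data carry enough temporal signal for a dated phylogeny to be meaningful.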

Author(s):  
Paul Giguere ◽  
Scott W. Formica ◽  
Wayne M. Harding ◽  
Michele R. Cummins

Designing online trainings or courses for large numbers of participants can prove to be challenging for instructors and facilitators. Online learning environments need to be structured in a way that preserves actual or perceived levels of interaction, participant perceptions of value and utility, and achievement of the learning objectives. This chapter describes five Large-Scale Interaction Strategies that offer guidance for addressing some of these online instructional design issues. Evaluation data are presented in support of two of the strategies, and recommendations are provided about how future research in this area might be conducted.


2011 ◽  
Vol 2 (2) ◽  
pp. 197-204 ◽  
Author(s):  
M. Rösner ◽  
R. Lammering

Abstract. Model order reduction appears to be beneficial for the synthesis and simulation of compliant mechanisms due to the computational costs involved. Model order reduction is an established method in many technical fields for the approximation of large-scale linear time-invariant dynamical systems described by ordinary differential equations. Based on system theory, underlying representations of the dynamical system are introduced, from which the general reduced-order model is derived by projection. Over the last years, numerous new procedures suited to simulation, optimization and control have been published and investigated. Three classes of order reduction methods, namely singular value decomposition, condensation-based and Krylov subspace methods, are reviewed and their advantages and disadvantages are outlined in this paper. The suitability of model order reduction for compliant mechanisms is highlighted. Moreover, the attributes that order reduction methods must possess to meet the characteristics of compliant mechanisms are discussed as a direction for future research.
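The projection step mentioned in the abstract can be sketched in a few lines: for a linear time-invariant system x' = Ax + Bu, y = Cx, an orthonormal basis V of a low-dimensional subspace yields the reduced matrices A_r = VᵀAV, B_r = VᵀB, C_r = CV. The toy example below builds V from an SVD of simulated state snapshots (in the spirit of the SVD-based methods reviewed); the system matrices and simulation parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 5                                      # full and reduced order
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))  # a stable-ish toy system
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

# Collect state snapshots from a crude forward-Euler simulation (step input).
dt, steps = 0.01, 200
x = np.zeros((n, 1))
snapshots = []
for _ in range(steps):
    x = x + dt * (A @ x + B)
    snapshots.append(x.copy())
X = np.hstack(snapshots)

# Projection basis: the r dominant left singular vectors of the snapshots.
V, _, _ = np.linalg.svd(X, full_matrices=False)
V = V[:, :r]

# Galerkin projection onto the reduced subspace.
A_r, B_r, C_r = V.T @ A @ V, V.T @ B, C @ V
print(A_r.shape)  # reduced from (50, 50) to (5, 5)
```

Krylov subspace and condensation-based methods differ only in how the basis V (or a pair of bases) is constructed; the projection itself has the same form.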


Electronics ◽  
2020 ◽  
Vol 9 (5) ◽  
pp. 750
Author(s):  
Yuanfei Dai ◽  
Shiping Wang ◽  
Neal N. Xiong ◽  
Wenzhong Guo

A knowledge graph (KG), also known as a knowledge base, is a particular kind of network structure in which nodes represent entities and edges represent relations. However, as network volumes have exploded, data sparsity has made large-scale KG systems increasingly difficult to compute over and manage. To alleviate this issue, knowledge graph embedding has been proposed to embed the entities and relations of a KG into a low-dimensional, dense and continuous feature space, endowing the resulting model with capabilities for knowledge inference and fusion. In recent years, many researchers have devoted much attention to this approach, and in this paper we systematically introduce the existing state-of-the-art approaches and a variety of applications that benefit from these methods. In addition, we discuss future prospects for the development of these techniques and their application trends. Specifically, we first introduce the embedding models that leverage only the information of the observed triplets in the KG. We illustrate the overall framework and the specific ideas, and compare the advantages and disadvantages of such approaches. Next, we introduce advanced models that utilize additional semantic information to improve the performance of the original methods. We divide this additional information into two categories: textual descriptions and relation paths. The extension approaches in each category are described, following the same classification criteria as those defined for the triplet-fact-based models. We then describe two experiments comparing the performance of the listed methods and mention some broader downstream tasks such as question answering, recommender systems, and so forth. Finally, we collect several hurdles that need to be overcome and provide a few future research directions for knowledge graph embedding.
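Among the triplet-based embedding models such a survey typically covers, the translation family is the simplest to illustrate: in TransE-style scoring, a fact (head, relation, tail) is considered plausible when the head embedding plus the relation embedding lies close to the tail embedding, i.e. h + r ≈ t. The sketch below uses random (untrained) vectors purely to show the scoring function; the entity names are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 8  # embedding dimensionality (toy value)

# Random, untrained embeddings for a handful of hypothetical entities/relations.
entities = {e: rng.normal(size=dim)
            for e in ["paris", "france", "tokyo", "japan"]}
relations = {"capital_of": rng.normal(size=dim)}

def score(h, r, t):
    """TransE-style score: negative L2 norm of h + r - t (higher = more plausible)."""
    return -np.linalg.norm(entities[h] + relations[r] - entities[t])

# With untrained vectors these scores are arbitrary; training with a margin
# ranking loss would push true triplets above corrupted ones.
print(score("paris", "capital_of", "france"))
print(score("paris", "capital_of", "japan"))
```

More expressive models in the same family change only the scoring function (e.g. projecting entities into relation-specific spaces) while keeping this triplet-ranking setup.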


2015 ◽  
Vol 727-728 ◽  
pp. 976-981
Author(s):  
Hua Fen Xu ◽  
Jing Wu ◽  
Guo Jun Mao

With advances in data collection and generation technologies, environments that produce data streams have become more and more common. In recent years, network applications have become ever more widespread, and applications have shifted from single data streams toward multi-node distributed data streams, as in sensor networks, network monitoring, web log analysis and credit card transaction data from multiple sites. These data are not only real-time, continuous and large-scale, but also distributed. How to manage and analyze such large dynamic datasets is an important problem facing researchers. In view of this situation, this paper presents a formal description of homogeneous and heterogeneous distributed data streams, analyzes the advantages and disadvantages of centralized and distributed stream processing architectures, discusses recent progress in distributed data stream classification algorithms, and sums up the problems and challenges faced by distributed data stream mining as well as possible future research directions.
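The core idea behind the distributed architecture the paper contrasts with centralized processing can be sketched simply: each node maintains a small, mergeable summary of its local stream, and only these summaries travel to a coordinator, never the raw records. The example below uses per-class counts as the summary; the data and labels are invented for illustration.

```python
from collections import Counter

def local_summary(stream):
    """Per-node single pass over a stream of (features, label) records."""
    return Counter(label for _, label in stream)

# Two hypothetical node streams, e.g. credit card transactions at two sites.
node_streams = [
    [((0.1, 0.2), "fraud"), ((0.3, 0.1), "ok"), ((0.2, 0.4), "ok")],
    [((0.9, 0.8), "fraud"), ((0.5, 0.5), "fraud")],
]

# Merge step at the coordinator: combine the small summaries, not the records.
global_counts = Counter()
for stream in node_streams:
    global_counts += local_summary(stream)

print(dict(global_counts))  # {'fraud': 3, 'ok': 2}
```

Distributed classification algorithms follow the same pattern with richer mergeable summaries (sufficient statistics, sketches, or local models) in place of plain counts.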


Author(s):  
John E. Abraham ◽  
John Douglas Hunt

Large-scale urban models often are subdivided into simpler submodels. The parameters of these models can be estimated using approaches that differ in regard to whether the full modeling system is run during an estimation procedure or whether that overall estimation is performed simultaneously with the estimation of the individual submodels. There are also ways in which extra data or extra models can be used to further inform parameter values. Five different techniques are presented (“limited view,” “piecewise,” “simultaneous,” “sequential,” and “Bayesian sequential”), and the statistical theory necessary to justify each technique is concurrently described. The practical advantages and disadvantages are discussed, and each technique is illustrated using a simple nested logit model example. The concepts then are further illustrated by describing the sequential parameter estimation process for a land use/transport interaction model of the Sacramento, California, region. The ideas and examples should help modelers place more of an emphasis on overall calibration, allow them to follow a more rigorous approach in establishing the parameters of large-scale urban models, and help them understand the theory and assumptions that they are implicitly adopting. Two techniques in particular are noted as worthy of future research in large-scale urban modeling: (a) establishing the likelihood function based directly on the structural equations of the model, eliminating or reducing the need to “solve” for the model outputs during parameter estimation; and (b) using Bayesian techniques to adjust parameters in an overall estimation without discarding what is already known about those parameters.
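The nested logit model used as the illustrative example above can be computed in a few lines: the probability of an alternative is the probability of its nest times the conditional probability within the nest, where the nest utility is the logsum mu * log(sum(exp(V_j / mu))) scaled by the nesting parameter mu. The utilities, nest structure, and mu below are all hypothetical, chosen only to show the calculation.

```python
import math

# Hypothetical two-nest mode choice: systematic utilities per alternative.
nests = {"car": {"drive": 1.2, "carpool": 0.4},
         "transit": {"bus": 0.8, "rail": 1.0}}
mu = 0.5  # within-nest scale parameter (0 < mu <= 1)

def probabilities(nests, mu):
    # Logsum (expected maximum utility) of each nest, scaled by mu.
    logsums = {n: mu * math.log(sum(math.exp(v / mu) for v in alts.values()))
               for n, alts in nests.items()}
    denom = sum(math.exp(ls) for ls in logsums.values())
    probs = {}
    for n, alts in nests.items():
        p_nest = math.exp(logsums[n]) / denom          # P(nest)
        within = sum(math.exp(v / mu) for v in alts.values())
        for alt, v in alts.items():
            probs[alt] = p_nest * math.exp(v / mu) / within  # P(nest)*P(alt|nest)
    return probs

p = probabilities(nests, mu)
print({k: round(v, 3) for k, v in p.items()})  # the four probabilities sum to 1
```

Estimating mu and the utility coefficients from observed choices is exactly where the five estimation techniques described in the paper diverge.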


2014 ◽  
Vol 16 (3) ◽  
pp. 561-577 ◽  

It is well known that chlorine and its compounds, traditionally utilized for water and wastewater disinfection, react with some organic matter to form undesirable by-products, hazardous to human health, known as Disinfection By-Products (DBPs). In many countries, very stringent limits for chlorination by-products such as trihalomethanes have been set for wastewater reuse. Accordingly, the use of different oxidation/disinfection systems should be evaluated as a possible alternative to chlorine. Ultrasound (US) was recently found to be effective for this purpose.

The aim of this work is to review the main US disinfection studies, pointing out ultrasound mechanisms as well as its effects on the inactivation of different bacteria (Total coliform, Escherichia coli, Pseudomonas aeruginosa, Bacillus subtilis, Saccharomyces cerevisiae, Klebsiella pneumoniae) at both laboratory and pilot scale. To this end, several experimental results are discussed, and both points of focal interest and encountered problems are summarized.

Moreover, the intensification of cavitation phenomena by combined oxidation processes is reviewed and the main advantages and disadvantages are pointed out, in order to direct future research and promote efficient large-scale operations.


2021 ◽  
Author(s):  
Jianxin Wang ◽  
Craig Poskanzer ◽  
Stefano Anzellotti

Facial expressions are critical in our daily interactions. Studying how humans recognize dynamic facial expressions is an important area of research in social perception, but advancements are hampered by the difficulty of creating well-controlled stimuli. Research on the perception of static faces has made significant progress thanks to techniques that make it possible to generate synthetic face stimuli. However, synthetic dynamic expressions are more difficult to generate; methods that yield realistic dynamics typically rely on the use of infrared markers applied on the face, making it expensive to create datasets that include large numbers of different expressions. In addition, the use of markers might interfere with facial dynamics. In this paper, we contribute a new method to generate large amounts of realistic and well-controlled facial expression videos. We use a deep convolutional neural network with attention and asymmetric loss to extract the dynamics of action units from videos, and demonstrate that this approach outperforms a baseline model based on convolutional neural networks without attention on the same stimuli. Next, we develop a pipeline to use the action unit dynamics to render realistic synthetic videos. This pipeline makes it possible to generate large scale naturalistic and controllable facial expression datasets to facilitate future research in social cognitive science.


Sensors ◽  
2020 ◽  
Vol 20 (18) ◽  
pp. 5073
Author(s):  
Khalil Khan ◽  
Waleed Albattah ◽  
Rehan Ullah Khan ◽  
Ali Mustafa Qamar ◽  
Durre Nayab

Real-time crowd analysis represents an active area of research within the computer vision community in general, and scene analysis in particular. Over the last 10 years, various methods for crowd management in real-time scenarios have received immense attention due to large-scale applications in people counting, public event management, disaster management, safety monitoring and so on. Although many sophisticated algorithms have been developed to address the task, crowd management in real-time conditions is still a challenging problem far from being completely solved, particularly in wild and unconstrained conditions. In this paper, we present a detailed review of crowd analysis and management, focusing on state-of-the-art methods for both controlled and unconstrained conditions. The paper illustrates both the advantages and disadvantages of state-of-the-art methods. The methods covered range from the seminal research works on crowd management and monitoring to the state-of-the-art methods based on newly introduced deep learning techniques. A comparison of previous methods is presented, with a detailed discussion of directions for future research work. We believe this review article will contribute to various application domains and will also augment the knowledge of crowd analysis within the research community.


1967 ◽  
Vol 06 (01) ◽  
pp. 8-14 ◽  
Author(s):  
M. F. Collen

The utilization of an automated multitest laboratory as a data acquisition center and of a computer for the data processing and analysis permits large-scale preventive medical research previously not feasible. Normal test values are easily generated for the particular population studied. Long-term epidemiological research on large numbers of persons becomes practical. It is our belief that the advent of automation and computers has introduced a new era of preventive medicine.


2017 ◽  
Vol 5 (1) ◽  
pp. 70-82
Author(s):  
Soumi Paul ◽  
Paola Peretti ◽  
Saroj Kumar Datta

Building customer relationships and customer equity is the prime concern in today’s business decisions. The emergence of the internet, and especially of social media like Facebook and Twitter, has changed traditional marketing thought to a great extent. The importance of customer orientation is reflected in the axiom “The customer is king”. A good number of organizations are engaging customers in their new product development activities via social media platforms. Co-creation, a new perspective in which customers are active co-creators of the products they buy and use, is currently challenging the traditional paradigm. Co-creation, which draws on the customer’s knowledge, creativity and judgment to generate value, is considered not only an upcoming trend for introducing new products or services, but also a way of fitting them to customer needs and increasing value for money. Knowledge and innovation are inseparable, and knowledge management competencies and capacities are essential to any organization that aspires to be distinguished and innovative. The present work attempts to identify the change in the value creation procedure, along with one area of business where co-creation can return significant dividends: extending the brand or brand category through brand extension or line extension. Through an in-depth literature review, this article identifies the changes in every perspective of this paradigm shift and presents a conceptual model of company-customer-brand-based co-creation activity via social media. The main objective is to offer an agenda for future research on this emerging trend and to chart the way from theory to practice. The paper acts as a proposal; it allows an organization to pursue this change at a large scale and obtain early feedback on the idea presented.

