Topological Analysis of Large-scale Biomedical Terminology Structures

2007 ◽  
Vol 14 (6) ◽  
pp. 788-797 ◽  
Author(s):  
M. E. Bales ◽  
Y. A. Lussier ◽  
S. B. Johnson
2010 ◽  
Vol 19 (01) ◽  
pp. 58-63 ◽  
Author(s):  
C. G. Chute

Summary Objective: Can social computing efforts materially alter the distributed creation and maintenance of complex biomedical terminologies and ontologies? A review of distributed authoring history and status. Background: Social computing projects, such as Wikipedia, have dramatically altered the perception and reality of large-scale content projects and the labor required to create and maintain them. Health terminologies have become large, complex, interdependent content artifacts of increasing importance to biomedical research and the community's understanding of biology, medicine, and optimal healthcare practices. The question naturally arises as to whether social computing models and distributed authoring platforms can be applied to the voluntary, distributed authoring of high-quality terminologies and ontologies. Methods: An historical review of distributed authoring developments. Results: The trajectory of description logic-driven authoring tools, group process, and web-based platforms suggests that public distributed authoring is likely feasible and practical; however, no compelling example on the order of Wikipedia is yet extant. Nevertheless, several projects, including the Gene Ontology and the new revision of the International Classification of Diseases (ICD-11), hold promise.


2019 ◽  
Vol 4 (1) ◽  
Author(s):  
Antonio Maria Fiscarelli ◽  
Matthias R. Brust ◽  
Grégoire Danoy ◽  
Pascal Bouvry

Abstract The objective of a community detection algorithm is to group similar nodes that are more connected to each other than to the rest of the network. Several methods have been proposed, but many are of high complexity and require global knowledge of the network, which makes them less suitable for large-scale networks. The Label Propagation Algorithm initially assigns a distinct label to each node; each node then iteratively updates its label to the one held by the majority of its neighbors, until consensus is reached among all nodes in the network. Nodes sharing the same label are then grouped into communities. It runs in near linear time and is decentralized, but it gets easily stuck in local optima and often returns a single giant community. To overcome these problems we propose MemLPA, a variation of the classical Label Propagation Algorithm in which each node implements a memory mechanism that allows it to “remember” past states of the network and uses a decision rule that takes this information into account. We demonstrate through extensive experiments, on the Lancichinetti-Fortunato-Radicchi benchmark and a set of real-world networks, that MemLPA outperforms other existing label propagation algorithms that implement memory, as well as some of the well-known community detection algorithms. We also perform a topological analysis to extend the performance study and compare the topological properties of the communities found to the ground-truth community structure.
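The classical Label Propagation Algorithm described above (unique initial labels, majority-of-neighbors updates, random tie-breaking) can be sketched in a few lines. This is a minimal illustration of the baseline algorithm only, not of the MemLPA memory mechanism; the adjacency-list example graph is hypothetical.

```python
import random
from collections import Counter

def label_propagation(adj, max_iters=100, seed=0):
    """Classical Label Propagation: every node starts with its own
    distinct label and repeatedly adopts the majority label among its
    neighbors, breaking ties at random, until no label changes."""
    rng = random.Random(seed)
    labels = {v: v for v in adj}          # one distinct label per node
    nodes = list(adj)
    for _ in range(max_iters):
        changed = False
        rng.shuffle(nodes)                # asynchronous, random order
        for v in nodes:
            if not adj[v]:
                continue                  # isolated node keeps its label
            counts = Counter(labels[u] for u in adj[v])
            best = max(counts.values())
            ties = [lab for lab, c in counts.items() if c == best]
            new_label = rng.choice(ties)  # break ties uniformly at random
            if new_label != labels[v]:
                labels[v] = new_label
                changed = True
        if not changed:                   # consensus reached
            break
    return labels

# Hypothetical toy graph: two triangles joined by a single bridge edge.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
labels = label_propagation(adj)
```

Because of the random update order and tie-breaking, repeated runs can return different partitions, and on dense graphs the majority rule can cascade into the single giant community the abstract warns about; MemLPA's per-node memory is aimed at exactly this instability.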


2017 ◽  
Author(s):  
Duygu Dikicioglu ◽  
Daniel J H Nightingale ◽  
Valerie Wood ◽  
Kathryn S Lilley ◽  
Stephen G Oliver

Abstract The topological analyses of many large-scale molecular interaction networks often provide only limited insights into network function or evolution. In this paper, we argue that the functional heterogeneity of network components, rather than network size, is the main factor limiting the utility of topological analysis of large cellular networks. We have analysed large epistatic, functional, and transcriptional regulatory networks of genes that were attributed to the following biological process groupings: protein transactions, gene expression, cell cycle, and small molecule metabolism. Control analyses were performed on networks of randomly selected genes. We identified novel biological features emerging from the analysis of functionally homogeneous biological networks irrespective of their size. In particular, direct regulation by transcription emerged as an underrepresented feature of protein transactions. The analysis also demonstrated that the regulation of the genes involved in protein transactions at the transcriptional level was orchestrated by only a small number of regulators. Quantitative proteomic analysis of nuclear- and chromatin-enriched sub-cellular fractions of yeast provided supportive evidence for the conclusions generated by network analyses.


Water ◽  
2019 ◽  
Vol 11 (7) ◽  
pp. 1426 ◽  
Author(s):  
Zhiqiang Jiang ◽  
Chao Wang ◽  
Yi Liu ◽  
Zhongkai Feng ◽  
Changming Ji ◽  
...  

In order to allocate the raw water of the complex water supply system in Shenzhen reasonably, this paper studied the complex network relationships of this large-scale urban water supply system, which consists of 46 reservoirs, 67 waterworks, 2 external diversion water sources, 14 pumping stations and 9 gates, and described each component of the system with the concepts of point, line and plane. Using topological analysis techniques and graph theory, a generalized model of the network topological structure of the urban water allocation system was established. On this basis, combined with the water demand prediction and allocation model of the waterworks, a water resources allocation model was established, aiming at satisfying the guaranteed rate of the water supply. The decomposition and coordination principle of the large-scale system and the dynamic simulation technology of the supply-demand balance were adopted to solve the model. A forward calculation mode controlling waterworks and pumps, and a reverse calculation mode controlling reservoirs and waterworks, were designed in solving the model, forming a double-layer feedback mechanism that took the reverse calculation mode as outer feedback and the reservoir water level constraint or pipeline capacity constraint as inner feedback. Through the verification calculation of the case study, it was found that the proposed model can deal well with the raw water allocation of a large-scale complex water supply system, which has important application value and practical significance.
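The generalized point-and-line representation above is, in graph terms, a directed network of sources, pumps and waterworks with capacitated edges. The following is a minimal sketch of that representation only; the node names, capacities and demands are hypothetical, and the check shown corresponds to the paper's inner "pipeline capacity constraint" feedback, not to the full allocation model.

```python
from collections import defaultdict

# Hypothetical miniature of the generalized network: points (reservoirs,
# pumping stations, waterworks) become graph nodes; lines (pipelines)
# become directed edges carrying a conveyance capacity.
edges = [
    ("reservoir_A", "pump_1",       50.0),
    ("reservoir_B", "pump_1",       30.0),
    ("pump_1",      "waterworks_X", 70.0),
    ("reservoir_B", "waterworks_Y", 40.0),
]
# Predicted demand at each waterworks (from the demand prediction model).
demand = {"waterworks_X": 60.0, "waterworks_Y": 35.0}

# Total pipeline capacity entering each node.
inflow_cap = defaultdict(float)
for src, dst, cap in edges:
    inflow_cap[dst] += cap

# A waterworks is locally satisfiable when upstream pipeline capacity
# covers its predicted demand -- a single inner-feedback check; the real
# model iterates this against reservoir levels in the outer loop.
feasible = {w: inflow_cap[w] >= d for w, d in demand.items()}
```

In the full model, a failed check like this would trigger the reverse calculation mode (re-allocating among reservoirs and waterworks) rather than simply reporting infeasibility.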


Author(s):  
Shalva Marjanishvili

<p>Common engineering practice in multi-hazard design is to consider each natural hazard independently. The underlying assumption is that it is highly unlikely that one disaster will be closely followed by another. This approach dominated a large part of the 20th century. The engineering community has made large strides in designing structures to withstand known hazards, leading to improved reliability and safety of infrastructure. This in turn has supported population growth and increased prosperity. As witness to our success, it is common in developed nations to consider it unacceptable for a disaster to cause large-scale devastation. However, the nature of disasters has proved otherwise.</p><p>It is unlikely that one extreme event will have catastrophic consequences on communities, because we know how to prepare for a single event. Instead, as experience shows, disasters more typically comprise one event followed by one or more other events, exposing the vulnerability of our design assumptions. Examples of multiple disasters are Indonesia (i.e., earthquake followed by tsunami followed by volcano), Haiti (i.e., earthquake followed by cholera outbreak) and Japan (i.e., earthquake followed by tsunami followed by nuclear meltdown). The obvious solution is to focus on understanding the resilience of the system, that is, its ability to rapidly recover from an event.</p><p>This paper proposes a framework for quantitative measures and mathematically reproducible definitions of structural resilience as it pertains to a building’s ability to minimize the potential for undesirable consequences. The resilience assessment and design process follows a logical progression of steps, starting with the characterization of hazards, continuing through analysis simulations, damage modelling, and loss assessment by finding and subsequently balancing functional relationships between design and analysis and consequences. The outcomes of each process are articulated through a series of generalized variables, termed topology, geometry, damage and hazard intensity measures. Topological analysis methods are developed to map the effects of blast and extreme fire exposure so that the corresponding intensity measures can be addressed simultaneously during design.</p>


2020 ◽  
Vol 6 (1) ◽  
Author(s):  
William K. Chang ◽  
David VanInsberghe ◽  
Libusha Kelly

Abstract Microbiome dynamics influence the health and functioning of human physiology and the environment and are driven in part by interactions between large numbers of microbial taxa, making large-scale prediction and modeling a challenge. Here, using topological data analysis, we identify states and dynamical features relevant to macroscopic processes. We show that gut disease processes and marine geochemical events are associated with transitions between community states, defined as topological features of the data density. We find a reproducible two-state succession during recovery from cholera in the gut microbiomes of multiple patients, evidence of dynamic stability in the gut microbiome of a healthy human after experiencing diarrhea during travel, and periodic state transitions in a marine Prochlorococcus community driven by water column cycling. Our approach bridges small-scale fluctuations in microbiome composition and large-scale changes in phenotype without details of underlying mechanisms, and provides an assessment of microbiome stability and its relation to human and environmental health.


2018 ◽  
Author(s):  
Esther Ibanez-Marcelo ◽  
Lisa Campioni ◽  
Diego Manzoni ◽  
Enrica L Santarcangelo ◽  
Giovanni Petri

The aim of the study was to assess the EEG correlates of head positions, which have never been studied in humans, in participants with different psychophysiological characteristics, as encoded by their hypnotizability scores. This choice is motivated by earlier studies suggesting different processing of vestibular/neck proprioceptive information in subjects with high (highs) and low (lows) hypnotizability scores maintaining their head rotated toward one side (RH). We analysed EEG signals recorded in 20 highs and 19 lows in basal conditions (head forward) and during RH, using spectral analysis, which captures changes localized to specific recording sites, and Topological Data Analysis (TDA), which instead describes large-scale differences in processing and representing sensorimotor information. Spectral analysis revealed significant differences related to the head position for the alpha1, beta2, beta3 and gamma bands, but not to hypnotizability. TDA instead revealed global hypnotizability-related differences in the strengths of the correlations among recording sites during RH. Significant changes were observed in lows on the left parieto-occipital side and in highs in the right fronto-parietal region. Significant differences between the two groups were found in the occipital region, where changes were larger in lows than in highs. The study reports the EEG correlates of head posture for the first time, indicates that hypnotizability modulates their representation/processing on a large scale, and shows that spectral and topological data analyses provide complementary results.


2014 ◽  
Vol 796 (2) ◽  
pp. 86 ◽  
Author(s):  
Prachi Parihar ◽  
Michael S. Vogeley ◽  
J. Richard Gott ◽  
Yun-Young Choi ◽  
Juhan Kim ◽  
...  
