PhosPiR: An automated phospho-proteomic pipeline in R

2021 ◽  
Author(s):  
Ye Hong ◽  
Dani Flinkman ◽  
Tomi Suomi ◽  
Sami Pietilä ◽  
Peter James ◽  
...  

Abstract Large-scale phospho-proteome profiling using mass spectrometry (MS) provides functional insight that is crucial for disease biology and drug discovery. However, extracting biological understanding from these data is an arduous task requiring multiple analysis platforms that are not adapted for automated high-dimensional data analysis. Here, we introduce an integrated pipeline that combines several R packages to extract high-level biological understanding from large-scale phosphoproteomic data by seamless integration with existing databases and knowledge resources. In a single run, PhosPiR provides data clean-up, a fast data overview, multiple statistical tests, differential expression analysis, phospho-site annotation and translation across species, multi-level enrichment analyses, proteome-wide kinase activity and substrate mapping, and network hub analysis. Data outputs include graphical formats such as heatmaps and box, volcano, and circos plots. This resource is designed to assist proteome-wide data mining of pathophysiological mechanisms without the need for programming knowledge.
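To give a feel for the kind of per-site statistics such a pipeline automates, here is a minimal Python sketch of a per-phosphosite Welch's t-test with Benjamini-Hochberg correction. This is not PhosPiR code (PhosPiR itself is an R pipeline); the data frame layout and column names are illustrative assumptions.

```python
# Minimal sketch (not PhosPiR): per-phosphosite differential testing between two
# groups of samples, with Benjamini-Hochberg FDR correction.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.multitest import multipletests

def differential_sites(intensities: pd.DataFrame, ctrl_cols, case_cols, fdr=0.05):
    """intensities: rows = phosphosites, columns = samples (log2 intensities)."""
    ctrl = intensities[ctrl_cols].to_numpy()
    case = intensities[case_cols].to_numpy()
    # Welch's t-test per site; PhosPiR offers several tests, this is one stand-in.
    t, p = stats.ttest_ind(case, ctrl, axis=1, equal_var=False)
    log2fc = np.nanmean(case, axis=1) - np.nanmean(ctrl, axis=1)
    reject, q, _, _ = multipletests(p, alpha=fdr, method="fdr_bh")
    return pd.DataFrame({"log2FC": log2fc, "p": p, "q": q, "significant": reject},
                        index=intensities.index)
```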

2017 ◽  
Author(s):  
Jordan K Matelsky ◽  
Joseph Downs ◽  
Hannah Cowley ◽  
Brock Wester ◽  
William Gray-Roncal

Abstract As the scope of scientific questions increases and datasets grow larger, visualizing the relevant information correspondingly becomes more difficult and complex. Sharing visualizations amongst collaborators and with the public can be especially onerous, as it is challenging to reconcile software dependencies, data formats, and specific user needs in an easily accessible package. We present substrate, a data-visualization framework designed to simplify communication and code reuse across diverse research teams. Our platform provides a simple, powerful, browser-based interface for scientists to rapidly build effective three-dimensional scenes and visualizations. We aim to close a gap left by existing systems, which commonly prescribe a limited set of high-level components that are rarely optimized for arbitrarily large data visualization or for custom data types. To further engage the broader scientific community and enable seamless integration with existing scientific workflows, we also present pytri, a Python library that bridges substrate with the ubiquitous scientific computing platform, Jupyter. Our intention is to reduce the activation energy required to transition between exploratory data analysis, data visualization, and publication-quality interactive scenes.
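The Jupyter-bridge idea can be illustrated with a hypothetical sketch of the general pattern: serialize array data in Python and hand it to a browser-side layer for rendering. The function and markup below are illustrative only and do not reflect the actual pytri or substrate API.

```python
# Hypothetical illustration of the notebook-bridge pattern (not the pytri API):
# serialize a NumPy point cloud and pass it to a browser-side layer for display.
import json
import numpy as np
from IPython.display import HTML, display

def show_points(points: np.ndarray) -> None:
    """points: (N, 3) array. Rendered here as a trivial inline placeholder;
    a real bridge such as pytri would inject a WebGL scene instead."""
    payload = json.dumps(points.tolist())
    display(HTML(f"<div data-points='{payload}'>3D scene placeholder "
                 f"({len(points)} points)</div>"))

show_points(np.random.rand(1000, 3))
```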


2011 ◽  
Vol 23 (10) ◽  
pp. 2457-2497 ◽  
Author(s):  
Emre Neftci ◽  
Elisabetta Chicca ◽  
Giacomo Indiveri ◽  
Rodney Douglas

An increasing number of research groups are developing custom hybrid analog/digital very large scale integration (VLSI) chips and systems that implement hundreds to thousands of spiking neurons with biophysically realistic dynamics, with the intention of emulating brainlike real-world behavior in hardware and robotic systems rather than simply simulating their performance on general-purpose digital computers. Although the electronic engineering aspects of these emulation systems are proceeding well, progress toward the actual emulation of brainlike tasks is restricted by the lack of suitable high-level configuration methods of the kind that have already been developed over many decades for simulations on general-purpose computers. The key difficulty is that the dynamics of the CMOS electronic analogs are determined by transistor biases that do not map simply to the parameter types and values used in typical abstract mathematical models of neurons and their networks. Here we provide a general method for resolving this difficulty. We describe a parameter mapping technique that permits automatic configuration of VLSI neural networks so that their electronic emulation conforms to a higher-level neuronal simulation. We show that the neurons configured by our method exhibit spike timing statistics and temporal dynamics that are the same as those observed in the software-simulated neurons and, in particular, that the key parameters of recurrent VLSI neural networks (e.g., implementing soft winner-take-all) can be precisely tuned. The proposed method permits seamless integration between software simulations and hardware emulations and intertranslatability between the parameters of abstract neuronal models and their emulation counterparts. Most importantly, our method offers a route toward a high-level task configuration language for neuromorphic VLSI systems.
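As a rough illustration of what one step of such a parameter mapping can look like (not the authors' method), the sketch below fits a log-linear calibration between a swept bias voltage and a measured neuronal time constant, then inverts the fit to program a target value. All numbers are assumed.

```python
# Illustrative calibration sketch: map a model-level parameter (a neuron's leak
# time constant) to a chip bias voltage by fitting measured chip responses.
import numpy as np

# Hypothetical calibration data: bias voltages swept on the chip and the
# time constants measured at each setting.
bias_v = np.array([0.30, 0.35, 0.40, 0.45, 0.50])        # volts (assumed)
measured_tau = np.array([45.0, 28.0, 17.0, 11.0, 7.0])   # ms (assumed)

# Subthreshold circuits respond roughly exponentially to the bias, so fit
# log(tau) as a linear function of bias: log(tau) = a * Vb + b.
a, b = np.polyfit(bias_v, np.log(measured_tau), deg=1)

def bias_for_tau(target_tau_ms: float) -> float:
    """Invert the fit: which bias voltage yields the requested time constant?"""
    return (np.log(target_tau_ms) - b) / a

print(bias_for_tau(20.0))  # bias voltage to program for tau = 20 ms
```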


Author(s):  
Georgi Derluguian

The author develops ideas about the origin of social inequality during the evolution of human societies and reflects on the possibilities of overcoming it. What makes human beings different from other primates is a high level of egalitarianism and altruism, which contributed to the more successful adaptation of human collectives at early stages of the development of society. The transition to agriculture, coupled with substantially increasing population density, was marked by the emergence and institutionalisation of social inequality based on the inequality of tangible assets and symbolic wealth. Then, new institutions of warfare came into existence, aimed at conquering and enslaving neighbours engaged in productive labour. While exercising control over nature, people also established and strengthened their power over other people. The chiefdom emerged as a new type of polity. Elementary forms of power (political, economic and ideological) served as a basis for the formation of early states. The societies in those states were characterised by social inequality and cruelties, including slavery, mass violence and numerous victims. Nowadays, the old elementary forms of power that are inherent in personalistic chiefdom still function alongside modern institutions of public and private bureaucracy. This constitutes the key contradiction of our time: the juxtaposition of individual despotic power and public infrastructural power. However, society is evolving towards an ever more efficient combination of social initiatives with the sustainability and viability of large-scale organisations.


Genetics ◽  
2001 ◽  
Vol 159 (4) ◽  
pp. 1765-1778
Author(s):  
Gregory J Budziszewski ◽  
Sharon Potter Lewis ◽  
Lyn Wegrich Glover ◽  
Jennifer Reineke ◽  
Gary Jones ◽  
...  

Abstract We have undertaken a large-scale genetic screen to identify genes with a seedling-lethal mutant phenotype. From screening ~38,000 insertional mutant lines, we identified >500 seedling-lethal mutants, completed cosegregation analysis of the insertion and the lethal phenotype for >200 mutants, molecularly characterized 54 mutants, and provided a detailed description for 22 of them. Most of the seedling-lethal mutants seem to affect chloroplast function because they display altered pigmentation and affect genes encoding proteins predicted to have chloroplast localization. Although a high level of functional redundancy in Arabidopsis might be expected because 65% of genes are members of gene families, we found that 41% of the essential genes identified in this study are members of Arabidopsis gene families. In addition, we isolated several interesting classes of mutants and genes. We found three mutants in the recently discovered nonmevalonate isoprenoid biosynthetic pathway and mutants disrupting genes similar to Tic40 and tatC, which are likely to be involved in chloroplast protein translocation. Finally, we directly compared T-DNA and Ac/Ds transposon mutagenesis methods in Arabidopsis on a genome scale. In each population, only about one-third of the insertion mutations cosegregated with a mutant phenotype.
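As an illustration of what a cosegregation check involves operationally (hypothetical counts, not the authors' analysis), the sketch below asks whether seedling lethality is associated with presence of the insertion within a segregating family, using Fisher's exact test.

```python
# Hypothetical cosegregation check: is the seedling-lethal phenotype associated
# with presence of the insertion in a segregating family? Counts are invented.
from scipy.stats import fisher_exact

#                    lethal  normal
with_insertion    = [18,     41]
without_insertion = [0,      21]

odds_ratio, p_value = fisher_exact([with_insertion, without_insertion])
print(f"Fisher's exact p = {p_value:.3g}")  # a small p supports cosegregation
```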


1979 ◽  
Vol 6 (2) ◽  
pp. 70-72
Author(s):  
T. A. Coffelt ◽  
F. S. Wright ◽  
J. L. Steele

Abstract A new method of harvesting and curing breeder's seed peanuts in Virginia was initiated that would 1) reduce labor requirements, 2) maintain a high level of germination, 3) maintain varietal purity at 100%, and 4) reduce the risk of frost damage. Three possible harvesting and curing methods were studied. The traditional stack-pole method satisfied the latter 3 objectives, but not the first. The windrow-combine method satisfied the first 2 objectives, but not the last 2. The direct harvesting method satisfied all 4 objectives. The experimental equipment and curing procedures for direct harvesting had been developed but not tested on a large scale for seed harvesting. This method has been used in Virginia to produce breeder's seed of 3 peanut varieties (Florigiant, VA 72R and VA 61R) over five years. Compared to the stack-pole method, labor requirements have been reduced, satisfactory levels of germination and varietal purity have been obtained, and the risk of frost damage has been minimized.


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Yi Chen ◽  
Fons. J. Verbeek ◽  
Katherine Wolstencroft

Abstract Background The hallmarks of cancer provide a highly cited and well-used conceptual framework for describing the processes involved in cancer cell development and tumourigenesis. However, methods for translating these high-level concepts into data-level associations between hallmarks and genes (for high throughput analysis), vary widely between studies. The examination of different strategies to associate and map cancer hallmarks reveals significant differences, but also consensus. Results Here we present the results of a comparative analysis of cancer hallmark mapping strategies, based on Gene Ontology and biological pathway annotation, from different studies. By analysing the semantic similarity between annotations, and the resulting gene set overlap, we identify emerging consensus knowledge. In addition, we analyse the differences between hallmark and gene set associations using Weighted Gene Co-expression Network Analysis and enrichment analysis. Conclusions Reaching a community-wide consensus on how to identify cancer hallmark activity from research data would enable more systematic data integration and comparison between studies. These results highlight the current state of the consensus and offer a starting point for further convergence. In addition, we show how a lack of consensus can lead to large differences in the biological interpretation of downstream analyses and discuss the challenges of annotating changing and accumulating biological data, using intermediate knowledge resources that are also changing over time.
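To make the gene-set comparison concrete, here is a minimal, illustrative Python sketch (not the authors' pipeline) that scores agreement between two studies' hallmark-to-gene mappings with a per-hallmark Jaccard index; the hallmark name and gene sets are toy data.

```python
# Illustrative consensus check: per-hallmark Jaccard overlap between two studies'
# hallmark-to-gene mappings.
from typing import Dict, Set

def jaccard(a: Set[str], b: Set[str]) -> float:
    """Jaccard similarity of two gene sets (0 if both are empty)."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

def hallmark_overlap(study1: Dict[str, Set[str]], study2: Dict[str, Set[str]]):
    """Jaccard similarity for each hallmark present in both mappings."""
    return {h: jaccard(study1[h], study2[h]) for h in study1.keys() & study2.keys()}

# Toy example with hypothetical gene sets:
s1 = {"Sustaining proliferative signaling": {"EGFR", "KRAS", "MYC"}}
s2 = {"Sustaining proliferative signaling": {"EGFR", "MYC", "CCND1"}}
print(hallmark_overlap(s1, s2))  # {'Sustaining proliferative signaling': 0.5}
```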


2012 ◽  
Vol 33 (07) ◽  
pp. 649-656 ◽  
Author(s):  
Mark Holodniy ◽  
Gina Oda ◽  
Patricia L. Schirmer ◽  
Cynthia A. Lucero ◽  
Yury E. Khudyakov ◽  
...  

Objective. To determine whether improper high-level disinfection practices during endoscopy procedures resulted in bloodborne viral infection transmission. Design. Retrospective cohort study. Setting. Four Veterans Affairs medical centers (VAMCs). Patients. Veterans who underwent colonoscopy and laryngoscopy (ear, nose, and throat [ENT]) procedures from 2003 to 2009. Methods. Patients were identified through electronic health record searches and serotested for human immunodeficiency virus (HIV), hepatitis C virus (HCV), and hepatitis B virus (HBV). Newly discovered case patients were linked to a potential source with known identical infection, whose procedure occurred no more than 1 day prior to the case patient's procedure. Viral genetic testing was performed for case/proximate pairs to determine relatedness. Results. Of 10,737 veterans who underwent endoscopy at 4 VAMCs, 9,879 patients agreed to viral testing. Of these, 90 patients were newly diagnosed with 1 or more viral bloodborne pathogens (BBPs). There were no case/proximate pairings found for patients with either HIV or HBV; 24 HCV case/proximate pairings were found, of which 7 case patients and 8 proximate patients had sufficient viral load for further genetic testing. Only 2 of these cases, both of whom underwent laryngoscopy, and their 4 proximates agreed to further testing. None of the 4 remaining proximate patients who underwent colonoscopy agreed to further testing. Mean genetic distance between the 2 case patients and 4 proximate patients ranged from 13.5% to 19.1%. Conclusions. Our investigation revealed that exposure to improperly reprocessed ENT endoscopes did not result in viral transmission in those patients who had viral genetic analysis performed. Any potential transmission of BBPs from colonoscopy remains unknown.
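For context on the reported genetic distances, the sketch below computes a simple pairwise p-distance between two aligned sequences, the kind of measure behind figures such as 13.5% to 19.1%. The sequences are toy data, and this is not the study's analysis pipeline.

```python
# Illustrative p-distance: proportion of differing sites between two aligned
# sequences of equal length (toy data, not the study's HCV sequences).
def p_distance(seq_a: str, seq_b: str) -> float:
    """Proportion of positions at which two aligned sequences differ."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    diffs = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return diffs / len(seq_a)

case = "ACGTACGTACGTACGTACGT"
prox = "ACGTTCGTACGAACGTACCT"
print(f"{p_distance(case, prox):.1%}")  # 15.0% for these toy sequences
```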


2021 ◽  
Vol 12 (4) ◽  
Author(s):  
Peng Chen ◽  
Hongyang Jing ◽  
Mingtao Xiong ◽  
Qian Zhang ◽  
Dong Lin ◽  
...  

Abstract The genes encoding neuregulin 1 (NRG1), a growth factor, and its receptor ErbB4 are both risk factors for major depressive disorder and schizophrenia (SZ). They have been implicated in neural development and synaptic plasticity. However, exactly how NRG1 variations lead to SZ remains unclear. Indeed, NRG1 levels are increased in postmortem brain tissues of patients with brain disorders. Here, we studied the effects of high-level NRG1 on dendritic spine development and function. We showed that spine density in the prefrontal cortex and hippocampus was reduced in mice (ctoNrg1) that overexpressed NRG1 in neurons. The frequency of miniature excitatory postsynaptic currents (mEPSCs) was reduced in both brain regions of ctoNrg1 mice. High expression of NRG1 activated LIMK1 and increased cofilin phosphorylation in postsynaptic densities. Spine reduction was attenuated by inhibiting LIMK1, blocking the NRG1–LIMK1 interaction, or restoring the NRG1 protein level. These results indicate that a normal NRG1 protein level is necessary for spine homeostasis and suggest a pathophysiological mechanism of abnormal spines in relevant brain disorders.


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Mohamed A. Farag ◽  
Moamen M. Elmassry ◽  
Masahiro Baba ◽  
Renée Friedman

Abstract Previous studies have shown that the Ancient Egyptians used malted wheat and barley as the main ingredients in beer brewing, but a chemical determination of the exact recipe is still lacking. To investigate the constituents of ancient beer, we conducted detailed IR- and GC-MS-based metabolite analyses, targeting volatile and non-volatile metabolites, on the residues recovered from the interior of vats in what is currently the world's oldest (c. 3600 BCE) installation for large-scale beer production, located at the major pre-pharaonic political center at Hierakonpolis, Egypt. In addition to distinguishing the chemical signatures of various flavoring agents, such as dates, a significant result of our analysis is the finding, for the first time, of phosphoric acid at a high level, probably used as a preservative much as in modern beverages. This suggests that the early brewers had acquired the knowledge needed to efficiently produce and preserve large quantities of beer. This study provides the most detailed chemical profile of an ancient beer obtained using modern spectrometric techniques and provides evidence for the likely starting materials used in beer brewing.


Author(s):  
Lucas Meyer de Freitas ◽  
Oliver Schuemperlin ◽  
Milos Balac ◽  
Francesco Ciari

This paper shows an application of the multiagent, activity-based transport simulation MATSim to evaluate the equity effects of a congestion charging scheme. A cordon pricing scheme was set up for a scenario of the city of Zurich, Switzerland, to conduct this analysis. Equity is one of the most important barriers to the implementation of a congestion charging system. After the challenges posed by equity evaluations are examined, it is shown that agent-based simulations with heterogeneous values of time allow for an increased level of detail in such evaluations. This detail is achieved through a high level of disaggregation and a 24-h simulation period. An important difference from traditional large-scale models is the low degree of correlation between travel time savings and welfare change. While traditional equity analysis is based on travel time savings, MATSim shows that choice dimensions not included in traditional models, such as departure time changes, can also play an important role in equity effects. The analysis of the results in light of evidence from the literature shows that agent-based models are a promising tool for conducting more complete equity evaluations, not only of congestion charges but of transport policies in general.
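A toy Python sketch (assumed numbers, not MATSim output) can show why travel-time savings and welfare change need not track each other once values of time are heterogeneous and agents also pay tolls or shift their departure times:

```python
# Toy illustration: per-agent welfare change under a cordon toll versus raw
# travel-time savings, with heterogeneous values of time. All figures assumed.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
time_saved_h = rng.normal(0.05, 0.03, n)             # hours saved per agent
value_of_time = rng.lognormal(np.log(20), 0.5, n)    # CHF/h, heterogeneous
toll_paid = rng.choice([0.0, 5.0], n, p=[0.4, 0.6])  # CHF, only cordon crossers pay
schedule_penalty = rng.normal(1.0, 0.8, n)           # CHF cost of shifted departure

welfare_change = time_saved_h * value_of_time - toll_paid - schedule_penalty
corr = np.corrcoef(time_saved_h, welfare_change)[0, 1]
print(f"correlation between time savings and welfare change: {corr:.2f}")
```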

