Emerging concepts in pseudoenzyme classification, evolution, and signaling

2019 ◽  
Vol 12 (594) ◽  
pp. eaat9797 ◽  
Author(s):  
António J. M. Ribeiro ◽  
Sayoni Das ◽  
Natalie Dawson ◽  
Rossana Zaru ◽  
Sandra Orchard ◽  
...  

The 21st century is witnessing an explosive surge in our understanding of pseudoenzyme-driven regulatory mechanisms in biology. Pseudoenzymes are proteins that have sequence homology with enzyme families but that are proven or predicted to lack enzyme activity due to mutations in otherwise conserved catalytic amino acids. The best-studied pseudoenzymes are pseudokinases, although examples from other families are emerging at a rapid rate as experimental approaches catch up with an avalanche of freely available informatics data. Kingdom-wide analysis in prokaryotes, archaea and eukaryotes reveals that between 5 and 10% of proteins that make up enzyme families are pseudoenzymes, with notable expansions and contractions seemingly associated with specific signaling niches. Pseudoenzymes can allosterically activate canonical enzymes, act as scaffolds to control assembly of signaling complexes and their localization, serve as molecular switches, or regulate signaling networks through substrate or enzyme sequestration. Molecular analysis of pseudoenzymes is rapidly advancing knowledge of how they perform noncatalytic functions and is enabling the discovery of unexpected, and previously unappreciated, functions of their intensively studied enzyme counterparts. Notably, upon further examination, some pseudoenzymes have previously unknown enzymatic activities that could not have been predicted a priori. Pseudoenzymes can be targeted and manipulated by small molecules and therefore represent new therapeutic targets (or anti-targets, where intervention should be avoided) in various diseases. In this review, which brings together broad bioinformatics and cell signaling approaches in the field, we highlight a selection of findings relevant to a contemporary understanding of pseudoenzyme-based biology.
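
As a minimal illustration of the classification idea described above (not code from the reviewed article), the following Python sketch flags candidate pseudokinases by checking whether canonical catalytic residues of the kinase domain are substituted in a pre-aligned sequence; the alignment columns used here are hypothetical placeholders.

```python
# Illustrative sketch (not from the reviewed article): flag putative
# pseudokinases by checking conserved catalytic residues in sequences
# that have already been aligned to a kinase-domain reference.

# Hypothetical alignment column indices for the three canonical catalytic
# residues (VAIK lysine, HRD aspartate, DFG aspartate); a real analysis
# would derive these from a curated profile/HMM alignment.
CATALYTIC_COLUMNS = {"VAIK_K": 30, "HRD_D": 125, "DFG_D": 145}
EXPECTED = {"VAIK_K": "K", "HRD_D": "D", "DFG_D": "D"}

def classify_kinase(aligned_seq: str) -> dict:
    """Report which catalytic residues are substituted in an aligned sequence."""
    substitutions = {
        motif: aligned_seq[col]
        for motif, col in CATALYTIC_COLUMNS.items()
        if aligned_seq[col] != EXPECTED[motif]
    }
    return {
        "candidate_pseudokinase": bool(substitutions),
        "substituted_residues": substitutions,
    }

# Toy aligned sequence (gaps and real coordinates omitted for brevity).
toy = "A" * 30 + "N" + "A" * 94 + "D" + "A" * 19 + "D" + "A" * 10
print(classify_kinase(toy))
```

As the review stresses, such sequence-based predictions are only a starting point: some predicted pseudoenzymes later turn out to retain unexpected catalytic activities.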

Author(s):  
Maria A. Milkova

Information now accumulates so rapidly that the familiar concept of iterative search needs revision. In a world oversaturated with information, comprehensively covering and analyzing a problem places high demands on search methods. An innovative approach to search should flexibly take into account the large body of already accumulated knowledge together with a priori requirements on the results. The results, in turn, should immediately provide a roadmap of the field under study, with the possibility of drilling down into as much detail as needed. A search approach based on topic modeling, so-called topic search, satisfies these requirements and thereby streamlines the way we work with information, increases the efficiency of knowledge production, and helps avoid cognitive biases in the perception of information, which matters at both the micro and macro levels. To demonstrate an application of topic search, the article considers the task of analyzing an import substitution program on the basis of patent data. The program includes plans for 22 industries and contains more than 1,500 products and technologies proposed for import substitution. Patent search based on topic modeling makes it possible to search directly by blocks of a priori information, namely the terms of the industrial import substitution plans, and to obtain as output a selection of relevant documents for each industry. This approach not only provides a comprehensive picture of the effectiveness of the program as a whole, but also yields more detailed information about which groups of products and technologies have been patented.
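
The article does not reproduce its implementation here, but the idea of topic search can be sketched in a few lines: fit a topic model on the patent corpus, project a block of a priori terms from an industry plan into the same topic space, and rank patents by similarity. The snippet below is an illustration using scikit-learn's LDA on toy data; the corpus, plan terms, and number of topics are placeholders.

```python
# Minimal sketch of "topic search" (my own illustration; the article's
# actual pipeline and data are not reproduced here).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

# Toy patent abstracts; a real run would use the full patent corpus.
patents = [
    "gas turbine blade coating for high temperature operation",
    "composite turbine blade with ceramic thermal barrier coating",
    "machine learning method for medical image segmentation",
    "neural network system for diagnostic image analysis",
    "drilling fluid composition for offshore oil wells",
]

# Block of a priori terms taken from one industry plan (hypothetical example).
plan_terms = ["turbine", "blade", "thermal", "coating"]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(patents)

# Fit a small topic model; the number of topics is a tuning choice.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topics = lda.fit_transform(X)                  # documents in topic space

# Project the plan's term block into the same topic space and rank patents.
query_topics = lda.transform(vectorizer.transform([" ".join(plan_terms)]))
scores = cosine_similarity(query_topics, doc_topics).ravel()

for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.3f}  {patents[idx]}")
```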


Author(s):  
Laure Fournier ◽  
Lena Costaridou ◽  
Luc Bidaut ◽  
Nicolas Michoux ◽  
Frederic E. Lecouvet ◽  
...  

Existing quantitative imaging biomarkers (QIBs) are associated with known biological tissue characteristics and follow a well-understood path of technical, biological and clinical validation before incorporation into clinical trials. In radiomics, novel data-driven processes extract numerous visually imperceptible statistical features from the imaging data with no a priori assumptions on their correlation with biological processes. The selection of relevant features (radiomic signature) and incorporation into clinical trials therefore requires additional considerations to ensure meaningful imaging endpoints. Also, the number of radiomic features tested means that power calculations would result in sample sizes impossible to achieve within clinical trials. This article examines how the process of standardising and validating data-driven imaging biomarkers differs from those based on biological associations. Radiomic signatures are best developed initially on datasets that represent diversity of acquisition protocols as well as diversity of disease and of normal findings, rather than within clinical trials with standardised and optimised protocols, as this would risk the selection of radiomic features being linked to the imaging process rather than the pathology. Normalisation through discretisation and feature harmonisation are essential pre-processing steps. Biological correlation may be performed after the technical and clinical validity of a radiomic signature is established, but is not mandatory. Feature selection may be part of discovery within a radiomics-specific trial or represent exploratory endpoints within an established trial; a previously validated radiomic signature may even be used as a primary/secondary endpoint, particularly if associations are demonstrated with specific biological processes and pathways being targeted within clinical trials.

Key Points
• Data-driven processes like radiomics risk false discoveries due to high-dimensionality of the dataset compared to sample size, making adequate diversity of the data, cross-validation and external validation essential to mitigate the risks of spurious associations and overfitting.
• Use of radiomic signatures within clinical trials requires multistep standardisation of image acquisition, image analysis and data mining processes.
• Biological correlation may be established after clinical validation but is not mandatory.
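
As one concrete, hedged illustration of the pre-processing steps named above (discretisation and feature harmonisation), the NumPy sketch below uses fixed-bin-width intensity discretisation and a crude per-scanner z-scoring in place of the batch-correction methods (such as ComBat) typically used in practice; the data are synthetic.

```python
# Sketch of the two pre-processing steps named above: intensity discretisation
# and per-scanner feature harmonisation. Illustration only; published radiomics
# pipelines typically follow IBSI guidance and use ComBat-style harmonisation.
import numpy as np

def discretise(image: np.ndarray, bin_width: float = 25.0) -> np.ndarray:
    """Fixed-bin-width discretisation of voxel intensities."""
    return np.floor((image - image.min()) / bin_width).astype(int)

def harmonise(features: np.ndarray, scanner_ids: np.ndarray) -> np.ndarray:
    """Crude harmonisation: z-score each feature within each scanner group.

    (Stand-in for proper batch-effect correction such as ComBat.)
    """
    out = np.empty_like(features, dtype=float)
    for scanner in np.unique(scanner_ids):
        mask = scanner_ids == scanner
        grp = features[mask]
        out[mask] = (grp - grp.mean(axis=0)) / (grp.std(axis=0) + 1e-9)
    return out

rng = np.random.default_rng(0)
image = rng.normal(100, 30, size=(4, 4, 4))        # toy image volume
print(discretise(image).max() + 1, "intensity bins")

features = rng.normal(size=(6, 3))                 # 6 patients x 3 radiomic features
scanners = np.array([0, 0, 0, 1, 1, 1])
print(harmonise(features, scanners).round(2))
```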


2021 ◽  
Vol 22 (9) ◽  
pp. 4728
Author(s):  
Tanuza Das ◽  
Eun Joo Song ◽  
Eunice EunKyeong Kim

Ubiquitination and deubiquitination are protein post-translational modification processes that have been recognized as crucial mediators of many complex cellular networks, including maintaining ubiquitin homeostasis, controlling protein stability, and regulating several signaling pathways. Therefore, some of the enzymes involved in ubiquitination and deubiquitination, particularly E3 ligases and deubiquitinases, have attracted attention for drug discovery. Here, we review recent findings on USP15, one of the deubiquitinases, which regulates diverse signaling pathways by deubiquitinating vital target proteins. Although previous studies have uncovered the versatile roles of USP15 in different signaling networks, these findings have not yet been systematically and specifically reviewed, even though such a synthesis could provide important information about possible disease markers and clinical applications. This review provides a comprehensive overview of our current understanding of the regulatory mechanisms of USP15 in the different signaling pathways for which dynamic reverse ubiquitination is a key regulator.


2019 ◽  
Vol 11 (23) ◽  
pp. 2952-2959 ◽  
Author(s):  
Jessica Pandohee ◽  
Robert J. Rees ◽  
Michelle J. S. Spencer ◽  
Aaron Raynor ◽  
Oliver A. H. Jones

This paper outlines a protocol that combines quantum mechanics calculations and experimental synthesis to enable systematic selection of suitable chromophores based on their fluorescence stability and the efficiency of the chemical reaction.
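
The abstract does not specify the quantum mechanics package or descriptors used, so the following PySCF sketch is only an assumed illustration of one common screening proxy, the HOMO-LUMO gap of a candidate chromophore, computed here for a toy molecule with a minimal basis set.

```python
# Illustrative sketch only: the abstract does not name the quantum mechanics
# software or descriptors, so this computes a HOMO-LUMO gap with PySCF as one
# crude, commonly used proxy for a candidate chromophore's electronic properties.
from pyscf import gto, scf

# Toy molecule (water) with a minimal basis; a real chromophore screen would
# use the actual candidate structures and a larger basis set, DFT, or TD-DFT.
mol = gto.M(atom="O 0 0 0; H 0 0 0.96; H 0.93 0.24 0", basis="sto-3g")
mf = scf.RHF(mol).run()

occupied = mf.mo_energy[mf.mo_occ > 0]
virtual = mf.mo_energy[mf.mo_occ == 0]
gap_hartree = virtual.min() - occupied.max()
print(f"HOMO-LUMO gap: {gap_hartree * 27.2114:.2f} eV")
```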


PPAR Research ◽  
2009 ◽  
Vol 2009 ◽  
pp. 1-9 ◽  
Author(s):  
Lap Shu Alan Chan ◽  
Richard A. Wells

The PPARs are integral parts of the RXR-dependent signaling networks. Many other nuclear receptor subfamily 1 members also require RXR as their obligatory heterodimerization partner, and they are often co-expressed in any given tissue. Therefore, the PPARs often compete with other RXR-dependent nuclear receptors, and this competition has important biological implications. Thorough understanding of this cross-talk at the molecular level is crucial to determine the detailed functional roles of the PPARs. At the level of DNA binding, most RXR heterodimers bind selectively to the well-known “DR1 to 5” DNA response elements. As a result, many heterodimers share the same DR element and must compete with each other for DNA binding. At the level of heterodimerization, the partners of RXR share the same RXR dimerization interface. As a result, individual nuclear receptors must compete with each other for RXR in order to form functional heterodimers. Cross-talk through DNA binding and RXR heterodimerization presents challenges to the study of these nuclear receptors that cannot be adequately addressed by current experimental approaches. Novel tools, such as engineered nuclear receptors with altered dimerization properties, are currently being developed. These tools will enable future studies to dissect specific RXR heterodimers and their signaling pathways.


2021 ◽  
Vol 13 (22) ◽  
pp. 4509
Author(s):  
Gaspare Galati ◽  
Gabriele Pavan ◽  
Kubilay Savci ◽  
Christoph Wasserzier

In defense applications, the main features of radars are the Low Probability of Intercept (LPI) and the Low Probability of Exploitation (LPE). Thanks to ongoing technological progress, the counterpart deploys increasingly capable intercept receivers and signal processors. Noise Radar Technology (NRT) is probably a very effective answer to the increasing demand for operational LPI/LPE radars. The design and selection of the radiated waveforms, while respecting the prescribed spectrum occupancy, have to comply with the contrasting requirements of LPI/LPE and of a favorable shape of the ambiguity function. Information theory seems to be a “technologically agnostic” tool for quantifying the LPI/LPE capability of noise waveforms with little or no a priori knowledge of the means and strategies used by the counterpart. An information-theoretic analysis can lead to practical results in the design and selection of NRT waveforms.
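
As a hedged illustration of this idea (not the authors' specific metric), the NumPy sketch below compares the spectral entropy of a Gaussian noise waveform with that of a deterministic linear chirp; a flatter, higher-entropy spectrum is one simple proxy for how noise-like, and therefore hard to exploit, a transmitted waveform is.

```python
# Hedged illustration (not the authors' metric): compare the spectral entropy
# of a Gaussian noise waveform with that of a linear chirp as a crude proxy
# for how "noise-like", and hence harder to exploit, a radar waveform is.
import numpy as np

def spectral_entropy(x: np.ndarray) -> float:
    """Shannon entropy (bits) of the normalised power spectrum of x."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
n, fs = 4096, 1.0e6                                 # samples, sample rate (Hz)
t = np.arange(n) / fs

noise_waveform = rng.normal(size=n)                 # idealised noise radar waveform
chirp = np.cos(2 * np.pi * (50e3 * t + (100e3 / (2 * t[-1])) * t**2))  # LFM pulse

print(f"noise : {spectral_entropy(noise_waveform):.2f} bits")
print(f"chirp : {spectral_entropy(chirp):.2f} bits")
```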

