Amino acid stylization: a new approach for representing protein sequences and scoring protein alignments

2018 ◽  
Author(s):  
Stefano M. Marino

Synopsis: Amino acid (AA) stylization, introduced in this work, is a means to obtain a qualitative and quantitative description of the different AAs; the goal is to enable, via a chemically meaningful representation, the analysis of protein similarity in multiple alignments without involving any statistical or evolutionary information. The AA Stylization Matrix Method (aSMM) has been tested successfully on different data sets, performing in line with the most common statistically based approaches (e.g. BLOSUM62) while offering some unique capabilities, e.g. it can be rapidly configured to over- or under-weight specific properties. Overall, with ad hoc parameterization, aSMM permits a tailored investigation that accounts for different types of substitutions in protein alignments.
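The abstract does not give the actual aSMM parameterization, but the idea of scoring substitutions from weighted chemical properties can be sketched as follows; the property values and weights here are hypothetical, purely illustrative stand-ins.

```python
# Minimal sketch of property-based substitution scoring.
# Property vectors (hydrophobicity, charge, volume) use rough literature-style
# values on arbitrary scales; they are NOT the aSMM parameters.
PROPS = {
    "L": (3.8, 0.0, 167.0),
    "I": (4.5, 0.0, 169.0),
    "K": (-3.9, 1.0, 171.0),
    "D": (-3.5, -1.0, 125.0),
}

def substitution_score(a, b, weights=(1.0, 1.0, 0.01)):
    """Score a substitution as the negative weighted distance between property
    vectors; the weights let specific properties be over- or under-weighted,
    as the abstract describes."""
    pa, pb = PROPS[a], PROPS[b]
    return -sum(w * abs(x - y) for w, x, y in zip(weights, pa, pb))

# A conservative substitution (L->I) scores higher than a drastic one (L->D).
assert substitution_score("L", "I") > substitution_score("L", "D")
```

Identical residues score 0 (the maximum), so the scheme behaves like a substitution matrix without any evolutionary statistics.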

2004 ◽  
Vol 37 (2) ◽  
pp. 231-242 ◽  
Author(s):  
Christopher J. Gilmore ◽  
Gordon Barr ◽  
Jonathan Paisley

A new integrated approach to full powder diffraction pattern analysis is described. This new approach incorporates wavelet-based data pre-processing, non-parametric statistical tests for full-pattern matching, and singular value decomposition to extract quantitative phase information from mixtures. Every measured data point is used in both qualitative and quantitative analyses. The success of this new integrated approach is demonstrated through examples using several test data sets. The methods are incorporated within the commercial software program SNAP-1D, and can be extended to high-throughput powder diffraction experiments.
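The SVD-based extraction of quantitative phase information can be illustrated with a toy linear model: a mixture pattern is a weighted sum of pure-phase reference patterns, and a least-squares solve (computed via SVD internally) recovers the weights. The five-point patterns below are synthetic, not real diffraction data.

```python
import numpy as np

# Reference patterns for two pure phases (toy 5-point intensity profiles).
phase_a = np.array([1.0, 3.0, 0.5, 0.2, 0.0])
phase_b = np.array([0.0, 0.5, 2.0, 1.0, 0.3])
R = np.column_stack([phase_a, phase_b])

# A measured mixture pattern: 70% phase A + 30% phase B.
mixture = 0.7 * phase_a + 0.3 * phase_b

# SVD-backed least squares recovers the phase fractions from every data point.
fractions, *_ = np.linalg.lstsq(R, mixture, rcond=None)
print(np.round(fractions, 3))  # -> [0.7 0.3]
```

Real full-pattern analysis adds wavelet denoising and background handling before this step; the linear-algebra core is the same.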


Author(s):  
Elena Lytvynenko ◽  
Taisiya Kozlova
The changeable and unpredictable development of enterprises' external environment is one of the causes of various types of business activity risks, including logistics risks. The purpose of this article is to develop recommendations for improving the risk management of enterprises' logistics activities in the context of instability. Achieving this goal requires consideration of the main stages of this process with regard to logistics risks, and the provision of advice on improving the risk management process in logistics. The article explores the process of analyzing the risks of an enterprise's logistics activities. Proceeding from the theoretical provisions of management and summarizing practical research experience in the systematic analysis of enterprises' logistics risks, the peculiarities of organizing such analysis are traced, and the main directions for its further improvement are proposed. All actions related to analyzing the risk of enterprise logistics activity are proposed to be carried out in a defined sequence, which is given in the form of a structural scheme of the systematic analysis of logistics risks. Based on the objective existence of logistics risks and the need to manage them rationally, the proposed risk-management algorithm for the enterprise logistics system covers the stages of risk identification, qualitative and quantitative assessment, diagnostics, assessment of risk acceptability, and the application of neutralization measures to unacceptable logistics risks. It is concluded that logistics risks combine different types of risks of all components and elements, both in the process of changing material, financial, and information flows and in the process of managing the risks arising in the logistics system.
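The staged sequence the article describes (identification, quantitative assessment, acceptability check, neutralization of unacceptable risks) can be sketched as a simple pipeline; the scoring rule, risk names, and the acceptability threshold below are hypothetical illustrations, not the article's actual scheme.

```python
# Minimal sketch of the staged risk-management sequence:
# identification -> quantitative assessment -> acceptability check -> action.
def manage_logistics_risks(risks, threshold=0.5):
    """risks: mapping of risk name -> (probability, impact), both in [0, 1].
    Returns each risk's score and the recommended action."""
    results = {}
    for name, (prob, impact) in risks.items():   # identification
        score = prob * impact                    # quantitative assessment
        acceptable = score <= threshold          # acceptability check
        action = "accept" if acceptable else "neutralize"
        results[name] = (round(score, 2), action)
    return results

risks = {"supply delay": (0.9, 0.8), "minor damage": (0.3, 0.2)}
print(manage_logistics_risks(risks))
# -> {'supply delay': (0.72, 'neutralize'), 'minor damage': (0.06, 'accept')}
```

Only risks whose score exceeds the threshold trigger neutralization measures, mirroring the algorithm's final stage.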


2018 ◽  
Author(s):  
Abdallah Dabboussi ◽  
Jaafar Gaber ◽  
Maxime Wack ◽  
Raed Kouta ◽  
Bachar EL Hassan ◽  
...  

2014 ◽  
Vol 28 (2) ◽  
pp. 313-330 ◽  
Author(s):  
R. David Plumlee ◽  
Philip M. J. Reckers

SYNOPSIS: In 2005, an ad hoc committee appointed by the American Accounting Association (AAA) documented a crisis-level shortage of accounting Ph.D.s and recommended significant structural changes to doctoral programs (Kachelmeier, Madeo, Plumlee, Pratt, and Krull 2005). However, subsequent studies show that the shortage continues and the cumulative costs grow (e.g., Fogarty and Holder 2012; Brink, Glasscock, and Wier 2012). The Association to Advance Collegiate Schools of Business (AACSB) recently called for renewed attention to the problem (AACSB 2013b). We contribute to the literature by providing updated information on doctoral programs' responses and, through the eyes of potential candidates, on the continuing impediments to solving the doctoral shortage. In this paper, we present information gathered through surveys of program administrators and of master's and Accounting Doctoral Scholars Program (ADS) students. We explore (1) the cumulative impact of the Ph.D. shortage as of 2013, including its impact on accounting faculty composition across different types of institutions, (2) negative student perceptions of Ph.D. programs and academic accounting careers, which discourage applicants from pursuing Ph.D. programs, and (3) impediments facing institutions in expanding doctoral programs.


2021 ◽  
Vol 10 (2) ◽  
pp. 91
Author(s):  
Triantafyllia-Maria Perivolioti ◽  
Antonios Mouratidis ◽  
Dimitrios Terzopoulos ◽  
Panagiotis Kalaitzis ◽  
Dimitrios Ampatzidis ◽  
...  

Covering an area of approximately 97 km2 and with a maximum depth of 58 m, Lake Trichonis is the largest and one of the deepest natural lakes in Greece. As such, it constitutes an important ecosystem and freshwater reserve at the regional scale, whose qualitative and quantitative properties ought to be monitored. Depth is a crucial parameter, as it is involved in both qualitative and quantitative monitoring aspects. Thus, the availability of a bathymetric model and a reliable DTM (Digital Terrain Model) of such an inland water body is imperative for almost any systematic observation scenario or ad hoc measurement endeavor. In this context, the purpose of this study is to produce a DTM from the only official cartographic source of relevant information available (dating back approximately 70 years) and evaluate its performance against new, independent, high-accuracy hydroacoustic recordings. The validation procedure involves the use of echosoundings coupled with GPS, and is followed by the production of a bathymetric model for the assessment of the discrepancies between the DTM and the measurements, along with the relevant morphometric analysis. Both the production and validation of the DTM are conducted in a GIS environment. The results indicate substantial discrepancies between the old DTM and contemporary acoustic data. A significant overall deviation of the 2019 bathymetric dataset from the ~1950 lake DTM (3.39 ± 5.26 m in absolute bottom elevation differences and 0.00 ± 7.26 m in relative difference residuals; 0.00 ± 2.11 m after fitting a 2nd-order polynomial corrector surface), together with the overall morphometry, appears to be associated with a combination of tectonics, subsidence and karstic phenomena in the area. These observations could prove useful for studies of tectonics, geodynamics and seismicity in the broader Corinth Rift region, as well as for environmental management and technical interventions in and around the lake.
This dictates the necessity for new, extensive bathymetric measurements in order to produce an updated DTM of Lake Trichonis, reflecting current conditions and tailored to contemporary accuracy standards and state-of-the-art research in various disciplines in and around the lake.
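The polynomial corrector surface mentioned in the abstract can be sketched as an ordinary least-squares fit of a 2nd-order polynomial in the horizontal coordinates to the DTM-minus-sonar residuals; the coordinates, residuals, and coefficients below are synthetic, not the study's data.

```python
import numpy as np

# Synthetic DTM-minus-sonar depth residuals over a unit square.
rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
true_surface = 1.5 + 0.8 * x - 0.4 * y + 0.3 * x**2 - 0.2 * x * y + 0.1 * y**2
residuals = true_surface + rng.normal(0, 0.05, 200)

# Design matrix for z = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2.
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coeffs, *_ = np.linalg.lstsq(A, residuals, rcond=None)

# Subtracting the fitted surface shrinks the spread of the residuals,
# analogous to the reported drop from +/-7.26 m to +/-2.11 m.
corrected = residuals - A @ coeffs
print(corrected.std() < residuals.std())
```

Any smooth systematic trend (e.g. from datum or subsidence effects) is absorbed by the surface, leaving only the unstructured measurement scatter.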


2021 ◽  
pp. 000276422110216
Author(s):  
Kazimierz M. Slomczynski ◽  
Irina Tomescu-Dubrow ◽  
Ilona Wysmulek

This article proposes a new approach to analyze protest participation measured in surveys of uneven quality. Because single international survey projects cover only a fraction of the world’s nations in specific periods, researchers increasingly turn to ex-post harmonization of different survey data sets not a priori designed as comparable. However, very few scholars systematically examine the impact of the survey data quality on substantive results. We argue that the variation in source data, especially deviations from standards of survey documentation, data processing, and computer files—proposed by methodologists of Total Survey Error, Survey Quality Monitoring, and Fitness for Intended Use—is important for analyzing protest behavior. In particular, we apply the Survey Data Recycling framework to investigate the extent to which indicators of attending demonstrations and signing petitions in 1,184 national survey projects are associated with measures of data quality, controlling for variability in the questionnaire items. We demonstrate that the null hypothesis of no impact of measures of survey quality on indicators of protest participation must be rejected. Measures of survey documentation, data processing, and computer records, taken together, explain over 5% of the intersurvey variance in the proportions of the populations attending demonstrations or signing petitions.
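The core quantitative claim, that data-quality measures explain a share of the intersurvey variance in protest estimates, can be illustrated with a toy regression and an R-squared computation; the quality indicators, coefficients, and data below are synthetic, not the Survey Data Recycling measures.

```python
import numpy as np

# Synthetic survey-level data: a protest-participation estimate per survey,
# weakly driven by two hypothetical data-quality scores.
rng = np.random.default_rng(1)
n = 500
doc_quality = rng.normal(0, 1, n)    # documentation-quality score
processing = rng.normal(0, 1, n)     # data-processing-quality score
protest_rate = 0.2 + 0.01 * doc_quality + 0.008 * processing + rng.normal(0, 0.05, n)

# OLS fit, then R^2 = share of intersurvey variance explained by quality.
X = np.column_stack([np.ones(n), doc_quality, processing])
beta, *_ = np.linalg.lstsq(X, protest_rate, rcond=None)
resid = protest_rate - X @ beta
r2 = 1 - resid.var() / protest_rate.var()
print(round(r2, 3))  # a small but nonzero share, as in the article's finding
```

A nonzero R-squared of this kind is what rejects the null hypothesis of no quality impact on the substantive indicator.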


Sensors ◽  
2021 ◽  
Vol 21 (5) ◽  
pp. 1573
Author(s):  
Loris Nanni ◽  
Giovanni Minchio ◽  
Sheryl Brahnam ◽  
Gianluca Maguolo ◽  
Alessandra Lumini

Traditionally, classifiers are trained to predict patterns within a feature space. The image classification system presented here trains classifiers to predict patterns within a vector space by combining the dissimilarity spaces generated by a large set of Siamese Neural Networks (SNNs). A set of centroids from the patterns in the training data sets is calculated with supervised k-means clustering. The centroids are used to generate the dissimilarity space via the Siamese networks. The vector space descriptors are extracted by projecting patterns onto the similarity spaces, and SVMs classify an image by its dissimilarity vector. The versatility of the proposed approach in image classification is demonstrated by evaluating the system on different types of images across two domains: two medical data sets and two animal audio data sets with vocalizations represented as images (spectrograms). Results show that the proposed system competes with the best-performing methods in the literature, obtaining state-of-the-art performance on one of the medical data sets, and does so without ad hoc optimization of the clustering methods on the tested data sets.
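The dissimilarity-space idea can be sketched without the deep-learning machinery: patterns are re-described by their distances to a set of centroids, and a classifier then operates on those dissimilarity vectors. In this simplified stand-in, plain Euclidean distance replaces the learned Siamese dissimilarity, one centroid per class replaces supervised k-means, and a nearest-neighbour rule replaces the SVM.

```python
import numpy as np

# Two toy 2-D classes.
rng = np.random.default_rng(2)
class0 = rng.normal([0, 0], 0.3, (20, 2))
class1 = rng.normal([3, 3], 0.3, (20, 2))
X = np.vstack([class0, class1])
y = np.array([0] * 20 + [1] * 20)

# Simplest possible "supervised clustering": one centroid per class.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def dissimilarity_vector(pattern):
    """Project a pattern into the dissimilarity space: its distance to each centroid."""
    return np.linalg.norm(centroids - pattern, axis=1)

train_vectors = np.array([dissimilarity_vector(p) for p in X])

def classify(pattern):
    """Nearest neighbour in the dissimilarity space (SVM stand-in)."""
    d = np.linalg.norm(train_vectors - dissimilarity_vector(pattern), axis=1)
    return y[d.argmin()]

print(classify(np.array([2.9, 3.1])))  # -> 1
```

The full system ensembles many such spaces, one per trained SNN, before classification.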


Membranes ◽  
2021 ◽  
Vol 11 (3) ◽  
pp. 204
Author(s):  
Ievgen Pylypchuk ◽  
Roman Selyanchyn ◽  
Tetyana Budnyak ◽  
Yadong Zhao ◽  
Mikael Lindström ◽  
...  

Nanocellulose membranes based on tunicate-derived cellulose nanofibers, starch, and ~5% wood-derived lignin were investigated using three different types of lignin. The addition of lignin into cellulose membranes increased the specific surface area (from 5 to ~50 m2/g); however, the fine porous geometry of the nanocellulose, with characteristic pores below 10 nm in diameter, remained similar for all membranes. The permeation of H2, CO2, N2, and O2 through the membranes was investigated, and characteristic Knudsen diffusion through the membranes was observed, at rates proportional to the inverse of the gases' molecular sizes. Permeability values, however, varied significantly between samples containing different lignins, ranging from several to thousands of barrers (10−10 cm3 (STP) cm cm−2 s−1 cmHg−1), and were related to the observed morphology and lignin distribution inside the membranes. Additionally, the addition of ~5% lignin resulted in a significant increase in tensile strength, from 3 GPa to ~6–7 GPa, but did not change thermal properties (glass transition or thermal stability). Overall, the combination of plant-derived lignin as a filler or binder in cellulose–starch composites with a sea-animal-derived nanocellulose presents an interesting new approach for the fabrication of membranes from abundant bio-derived materials. Future studies should focus on the optimization of these types of membranes for the selective and fast transport of gases needed for a variety of industrial separation processes.
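In the Knudsen regime (pores much narrower than the mean free path), permeance scales with the inverse square root of molar mass, so light gases permeate faster. A short sketch of the expected relative ordering for the four gases studied:

```python
import math

# Molar masses in g/mol (standard values).
MOLAR_MASS = {"H2": 2.016, "N2": 28.014, "O2": 31.998, "CO2": 44.009}

def knudsen_relative_permeance(gas, reference="H2"):
    """Permeance of `gas` relative to `reference` under Knudsen diffusion:
    permeance ~ 1/sqrt(M), so the ratio is sqrt(M_ref / M_gas)."""
    return math.sqrt(MOLAR_MASS[reference] / MOLAR_MASS[gas])

for gas in MOLAR_MASS:
    print(f"{gas}: {knudsen_relative_permeance(gas):.2f}")
# H2 is the fastest; CO2, the heaviest, is the slowest (~0.21 of H2).
```

Deviations from this 1/sqrt(M) ordering are one common way to diagnose transport mechanisms other than Knudsen flow.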


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Maria Littmann ◽  
Michael Heinzinger ◽  
Christian Dallago ◽  
Tobias Olenyi ◽  
Burkhard Rost

Abstract: Knowing protein function is crucial to advance molecular and medical biology, yet experimental function annotations through the Gene Ontology (GO) exist for fewer than 0.5% of all known proteins. Computational methods bridge this sequence-annotation gap typically through homology-based annotation transfer by identifying sequence-similar proteins with known function or through prediction methods using evolutionary information. Here, we propose predicting GO terms through annotation transfer based on proximity of proteins in the SeqVec embedding rather than in sequence space. These embeddings originate from deep learned language models (LMs) for protein sequences (SeqVec) transferring the knowledge gained from predicting the next amino acid in 33 million protein sequences. Replicating the conditions of CAFA3, our method reaches an Fmax of 37 ± 2%, 50 ± 3%, and 57 ± 2% for BPO, MFO, and CCO, respectively. Numerically, this appears close to the top ten CAFA3 methods. When restricting the annotation transfer to proteins with < 20% pairwise sequence identity to the query, performance drops (Fmax BPO 33 ± 2%, MFO 43 ± 3%, CCO 53 ± 2%); this still outperforms naïve sequence-based transfer. Preliminary results from CAFA4 appear to confirm these findings. Overall, this new concept is likely to change the annotation of proteins, in particular for proteins from smaller families or proteins with intrinsically disordered regions.
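The transfer step itself is a nearest-neighbour lookup in embedding space: find the labelled protein whose embedding is closest to the query and copy its GO terms. The three-dimensional vectors and protein names below are toy stand-ins for the 1024-dimensional SeqVec embeddings and real annotated proteins.

```python
import numpy as np

# Toy "database" of labelled proteins: embedding + GO annotations.
embeddings = {
    "prot_A": np.array([0.9, 0.1, 0.0]),
    "prot_B": np.array([0.0, 0.8, 0.6]),
}
go_terms = {"prot_A": {"GO:0003824"}, "prot_B": {"GO:0005215"}}

def transfer_annotations(query_embedding):
    """Copy GO terms from the nearest labelled protein in embedding space."""
    nearest = min(embeddings,
                  key=lambda p: np.linalg.norm(embeddings[p] - query_embedding))
    return go_terms[nearest]

print(transfer_annotations(np.array([0.85, 0.2, 0.1])))  # -> {'GO:0003824'}
```

The published method generalizes this to the k nearest hits with similarity-derived reliability scores, but the proximity-based transfer shown here is the core idea.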


Electronics ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 739
Author(s):  
Nicholas Ayres ◽  
Lipika Deka ◽  
Daniel Paluszczyszyn

The vehicle-embedded system, also known as the electronic control unit (ECU), has transformed the humble motorcar, making it more efficient, environmentally friendly, and safer, but has led to a system which is highly dependent on software. As new technologies and features are included with each new vehicle model, the increased reliance on software will no doubt continue. It is an undeniable fact that all software contains bugs, errors, and potential vulnerabilities, which when discovered must be addressed in a timely manner, primarily through patching and updates, to preserve vehicle and occupant safety and integrity. However, current automotive software updating practices are ad hoc at best and often follow the same inefficient fix mechanisms associated with a physical component failure: return or recall. Increasing vehicle connectivity heralds the potential for over-the-air (OtA) software updates, but rigid ECU hardware design does not often facilitate or enable OtA updating. To address the associated issues regarding automotive ECU-based software updates, a new approach to how automotive software is deployed to the ECU is required. This paper presents how lightweight virtualisation technologies known as containers can promote efficient automotive ECU software updates. ECU functional software can be deployed to a container built from an associated image. Container images promote efficiency in download size and times through layer sharing, similar to ECU difference or delta flashing. Through containers, connectivity and future OtA software updates can be completed without inconvenience to the consumer or expense to the manufacturer.
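The layer-sharing saving the abstract describes can be sketched with simple arithmetic: only layers the ECU does not already hold need to be fetched, much like delta flashing. Layer names and sizes (in MB) below are hypothetical.

```python
# Hypothetical container images as layer -> size-in-MB mappings.
installed_image = {"base-os": 120, "runtime": 45, "ecu-app:v1": 8}
update_image = {"base-os": 120, "runtime": 45, "ecu-app:v2": 9}

def download_size(installed, update):
    """Sum the sizes of update layers not already present on the ECU;
    shared layers ('base-os', 'runtime') are never re-downloaded."""
    return sum(size for layer, size in update.items() if layer not in installed)

print(download_size(installed_image, update_image))  # -> 9 (MB), not 174
```

Because the unchanged base layers are shared, the OtA update costs roughly the size of the application layer alone rather than the full image.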

