Workflow for Biocatalytic Reaction Performance Prediction and Analysis

Author(s):  
Hanna Clements ◽  
Autumn Flynn ◽  
Bryce Nicholls ◽  
Daria Grosheva ◽  
Todd Hyster ◽  
...  

The development of predictive tools to assess enzyme mutant performance, together with physical organic approaches to enzyme mechanistic interrogation, is crucial to the field of biocatalysis. While many indispensable tools address qualitative aspects of biocatalytic reaction design, they often require extensive experimental data sets or a priori knowledge of the reaction mechanism, and quantitative prediction of enzyme performance remains lacking. Herein, we present a workflow that merges computational and experimental data to produce statistical models that predict the performance of new substrates and enzyme mutants while also providing insight into reaction mechanism. As a validating case study, this platform was applied to investigate a non-native enantioselective photoenzymatic radical cyclization. Statistical models enabled interrogation of the reaction mechanism, and the predictive capabilities of these same models led to quantitative prediction of the enantioselectivities of new substrates with several enzyme mutants. This platform was constructed for application to any biocatalytic system in which mechanistic interrogation, prediction of reaction performance with new substrates, or quantitative prediction of enzyme mutant performance would be desirable. Overall, this proof-of-concept study provides a new tool to complement existing protein engineering and reaction design strategies.
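
A hedged sketch of the kind of statistical modelling such a workflow implies: a single-descriptor linear model relating a substrate descriptor to enantioselectivity on a free-energy scale. The descriptor values, data, and the one-parameter model are illustrative assumptions, not the authors' actual model.

```python
import math

R, T = 1.987e-3, 298.0  # gas constant in kcal/(mol*K), temperature in K

def ee_to_ddg(ee):
    # A common convention: map enantiomeric excess (0..1) onto a
    # free-energy scale, ddG = -RT * ln(er), with er = (1+ee)/(1-ee).
    return -R * T * math.log((1 + ee) / (1 - ee))

def fit_line(x, y):
    # Ordinary least squares for y ~ a + b*x (one descriptor).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical descriptor values (e.g. a steric parameter) and measured ee:
desc = [0.5, 1.0, 1.5, 2.0]
ee = [0.60, 0.75, 0.86, 0.92]
a, b = fit_line(desc, [ee_to_ddg(e) for e in ee])
pred_ddg = a + b * 1.25  # predicted ddG for a new substrate's descriptor
```

In practice such workflows use many computed and experimental descriptors and multivariate models; the single-descriptor fit above only shows the shape of the prediction step.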

Author(s):  
P. Wiącek

Abstract. Due to the increasing range of work carried out with UAVs in recent years, the accuracy of the final product has become increasingly important. However, obtaining survey-grade accuracy requires a bundle adjustment process that can be affected by multiple factors, such as unstable camera calibration, correlation between interior and exterior orientation, insufficient georeferencing information, and software settings. During the project, multi-variant flights over a test field were conducted with a fixed-wing airframe carrying an on-board PPK receiver. Based on these flights, a database of multifactorial data sets was prepared, containing hundreds of independent adjustment variants that differ in georeferencing method, flight configuration, additional camera calibration corrections, tie point filtering, and a priori accuracy settings. The database made it possible to investigate the separate influence of each factor on the final results using ANOVA statistical models.
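
A minimal sketch of the statistical test underlying such a factor analysis, assuming a single factor (georeferencing method) and invented RMSE values; the real study compares several factors across hundreds of adjustment variants.

```python
def one_way_anova_F(groups):
    """Return the one-way ANOVA F statistic for k groups of measurements."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Illustrative RMSE values [cm] of variants under three georeferencing setups:
ppk_only = [2.1, 2.3, 2.0, 2.2]
gcp_only = [1.6, 1.5, 1.7, 1.6]
ppk_gcp  = [1.4, 1.3, 1.5, 1.4]
F = one_way_anova_F([ppk_only, gcp_only, ppk_gcp])
# A large F relative to the F(k-1, n-k) critical value flags the factor
# as having a significant influence on final accuracy.
```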


2020 ◽  
Author(s):  
Coraline Mattei ◽  
Manabu Shiraiwa ◽  
Ulrich Pöschl ◽  
Thomas Berkemeier

The ozonolysis of oleic acid on aerosol particles has been extensively studied in the past and is often used as a benchmark reaction for the study of organic particle oxidation. However, to date, no single kinetic model has reconciled the vastly differing reactive uptake coefficients reported in the literature, which were obtained at different oxidant concentrations and particle sizes and with various commonly used laboratory setups (single-particle trap, aerosol flow tube, and environmental chamber). We combine the kinetic multi-layer model of aerosol surface and bulk chemistry (KM-SUB, Shiraiwa et al. 2010) with the Monte Carlo Genetic Algorithm (MCGA, Berkemeier et al. 2017) to simultaneously describe nine experimental data sets with a single set of kinetic parameters. The KM-SUB model treats chemistry and mass transport of reactants and products in the gas and particle phases explicitly, based on molecular-level chemical and physical properties. The MCGA is a global optimization routine that aids in the unbiased determination of these model parameters and can be used to assess parameter uncertainty. This methodology enables us to derive information from laboratory experiments using a "big data approach" by accounting for a large amount of data at the same time.

We show that a simple reaction mechanism including only the surface and bulk ozonolysis of oleic acid allows for the reconciliation of just some of the data sets. An accurate description of the entire reaction system can only be accomplished if secondary chemistry is considered, and we present an extended reaction mechanism including reactive oxygen intermediates. The presence of reactive oxygen species on surfaces of particulate matter might play an important role in understanding aerosol surface phenomena, organic aerosol evolution, and their health effects.

References

Berkemeier, T. et al.: Technical note: Monte Carlo genetic algorithm (MCGA) for model analysis of multiphase chemical kinetics to determine transport and reaction rate coefficients using multiple experimental data sets, Atmos. Chem. Phys., 17, 8021-8029, 2017.

Shiraiwa, M., Pfrang, C., and Pöschl, U.: Kinetic multi-layer model of aerosol surface and bulk chemistry (KM-SUB): the influence of interfacial transport and bulk diffusion on the oxidation of oleic acid by ozone, Atmos. Chem. Phys., 10, 3673-3691, 2010.
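
A toy sketch of the Monte Carlo genetic algorithm idea (broad random sampling followed by mutation-and-selection refinement), here fitting a single first-order rate constant to synthetic decay data; the published MCGA optimizes many KM-SUB parameters against nine data sets simultaneously, so this only illustrates the two-stage structure.

```python
import math, random

random.seed(0)

t = [0.0, 1.0, 2.0, 3.0, 4.0]
obs = [math.exp(-0.7 * ti) for ti in t]   # synthetic data, true k = 0.7

def cost(k):
    # Sum of squared residuals between the model and the "measurements"
    return sum((math.exp(-k * ti) - o) ** 2 for ti, o in zip(t, obs))

# Stage 1, Monte Carlo: broad random sampling of the parameter space,
# keeping the 10 best candidates.
pop = sorted((random.uniform(0.01, 5.0) for _ in range(50)), key=cost)[:10]

# Stage 2, genetic: mutate survivors, select elitistically, repeat.
for _ in range(60):
    children = [k * random.uniform(0.9, 1.1) for k in pop]
    pop = sorted(pop + children, key=cost)[:10]

best_k = pop[0]   # should land close to the true rate constant
```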


Author(s):  
Cyprian Suchocki ◽  
Stanisław Jemioło

Abstract. In this work a number of selected, isotropic, invariant-based hyperelastic models are analyzed. The considered constitutive relations of hyperelasticity include the model by Gent (G) and its extension, the so-called generalized Gent model (GG), the exponential-power-law model (Exp-PL), and the power-law model (PL). The material parameters of the models under study have been identified for eight different experimental data sets. As demonstrated, the much-celebrated Gent model does not always allow an acceptable quality of experimental data approximation to be obtained. Furthermore, it is observed that the best curve-fitting quality is usually achieved when the experimentally derived conditions proposed by Rivlin and Saunders are fulfilled. However, it is shown that the conditions of Rivlin and Saunders contradict the mathematical requirements of stored-energy polyconvexity. A polyconvex stored-energy function is assumed in order to ensure the existence of solutions to a properly defined boundary value problem and to avoid non-physical material response. It is found that, for the analyzed hyperelastic models, the application of polyconvexity conditions leads to only a slight decrease in curve-fitting quality. When energy polyconvexity is assumed, the best experimental data approximation is usually obtained for the PL model. Among the non-polyconvex hyperelastic models, the best curve-fitting results are most frequently achieved for the GG model. However, it is shown that both the G and GG models are problematic due to the presence of the locking effect.
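
For reference, the Gent stored-energy function discussed above can be sketched as follows; the parameter values are illustrative, and the divergence of W as I1 - 3 approaches Jm is the locking effect mentioned in the abstract.

```python
import math

# Gent (G) stored-energy function:
#   W(I1) = -(mu * Jm / 2) * ln(1 - (I1 - 3) / Jm)
# with mu the shear modulus and Jm the locking constant (values illustrative).
mu, Jm = 1.0, 10.0

def I1_uniaxial(lam):
    # First invariant of the left Cauchy-Green tensor for an
    # incompressible uniaxial stretch lam.
    return lam ** 2 + 2.0 / lam

def W_gent(I1):
    return -0.5 * mu * Jm * math.log(1.0 - (I1 - 3.0) / Jm)

# The "locking effect": W diverges as I1 - 3 approaches Jm, so the model
# cannot represent stretches beyond the locking stretch.
energies = [W_gent(I1_uniaxial(l)) for l in (1.5, 2.0, 2.5, 3.0)]
```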


2021 ◽  
pp. 000276422110216
Author(s):  
Kazimierz M. Slomczynski ◽  
Irina Tomescu-Dubrow ◽  
Ilona Wysmulek

This article proposes a new approach to analyzing protest participation measured in surveys of uneven quality. Because single international survey projects cover only a fraction of the world's nations in specific periods, researchers increasingly turn to ex-post harmonization of different survey data sets not a priori designed as comparable. However, very few scholars systematically examine the impact of survey data quality on substantive results. We argue that variation in source data, especially deviations from the standards of survey documentation, data processing, and computer files proposed by methodologists of Total Survey Error, Survey Quality Monitoring, and Fitness for Intended Use, is important for analyzing protest behavior. In particular, we apply the Survey Data Recycling framework to investigate the extent to which indicators of attending demonstrations and signing petitions in 1,184 national survey projects are associated with measures of data quality, controlling for variability in the questionnaire items. We demonstrate that the null hypothesis of no impact of measures of survey quality on indicators of protest participation must be rejected. Measures of survey documentation, data processing, and computer records, taken together, explain over 5% of the intersurvey variance in the proportions of the populations attending demonstrations or signing petitions.
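
A toy illustration of the "variance explained" quantity reported above, assuming a single invented survey-quality score; the study itself regresses on several documentation, processing, and file-quality measures across 1,184 projects.

```python
def r_squared(x, y):
    """Share of variance in y explained by a simple regression on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    a0 = my - b * mx
    ss_res = sum((c - (a0 + b * a)) ** 2 for a, c in zip(x, y))
    ss_tot = sum((c - my) ** 2 for c in y)
    return 1.0 - ss_res / ss_tot

quality = [0.2, 0.4, 0.5, 0.7, 0.9]       # invented documentation/processing score
protest = [0.11, 0.14, 0.13, 0.17, 0.18]  # invented share attending demonstrations
share_explained = r_squared(quality, protest)
```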


2014 ◽  
Vol 11 (2) ◽  
pp. 68-79
Author(s):  
Matthias Klapperstück ◽  
Falk Schreiber

Summary. The visualization of biological data has gained increasing importance in recent years. A large number of methods and software tools are available that visualize biological data, including the combination of measured experimental data with biological networks. As networks grow in size, their handling and exploration become a challenging task for the user. In addition, scientists are interested in investigating not just a single kind of network but combinations of different types of networks, such as metabolic, gene regulatory, and protein interaction networks. Therefore, fast access, abstract and dynamic views, and intuitive exploratory methods should be provided to search and extract information from the networks. This paper introduces a conceptual framework for handling and combining multiple network sources that enables abstract viewing and exploration of large data sets, including additional experimental data. It introduces a three-tier structure that links network data to multiple network views, discusses a proof-of-concept implementation, and shows a specific visualization method for combining metabolic and gene regulatory networks in an example.
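
A hypothetical sketch of the three-tier idea (data tier, link tier, view tier); all class, method, and entity names are invented for illustration and are not from the paper's software.

```python
class NetworkData:                      # tier 1: raw network sources
    def __init__(self, name, edges):
        self.name, self.edges = name, edges
    def nodes(self):
        return {n for e in self.edges for n in e}

class LinkTier:                         # tier 2: cross-network identity links
    def __init__(self):
        self.links = {}                 # entity -> set of (network name, node)
    def link(self, entity, network, node):
        self.links.setdefault(entity, set()).add((network.name, node))

class NetworkView:                      # tier 3: a view over the linked networks
    def __init__(self, link_tier, networks):
        self.link_tier, self.networks = link_tier, networks
    def occurrences(self, entity):
        return sorted(self.link_tier.links.get(entity, set()))

metabolic = NetworkData("metabolic", [("glucose", "g6p")])
regulatory = NetworkData("regulatory", [("tf1", "hxk_gene")])
links = LinkTier()
links.link("hexokinase", metabolic, "g6p")        # enzyme product node
links.link("hexokinase", regulatory, "hxk_gene")  # gene encoding the enzyme
view = NetworkView(links, [metabolic, regulatory])
# view.occurrences("hexokinase") lists where the entity appears across networks.
```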


2015 ◽  
Vol 2015 ◽  
pp. 1-13
Author(s):  
Jianwei Ding ◽  
Yingbo Liu ◽  
Li Zhang ◽  
Jianmin Wang

Condition monitoring systems are widely used to monitor the working condition of equipment, generating a vast amount and variety of telemetry data in the process. The main task of surveillance focuses on analyzing these routinely collected telemetry data to help assess the working condition of the equipment. However, with the rapid increase in the volume of telemetry data, it is a nontrivial task to analyze all the telemetry data and understand the working condition of the equipment without any a priori knowledge. In this paper, we propose a probabilistic generative model called the working condition model (WCM), which is capable of simulating the process by which event sequence data are generated and of depicting the working condition of equipment at runtime. With the help of the WCM, we are able to analyze how event sequence data behave in different working modes and, at the same time, to detect the working mode of an event sequence (working condition diagnosis). Furthermore, we have applied the WCM to illustrative applications such as automated detection of anomalous event sequences during equipment runtime. Our experimental results on real data sets demonstrate the effectiveness of the model.
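
A toy analogue of the working-condition-diagnosis step, assuming each mode is simply a categorical distribution over event types; the real WCM is a richer generative model, and the modes and probabilities here are invented.

```python
import math

modes = {
    "normal":   {"start": 0.5, "run": 0.45, "fault": 0.05},
    "degraded": {"start": 0.3, "run": 0.4,  "fault": 0.3},
}

def log_likelihood(sequence, event_probs):
    # Log-probability of the observed events under one mode's distribution
    return sum(math.log(event_probs[e]) for e in sequence)

def diagnose(sequence):
    """Return the working mode that best explains the event sequence."""
    return max(modes, key=lambda m: log_likelihood(sequence, modes[m]))

seq = ["start", "run", "fault", "fault", "run"]
mode = diagnose(seq)  # sequences rich in faults favour "degraded"
```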


2014 ◽  
Vol 14 (23) ◽  
pp. 12613-12629 ◽  
Author(s):  
P. Eriksson ◽  
B. Rydberg ◽  
H. Sagawa ◽  
M. S. Johnston ◽  
Y. Kasai

Abstract. Retrievals of cloud ice mass and humidity from the Superconducting Submillimeter-Wave Limb-Emission Sounder (SMILES) and the Odin-SMR (Sub-Millimetre Radiometer) limb sounder are presented and example applications of the data are given. SMILES data give an unprecedented view of the diurnal variation of cloud ice mass. Mean regional diurnal cycles are reported and compared to some global climate models. Some improvements in the models regarding diurnal timing and relative amplitude were noted, but the models' mean ice mass around 250 hPa is still low compared to the observations. The influence of the ENSO (El Niño–Southern Oscillation) state on the upper troposphere is demonstrated using 12 years of Odin-SMR data. The same retrieval scheme is applied to both sensors, and gives low systematic differences between the two data sets. A special feature of this Bayesian retrieval scheme, of Monte Carlo integration type, is that values are produced for all measurements, but for some atmospheric states the retrieved values only reflect a priori assumptions. However, this "all-weather" capability allows a direct statistical comparison to model data, in contrast to many other satellite data sets. Another strength of the retrievals is the detailed treatment of "beam filling" that otherwise would cause large systematic biases for these passive cloud ice mass retrievals. The main retrieval inputs are spectra around 635/525 GHz from tangent altitudes below 8/9 km for SMILES/Odin-SMR, respectively. For both sensors, the data cover the upper troposphere between 30° S and 30° N. Humidity is reported as both relative humidity and volume mixing ratio. The vertical coverage of SMILES is restricted to a single layer, while Odin-SMR gives some profiling capability between 300 and 150 hPa. Ice mass is given as the partial ice water path above 260 hPa, but for Odin-SMR, ice water content estimates are also provided.
Apart from a smaller contrast between the driest and wettest cases, the agreement with Aura MLS (Microwave Limb Sounder) humidity data is good. In terms of tropical mean humidity, all three data sets agree within 3.5 %RHi. Mean ice mass is about a factor of 2 lower compared to CloudSat. This deviation is caused by the fact that different particle size distributions are assumed, combined with saturation and a priori influences in the SMILES and Odin-SMR data.
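
A schematic of a Bayesian retrieval of Monte Carlo integration type, with a toy linear forward model standing in for the real radiative transfer; all numbers are invented. It also shows how weak measurement constraints let the result relax toward the a priori, as noted above.

```python
import math, random

random.seed(1)

def forward(iwp):
    # Toy forward model: brightness-temperature depression vs ice water path
    return 100.0 - 5.0 * iwp

measured, sigma = 80.0, 2.0       # "observed" radiance and its noise

# Draw candidate states from the a priori (here uniform), weight each by
# the Gaussian likelihood of the measurement, and report the weighted mean.
prior = [random.uniform(0.0, 8.0) for _ in range(5000)]
weights = [math.exp(-0.5 * ((forward(s) - measured) / sigma) ** 2)
           for s in prior]
retrieved = sum(w * s for w, s in zip(weights, prior)) / sum(weights)
# With little signal the weights flatten and `retrieved` relaxes toward
# the prior mean, mirroring the a priori influence described above.
```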


2015 ◽  
Vol 24 (07) ◽  
pp. 1550050 ◽  
Author(s):  
E. Matsinos ◽  
G. Rasche

In a previous paper, we reported the results of a partial-wave analysis (PWA) of the pion–nucleon (πN) differential cross-sections (DCSs) of the CHAOS Collaboration and came to the conclusion that the angular distribution of their π+p data sets is incompatible with the rest of the modern (meson factory) database. The present work, re-addressing this issue, has been instigated by a number of recent improvements in our analysis, namely regarding the inclusion of the theoretical uncertainties when investigating the reproduction of experimental data sets on the basis of a given "theoretical" solution, modifications in the parametrization of the form factors of the proton and of the pion entering the electromagnetic part of the πN amplitude, and the inclusion of the effects of the variation of the σ-meson mass when fitting the ETH model of the πN interaction to the experimental data. The new analysis of the CHAOS DCSs confirms our earlier conclusions and casts doubt on the value for the πN Σ term, which Stahov, Clement and Wagner have extracted from these data.


2017 ◽  
Vol 24 (3) ◽  
pp. 543-551 ◽  
Author(s):  
Vladimir Y. Zaitsev ◽  
Andrey V. Radostin ◽  
Elena Pasternak ◽  
Arcady Dyskin

Abstract. We examine experimental data on the non-linear elasticity of rocks, using experimentally determined pressure dependences of P- and S-wave velocities from various literature sources. Overall, over 90 rock samples are considered. The data are interpreted using an effective-medium description in which cracks are treated as compliant defects with explicitly introduced shear and normal compliances, without specifying a particular crack model with an a priori given ratio of the compliances. Comparison with the experimental data indicates an abundance (∼ 80 %) of cracks with normal-to-shear compliance ratios that significantly exceed the values typical of conventionally used crack models (such as penny-shaped cuts or thin ellipsoidal cracks). Correspondingly, rocks with such cracks demonstrate a strongly decreased Poisson ratio, including a significant (∼ 45 %) portion of rocks exhibiting negative Poisson ratios at lower pressures, at which the concentration of not-yet-closed cracks is maximal. These results indicate the need for further development of crack models to account for the revealed numerous examples of cracks with strong domination of normal compliance. The discovery of such a significant number of naturally auxetic rocks contrasts with the conventional viewpoint that a negative Poisson ratio is an exotic property mostly discussed for artificial structures.
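
The dynamic Poisson ratio in such analyses follows directly from the measured P- and S-wave velocities of an isotropic medium; a short sketch with illustrative velocities shows the auxetic regime Vp/Vs < sqrt(2).

```python
def poisson_from_velocities(vp, vs):
    # Dynamic Poisson ratio of an isotropic medium:
    #   nu = (Vp^2 - 2*Vs^2) / (2*(Vp^2 - Vs^2))
    # nu turns negative whenever Vp/Vs < sqrt(2).
    return (vp ** 2 - 2 * vs ** 2) / (2 * (vp ** 2 - vs ** 2))

# Illustrative velocities in km/s (not from the paper's data sets):
nu_typical = poisson_from_velocities(4.5, 2.6)   # ordinary rock, nu > 0
nu_auxetic = poisson_from_velocities(3.0, 2.2)   # Vp/Vs < sqrt(2), nu < 0
```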

