Reverse engineering and identification in systems biology: strategies, perspectives and challenges

2014
Vol 11 (91)
pp. 20130505
Author(s):  
Alejandro F. Villaverde ◽  
Julio R. Banga

The interplay of mathematical modelling with experiments is one of the central elements in systems biology. The aim of reverse engineering is to infer, analyse and understand, through this interplay, the functional and regulatory mechanisms of biological systems. Reverse engineering is not exclusive to systems biology and has been studied in different areas, such as inverse problem theory, machine learning, nonlinear physics, (bio)chemical kinetics, control theory and optimization, among others. However, it seems that many of these areas have remained relatively closed to outsiders. In this contribution, we aim to compare and highlight the different perspectives and contributions of these fields, with emphasis on two key questions: (i) why are reverse engineering problems so hard to solve, and (ii) what methods are available for the particular problems arising in systems biology?

2008
Vol 5 (suppl_1)
Author(s):  
Moisés Santillán ◽  
Michael C Mackey

In this paper, the history and importance of the lac operon in the development of molecular and systems biology are briefly reviewed. We start by presenting a description of the regulatory mechanisms in this operon, taking into account the most recent discoveries. Then we offer a survey of the history of the lac operon, including the discovery of its main elements and the subsequent influence on the development of molecular and systems biology. Next the bistable behaviour of the operon is discussed, both with respect to its discovery and its molecular origin. A review of the literature in which this bistable phenomenon has been studied from a mathematical modelling viewpoint is then given. We conclude with some brief remarks.
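The bistable behaviour reviewed above is often illustrated with a minimal one-variable positive-feedback model. The sketch below is a generic toy model with made-up parameters (not any specific model from the literature surveyed): it integrates a Hill-type ODE from two initial conditions and recovers the two stable steady states that define bistability.

```python
# Minimal bistable positive-feedback model (illustrative toy, not the
# lac operon models reviewed in the paper):
#   dx/dt = beta * x**2 / (K + x**2) - gamma * x
# With beta=4, gamma=1, K=1 the stable states are x = 0 and x = 2 + sqrt(3).

def integrate(x0, beta=4.0, gamma=1.0, K=1.0, dt=0.01, t_end=30.0):
    """Forward-Euler integration of the toy induction model."""
    x = x0
    for _ in range(int(t_end / dt)):
        dx = beta * x**2 / (K + x**2) - gamma * x
        x += dt * dx
    return x

low = integrate(0.1)   # below the unstable threshold (~0.27): decays to 0
high = integrate(1.0)  # above the threshold: switches to the induced state
```

Depending on the initial condition, the same parameter set yields two different long-term states, which is the hallmark of bistability discussed in the modelling literature on the operon.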


Fuels
2021
Vol 2 (3)
pp. 286-303
Author(s):  
Vuong Van Pham ◽  
Ebrahim Fathi ◽  
Fatemeh Belyadi

The success of machine learning (ML) techniques implemented in different industries relies heavily on operator expertise and domain knowledge, which are used to manually choose an algorithm and set its specific parameters for a problem. Due to the manual nature of model selection and parameter tuning, it is impossible to quantify or evaluate the quality of this process, which in turn limits the ability to perform comparison studies between different algorithms. In this study, we propose a new hybrid approach for developing machine learning workflows that helps automate algorithm selection and hyperparameter optimization. The proposed approach provides a robust, reproducible, and unbiased workflow that can be quantified and validated using different scoring metrics. We have used the most common workflows implemented in applications of artificial intelligence (AI) and ML to engineering problems, including grid/random search, Bayesian search and optimization, and genetic programming, and compared them with our new hybrid approach, which integrates the Tree-based Pipeline Optimization Tool (TPOT) with Bayesian optimization. The performance of each workflow is quantified using different scoring metrics, such as Pearson correlation (R2) and mean squared error (MSE). For this purpose, actual field data obtained from 1567 gas wells in the Marcellus Shale, with 121 features covering reservoir, drilling, completion, stimulation, and operation, were tested using the different proposed workflows. The proposed hybrid workflow was then used to evaluate the type well for Marcellus shale gas production. In conclusion, our automated hybrid approach showed significant improvement over the other workflows on both scoring metrics.
The new hybrid approach provides a practical tool that supports automated model and hyperparameter selection; it was tested using real field data and can be applied to different engineering problems using artificial intelligence and machine learning. The new hybrid model was tested on a real field and compared with conventional type wells developed by field engineers. The type well of the field is found to be very close to the P50 prediction of the field, which indicates great success in the completion design performed by the field engineers. It also shows that the field's average production could have been improved by 8% if shorter cluster spacing and higher proppant loading per cluster had been used during the frac jobs.
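Two of the baseline workflows the study compares, grid search and random search, can be contrasted in miniature. The sketch below uses an invented stand-in objective in place of a real cross-validation score, and the parameter names are illustrative only; it shows the exhaustive-vs-sampled trade-off that more sophisticated tools like TPOT and Bayesian optimization improve upon.

```python
import itertools
import random

def score(params):
    """Stand-in for a model's cross-validation score (toy function,
    peaking at depth=6, lr=0.1; not real field data)."""
    return -((params["depth"] - 6) ** 2 + (10 * (params["lr"] - 0.1)) ** 2)

grid = {"depth": [2, 4, 6, 8], "lr": [0.01, 0.1, 0.3]}

# Grid search: exhaustively score every parameter combination.
grid_best = max(
    (dict(zip(grid, combo)) for combo in itertools.product(*grid.values())),
    key=score,
)

# Random search: sample the same space at random with a fixed budget.
random.seed(0)
random_best = max(
    ({"depth": random.choice(grid["depth"]), "lr": random.choice(grid["lr"])}
     for _ in range(20)),
    key=score,
)
```

Grid search guarantees finding the best point on the grid at exponential cost in the number of parameters; random search trades that guarantee for a fixed evaluation budget, which is one reason automated hybrid strategies are attractive.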


Energies
2021
Vol 14 (4)
pp. 930
Author(s):  
Fahimeh Hadavimoghaddam ◽  
Mehdi Ostadhassan ◽  
Ehsan Heidaryan ◽  
Mohammad Ali Sadri ◽  
Inna Chapanova ◽  
...  

Dead oil viscosity is a critical parameter for solving numerous reservoir engineering problems and one of the most unreliable properties to predict with classical black oil correlations. Determining dead oil viscosity by experiment is expensive and time-consuming, so an accurate and quick prediction model is required. This paper implements six machine learning models to predict dead oil viscosity: random forest (RF), LightGBM, XGBoost, a multilayer perceptron (MLP) neural network, stochastic real-valued (SRV) and SuperLearner. More than 2000 pressure–volume–temperature (PVT) data points were used for developing and testing these models, covering a huge range of viscosities, from light and intermediate to heavy oil. In this study, we give insight into the performance of the different functional forms that have been used in the literature to formulate dead oil viscosity. The results show that the functional form f(γAPI,T) has the best performance, and that additional correlating parameters may be unnecessary. Furthermore, SuperLearner outperformed the other machine learning (ML) algorithms as well as common correlations, based on the metric analysis. The SuperLearner model can potentially replace empirical models for viscosity prediction over a wide range of viscosities (any oil type). Ultimately, the proposed model is capable of reproducing the true physical trend of dead oil viscosity with variations in oil API gravity, temperature and shear rate.
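For context on the f(γAPI,T) functional form, the classical Beggs–Robinson dead oil correlation has exactly that two-parameter shape. The sketch below implements it as commonly stated in the literature, purely to illustrate the dependence; it belongs to the family of "common correlations" the paper benchmarks against, and is not the SuperLearner model itself.

```python
def dead_oil_viscosity_beggs_robinson(api_gravity, temp_f):
    """Beggs-Robinson dead oil viscosity correlation (cp), a classical
    example of the f(gamma_API, T) functional form.

    api_gravity: oil gravity in degrees API; temp_f: temperature in deg F.
    """
    z = 3.0324 - 0.02023 * api_gravity
    x = (10 ** z) * temp_f ** -1.163
    return 10 ** x - 1.0

# Lighter oil (higher API) and higher temperature both lower viscosity,
# the physical trend the paper's model is expected to reproduce.
mu_heavy = dead_oil_viscosity_beggs_robinson(20.0, 120.0)
mu_light = dead_oil_viscosity_beggs_robinson(40.0, 120.0)
mu_hot = dead_oil_viscosity_beggs_robinson(20.0, 200.0)
```

The monotone decrease of viscosity with both API gravity and temperature is the "true physical trend" referred to in the abstract.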


Author(s):  
Ishtiaque Ahammad

Axon guidance is a crucial process in the growth of the central and peripheral nervous systems. In this study, three axon guidance related disorders, namely Duane Retraction Syndrome (DRS), Horizontal Gaze Palsy with Progressive Scoliosis (HGPPS) and Congenital fibrosis of the extraocular muscles type 3 (CFEOM3), were studied using various systems biology tools to identify the genes and proteins involved and to better understand the underlying molecular mechanisms, including the regulatory mechanisms. Based on the analyses carried out, seven significant modules were identified in the PPI network. Five pathways/processes were found to be significantly associated with the DRS-, HGPPS- and CFEOM3-associated genes. From the PPI network, three hub proteins were identified: DRD2, UBC and CUL3.
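Hub proteins in a PPI network are typically identified as the most highly connected nodes. The sketch below illustrates this degree-centrality idea on a small invented adjacency list; the graph is hypothetical (it only reuses the three hub names for flavour) and is not the study's actual network.

```python
from collections import Counter

# Toy protein-protein interaction network as an adjacency list.
# Invented for illustration; NOT the DRS/HGPPS/CFEOM3 network
# analysed in the study. "P1".."P3" are placeholder proteins.
ppi = {
    "UBC":  ["DRD2", "CUL3", "P1", "P2", "P3"],
    "DRD2": ["UBC", "CUL3", "P1"],
    "CUL3": ["UBC", "DRD2", "P2"],
    "P1":   ["UBC", "DRD2"],
    "P2":   ["UBC", "CUL3"],
    "P3":   ["UBC"],
}

# Degree centrality: hubs are the nodes with the most interaction partners.
degree = Counter({protein: len(partners) for protein, partners in ppi.items()})
hubs = [protein for protein, _ in degree.most_common(3)]
```

Real analyses usually combine degree with other centrality measures and module detection, but high degree is the standard first criterion for calling a protein a hub.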


Author(s):  
Nikolay I. Kol'tsov

A simple, effective method is described for solving the inverse problem of chemical kinetics from non-stationary experiments on multistage reactions occurring in an isothermal ideal-mixing reactor. The idea of the method is to exploit the distinctive features (informativeness) of different fragments of the relaxation curves of chemical reactions with arbitrary (non-monotonic) kinetics, and to approximate those fragments as accurately as possible. For this purpose, non-linear (cubic) splines are used to describe the different informative fragments of the relaxation curves, which makes it possible to approximate and interpolate the experimental data as accurately as possible. An additional advantage of cubic splines, from the point of view of implementing the described method, is their continuity (smoothness) at all given points up to and including the second derivative. This allows not only the concentrations of the reagents but also their instantaneous rates of change to be calculated with good accuracy at any time, which in turn makes a sufficiently accurate solution of the inverse problem from non-stationary experimental data possible. The correctness of the mathematical model used and the stability of the method were tested by varying the original data. An example of using the method to determine the intervals of physical values of the rate constants of the stages of a two-stage reaction is given. The influence of the choice of spline reference points (structure) and of measurement errors (noise) in the experimental data on the error in determining the rate constants of the stages is estimated. The efficiency and good accuracy of the method for solving the inverse problem of chemical kinetics of multistage reactions occurring in non-gradient systems are demonstrated, with noise taken into account.
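The core inversion step can be sketched for a two-stage reaction A → B → C. In the sketch below, the spline-smoothed experimental curves are replaced by exact analytic concentrations and rates (an assumption made for self-containment; rate constants and sample times are invented), and the rate constants are recovered point by point from the kinetic equations, which is the essence of the inverse problem described above.

```python
import math

# Two-stage first-order reaction A -k1-> B -k2-> C (illustrative sketch;
# in the paper the concentrations and their instantaneous rates would
# come from cubic splines fitted to noisy relaxation curves).
k1_true, k2_true, a0 = 0.8, 0.3, 1.0

def concentrations(t):
    """Analytic A(t), B(t) for the consecutive reaction."""
    a = a0 * math.exp(-k1_true * t)
    b = a0 * k1_true / (k2_true - k1_true) * (
        math.exp(-k1_true * t) - math.exp(-k2_true * t))
    return a, b

def rates(t):
    """Instantaneous rates dA/dt, dB/dt (spline derivatives in practice)."""
    a, b = concentrations(t)
    return -k1_true * a, k1_true * a - k2_true * b

# Invert the kinetic equations at each sample time:
#   dA/dt = -k1*A        =>  k1 = -A'/A
#   dB/dt = k1*A - k2*B  =>  k2 = (k1*A - B') / B
samples = [0.5, 1.0, 2.0, 4.0]
k1_est = sum(-rates(t)[0] / concentrations(t)[0] for t in samples) / len(samples)
k2_est = sum((k1_est * concentrations(t)[0] - rates(t)[1]) / concentrations(t)[1]
             for t in samples) / len(samples)
```

With noisy splined data the point-wise estimates would scatter, which is why the method averages over informative fragments of the curves and reports intervals for the rate constants rather than single values.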


2021
Author(s):  
Tasnuva Farheen ◽  
Ulbert Botero ◽  
Nitin Varshney ◽  
Damon L. Woodard ◽  
Mark Tehranipoor ◽  
...  

IC camouflaging has been proposed as a promising countermeasure against malicious reverse engineering. Camouflaged gates contain multiple functional device structures but appear as a single layout under microscope imaging, thereby hiding the real circuit functionality from adversaries. The recent covert gate camouflaging design comes with a significantly reduced overhead cost, allowing numerous camouflaged gates in a circuit and thus resilience against various invasive and semi-invasive attacks. Dummy inputs are used in the design, but in prior work SEM imaging analysis was only performed on simplified dummy contact structures. Whether the e-beam used in SEM imaging charges different contacts differently, and thereby reveals their underlying structures, requires further research. In this study, we fabricated real and dummy contacts in various structures and performed a systematic SEM imaging analysis to investigate possible charging and the consequent passive voltage contrast on the contacts. In addition, machine-learning-based pattern recognition was employed to examine the possibility of differentiating real and dummy contacts. Based on our experimental results, we found that the difference between real and dummy contacts is insignificant in SEM imaging, which effectively prevents adversarial SEM-based reverse engineering.

Index Terms: Reverse Engineering, IC Camouflaging, Scanning Electron Microscopy, Machine Learning, Countermeasure.
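The pattern-recognition step can be pictured as a simple classifier over a per-contact image feature. The sketch below uses synthetic intensity values and a hypothetical mean-brightness feature (not the authors' actual pipeline): a nearest-centroid classifier is trained on labelled contacts, and when the real and dummy intensity distributions overlap, as the study found, the centroids coincide and classification degenerates to guessing.

```python
# Toy nearest-centroid classifier over one per-contact feature
# (mean SEM pixel intensity). All numbers are synthetic; the study's
# actual finding is that real/dummy intensity distributions overlap,
# defeating exactly this kind of attack.
real_train = [118, 121, 119, 122]    # labelled real contacts
dummy_train = [120, 118, 123, 119]   # labelled dummy contacts

centroid_real = sum(real_train) / len(real_train)
centroid_dummy = sum(dummy_train) / len(dummy_train)

def classify(intensity):
    """Assign the class whose training centroid is nearer (ties -> real)."""
    return ("real" if abs(intensity - centroid_real)
            <= abs(intensity - centroid_dummy) else "dummy")
```

Here both centroids come out identical, so every contact lands the same distance from each class: a concrete picture of why an insignificant intensity difference leaves the attacker at chance level.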

