Low-cost arabinose-induced genetic circuit for protein production: Modeling.

2022 ◽  
Author(s):  
Christian Jesus Flores Gomez ◽  
Edgar Valeria de la Cruz ◽  
Jorge Luis Garcia Barrera

A low-cost, reagent-producing genetic circuit was designed in this work. Its functioning is based on a positive feedback loop induced by a small amount of arabinose, allowing users to obtain reagents in a safe, constant, and controlled manner. The design-only approach of the project allowed us to work with different kinds of computational models: an ODE-based model was developed thoroughly, and a cellular automata-based model was explored. Using the ODE model, the equilibrium states and stability of the system were studied. Circuit properties were also examined, one of which is the high concentration of the protein of interest produced from low inducer inputs. As a result, a mathematical expression describing the quantity of reagent produced was obtained. In addition, the cellular automata model offers a new perspective given its differences from the ODE model; e.g., it is stochastic and describes each cell individually instead of describing the cellular population as a whole.
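
The switch-like behavior such an inducer-triggered positive feedback loop produces can be illustrated with a minimal ODE sketch: a transient arabinose pulse flips protein expression from a low basal state to a high self-sustaining state. All parameter values (basal rate, Hill coefficient, degradation rate, pulse size) are illustrative assumptions, not values from the paper.

```python
# Euler integration of dP/dt = basal + u(t) + alpha*P^n/(K^n + P^n) - delta*P,
# where u(t) is a transient inducer input and the Hill term is the
# positive feedback. Parameters are illustrative, not fitted.

def simulate(pulse_rate, pulse_end=10.0, t_end=300.0, dt=0.01):
    basal, alpha, K, n, delta = 0.01, 1.0, 0.5, 2, 1.0
    P, t = 0.0, 0.0
    while t < t_end:
        hill = alpha * P**n / (K**n + P**n)       # positive feedback term
        u = pulse_rate if t < pulse_end else 0.0  # transient arabinose input
        P += (basal + u + hill - delta * P) * dt
        t += dt
    return P

off = simulate(pulse_rate=0.0)  # no induction: settles near the basal level
on = simulate(pulse_rate=0.5)   # small pulse: latches onto the high state
```

With these parameters the system is bistable, so the final protein level stays high long after the inducer pulse ends; this is the property that lets a small inducer input yield a high concentration of the protein of interest.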

Chemosphere ◽  
2021 ◽  
Vol 274 ◽  
pp. 129689
Author(s):  
Jianpei Feng ◽  
Xiaolei Zhang ◽  
Guan Zhang ◽  
Ji Li ◽  
Wei Song ◽  
...  

2021 ◽  
pp. 096739112110245
Author(s):  
Amrita Sharma ◽  
PP Pande

It has been observed that acrylate monomers are very difficult to polymerize with the low-cost nitroxide mediator 2,2,6,6-tetramethylpiperidinyl-1-oxyl (TEMPO). Costly acyclic nitroxides such as N-tert-butyl-N-(1-diethylphosphono-2,2-dimethylpropyl) nitroxide (SG1), 2,2,5-trimethyl-4-phenyl-3-azahexane-3-nitroxide (TIPNO), and TIPNO derivatives therefore have to be used for the polymerization of acrylic acid derivatives. There are very few reports on the use of TEMPO derivatives for the polymerization of n-butyl acrylate. Generally, various reducing agents (glucose, ascorbic acid, hydroxyacetone, etc.) have been used to destroy excess TEMPO during the polymerization reaction. Acrylate polymerizations fail in the presence of TEMPO because of the strong C–O bond formed between the acrylate chain end and the nitroxide. To the best of our knowledge, no literature report is available on the use of TEMPO, without reducing agents or high-temperature initiators, for the polymerization of n-butyl acrylate. The present study re-examines the application of the low-cost nitroxide TEMPO so that it can be utilized for the polymerization of acrylate monomers (e.g. n-butyl acrylate). We have been able to polymerize n-butyl acrylate using TEMPO via a macroinitiator: a polystyrene macroinitiator was first synthesized from TEMPO, and this TEMPO end-capped polystyrene macroinitiator (PSt-TEMPO) was then used to polymerize n-butyl acrylate monomer. The amount of macroinitiator was varied from 0.05% to 50% by weight of n-butyl acrylate monomer, and the polymerization was carried out in bulk at 120°C. The experimental findings showed a gradual increase in the molecular weight of the polymer formed and a decrease in the polydispersity index (PDI) with increasing amount of PSt-TEMPO macroinitiator. In all experiments the conversion was more than 80%.
These results indicate that the polymerization proceeds through a controlled process. The effect of different solvents on the polymerization has also been investigated. In these experiments the TEMPO-capped polystyrene macroinitiator led to the successful synthesis of poly(n-butyl acrylate), and it proved highly efficient for nitroxide-mediated polymerization even at very small concentrations. A high concentration of macroinitiator results in the formation of block copolymers of polystyrene and poly(n-butyl acrylate), i.e. polystyrene-block-poly(n-butyl acrylate). The use of TEMPO for controlled polymerization is of much importance because it is the commercially available nitroxide with the lowest cost.


2017 ◽  
Vol 11 (7) ◽  
pp. 1 ◽  
Author(s):  
Yi-Jian Liu ◽  
Jian Cao ◽  
Xiao-Yan Cao ◽  
Yuan-Biao Zhang

As an important topic in traffic control science, the design of toll plazas has attracted increasing attention from scholars and society. A good toll plaza design must satisfy several criteria, such as a high safety coefficient, high throughput, and low cost. In this study, we established an evaluation model of toll plazas based on cellular automata and M/M/C queuing theory, covering three aspects: safety coefficient, throughput, and cost. We then took the Asbury Park Toll Plaza in New Jersey as an example, analyzed its performance, and further optimized its design. Compared with the original design, the optimized toll plaza proved to be safer and preferable. Finally, we analyzed the robustness of the designed toll plaza, showing that it would perform well in practice.
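
The M/M/C queuing component of such an evaluation model can be sketched with the standard Erlang-C formulas: given an arrival rate, a per-booth service rate, and c booths, they yield the probability of queueing and the mean waiting time. The numeric rates below are invented for illustration, not taken from the paper.

```python
# Mean waiting time in an M/M/c queue via the Erlang-C formula.
# lam = arrival rate, mu = per-server (tollbooth) service rate, c = servers.
# Stability requires lam < c * mu.

from math import factorial

def mmc_wait(lam, mu, c):
    rho = lam / (c * mu)          # server utilization
    a = lam / mu                  # offered load in Erlangs
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    erlang_c = (a**c / (factorial(c) * (1 - rho))) * p0  # P(arrival waits)
    return erlang_c / (c * mu - lam)                     # mean time in queue

# Illustrative plaza: 1.5 vehicles/s arriving, each booth serving 0.25 veh/s.
wait_8_booths = mmc_wait(1.5, 0.25, 8)
wait_10_booths = mmc_wait(1.5, 0.25, 10)
```

Sweeping c in this way is how a queuing model trades throughput against cost: each extra booth reduces the mean wait but adds construction and staffing expense.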


2018 ◽  
Author(s):  
Yi-Jie Zhao ◽  
Tianye Ma ◽  
Xuemei Ran ◽  
Li Zhang ◽  
Ru-Yuan Zhang ◽  
...  

Abstract Schizophrenia patients are known to have profound deficits in visual working memory (VWM), and almost all previous studies attribute these deficits to decreased memory capacity. This account, however, ignores the potential contributions of other VWM components (e.g., memory precision). Here, we measure the VWM performance of schizophrenia patients and healthy control subjects on two classical delay-estimation tasks. Moreover, we thoroughly evaluate several established computational models of VWM to compare the performance of the two groups. We find that the model assuming variable precision across items and trials best explains the performance of both groups. According to the variable-precision model, schizophrenia subjects exhibit abnormally large variability in allocating memory resources, rather than reduced resources per se. These results invite a rethink of the widely accepted decreased-capacity theory and propose a new perspective on the diagnosis and rehabilitation of schizophrenia.
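
The variable-precision idea can be illustrated with a small simulation: each trial's memory precision is itself a random draw, and larger precision variability broadens the distribution of recall errors even when mean precision is matched. The gamma parameterization and the identification of precision with the von Mises concentration are common modeling conventions adopted here as assumptions; the values are not the paper's fitted parameters.

```python
# Simulate delayed-estimation recall errors under a variable-precision model:
# trial-wise precision J ~ Gamma(mean_j/scale, scale) (mean fixed at mean_j),
# and the recall error is von Mises distributed with concentration J.

import math
import random

def simulate_errors(mean_j, scale, n=20000, seed=0):
    rng = random.Random(seed)
    errors = []
    for _ in range(n):
        j = rng.gammavariate(mean_j / scale, scale)  # trial-wise precision
        errors.append(rng.vonmisesvariate(0.0, j))   # error around true value
    return errors

def circ_sd(errors):
    """Circular standard deviation of angular errors (radians)."""
    c = sum(math.cos(e) for e in errors) / len(errors)
    s = sum(math.sin(e) for e in errors) / len(errors)
    return math.sqrt(-2.0 * math.log(math.hypot(c, s)))

low_var = circ_sd(simulate_errors(mean_j=4.0, scale=1.0))   # tight allocation
high_var = circ_sd(simulate_errors(mean_j=4.0, scale=4.0))  # variable allocation
```

Both groups have the same mean precision, yet the high-variability group shows a larger circular spread of errors: this is the signature that distinguishes increased allocation variability from a simple loss of resources.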


2021 ◽  
Author(s):  
Valeria Lupiano ◽  
Claudia Calidonna ◽  
Paolo Catelan ◽  
Francesco Chidichimo ◽  
Gino Mirocle Crisci ◽  
...  

<p>Lahars are among the world's most destructive natural phenomena in terms of casualties (Manville et al., 2013). They originate as mixtures of water and volcanic deposits, frequently triggered by heavy rainfall; they are erosive floods that can grow along their path to more than 10 times their initial volume, moving at up to 100 km/h on steep slopes and traveling extreme distances of hundreds of kilometers.</p><p>Besides early-warning tools, security measures have been adopted in volcanic territories by constructing retaining dams and embankments at key positions to contain and deflect possible lahars (Leung et al., 2003). This solution can have a strong environmental impact, both from the works themselves and from the continuous accumulation of volcanic deposits, such that equilibrium conditions may never be reached, potentially triggering more disastrous events.</p><p>The growing frequency of lahars in the Vascún Valley area of Tungurahua Volcano, Ecuador, possibly due to climate change, has recently produced smaller (shorter accumulation periods) and therefore less dangerous events.</p><p>Momentary ponds form along rivers in volcanic areas when the rivers are blocked by landslides of volcanic deposits originating from pyroclastic flows and lahars. The most frequent cause of a breakout of such natural ponds is the overflow of water across the newly formed dam, with subsequent erosion and rapid downcutting into the loose rock debris.</p><p>Dam collapse can occur by sliding or overturning of the volcanic deposit. By eroding the blockage and flowing down the river channel, the initial surge of water incorporates a dangerous volume of sediments. This produces lahars with potentially devastating effects on settlements in their path (Leung et al., 2003).</p><p>Simulation tools (based on the cellular automata model LLUNPIY) and field data (including the necessary subsoil surveys) make it possible to identify points where easily collapsible backfill dams can produce momentary ponds.</p><p>Small temporary dams with behavior similar to (but controlled, unlike) the natural dams mentioned above can be designed and built at low cost from local backfills, allowing the outflow of streams produced by regular rainfall events. This is achieved by properly dimensioning a discharge channel at the dam base (Lupiano et al., 2020).</p><p>In this way, small lahars can be triggered by minor rainfall events, and lahar detachment can be anticipated for major events, avoiding simultaneous confluence with other lahars (Lupiano et al., 2020).</p><p><strong>REFERENCES</strong></p><p>Leung, M.F., Santos, J.R., Haimes, Y.Y. (2003). Risk modeling, assessment, and management of lahar flow threat. Risk Analysis, 23(6), 1323-1335.</p><p>Lupiano, V., Chidichimo, F., Machado, G., Catelan, P., Molina, L., Calidonna, C.R., Straface, S., Crisci, G.M., and Di Gregorio, S. (2020). From examination of natural events to a proposal for risk mitigation of lahars by a cellular-automata methodology: a case study for Vascún valley, Ecuador. Nat. Hazards Earth Syst. Sci., 20, 1-20.</p><p>Manville, V., Major, J.J., and Fagents, S.A. (2013). Modeling lahar behavior and hazards. In Fagents, S.A., Gregg, T.K.P., and Lopes, R.M.C. (eds.) Modeling Volcanic Processes: The Physics and Mathematics of Volcanism. Cambridge: Cambridge University Press, pp. 300-330.</p>


2021 ◽  
Author(s):  
Rodrigo Rivera Martinez ◽  
Diego Santaren ◽  
Olivier Laurent ◽  
Ford Cropley ◽  
Cecile Mallet ◽  
...  

<p>Deploying a dense network of sensors around emitting industrial facilities makes it possible to detect and quantify possible CH<sub>4</sub> leaks and to monitor the emissions continuously. Designing such a monitoring network with highly precise instruments is limited by their elevated cost, power consumption, and maintenance requirements. Low-cost, low-power metal oxide sensors could be a practical alternative, allowing such a network to be deployed at a fraction of the cost with measurement quality satisfactory for these applications.</p><p>Recent studies have tested metal oxide sensors (MO<sub>x</sub>) under natural and controlled conditions for measuring atmospheric methane concentrations and showed fair agreement with high-precision instruments such as Cavity Ring-Down Spectrometers (CRDS). Such results open perspectives on the potential of MO<sub>x</sub> sensors as an alternative for measuring and quantifying CH<sub>4</sub> emissions at industrial facilities. However, these sensors are known to drift with time, to be highly sensitive to the water vapor mole fraction, to have poor selectivity with several known cross-sensitivities to other species, and to show significant sensitivity to environmental factors such as temperature and pressure. Different approaches have been employed to derive CH<sub>4</sub> mole fractions from the MO<sub>x</sub> signal and ancillary parameter measurements, from traditional approaches such as linear or multilinear regression to machine learning (ANN, SVM, or random forest).</p><p>Most studies have focused on deriving ambient CH<sub>4</sub> concentrations under different conditions, but few tests have assessed the ability of these sensors to capture CH<sub>4</sub> variations at high frequency, with peaks of elevated concentration, which corresponds well to the signal observed from point sources at industrial sites with leaks and isolated methane emissions. We conducted a continuous controlled experiment over four months (November 2019 to February 2020) in which three types of MO<sub>x</sub> sensors from Figaro® measured high-frequency CH<sub>4</sub> peaks with concentrations varying between atmospheric background levels and 24 ppm at LSCE, Saclay, France. We developed a calibration strategy including a two-step baseline correction and compared different approaches to reconstruct CH<sub>4</sub> spikes, including linear, multilinear, and polynomial regression as well as ANN and random forest algorithms. We found that baseline correction in the pre-processing stage improved the reconstruction of CH<sub>4</sub> concentrations in the spikes. The random forest models performed better than the other methods, achieving a mean RMSE of 0.25 ppm when reconstructing peak amplitudes over windows of 4 days. In addition, we conducted tests to determine the minimum amount of data required to train successful models for predicting CH<sub>4</sub> spikes, and the frequency of re-calibration/re-training needed under these controlled circumstances. We concluded that for a target RMSE ≤ 0.3 ppm at a measurement frequency of 5 s, 4 days of training data are required, and re-calibration/re-training is recommended every 30 days.</p><p>Our study presents a new approach to processing and reconstructing observations from low-cost CH<sub>4</sub> sensors and highlights their potential for quantifying high-concentration releases at industrial facilities.</p>
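
The baseline-correction idea can be sketched simply: estimate the slowly drifting sensor baseline with a rolling minimum, then express each sample as an enhancement above that baseline so that spikes stand out. The window length and the synthetic drifting signal below are illustrative assumptions; the paper's actual two-step procedure is more elaborate.

```python
# Rolling-minimum baseline estimate: baseline[i] is the minimum over a
# window centered on sample i (clipped at the edges), so slow drift is
# tracked while short spikes are excluded from the baseline.

def rolling_min_baseline(signal, window):
    n = len(signal)
    half = window // 2
    return [min(signal[max(0, i - half):min(n, i + half + 1)])
            for i in range(n)]

# Synthetic example: slow linear drift plus two short CH4-like spikes.
drifting = [2.0 + 0.001 * i for i in range(1000)]
for i in range(400, 410):
    drifting[i] += 10.0  # first spike
for i in range(700, 705):
    drifting[i] += 5.0   # second spike

baseline = rolling_min_baseline(drifting, window=101)
enhancement = [s - b for s, b in zip(drifting, baseline)]
```

After this pre-processing step, a regression or random forest model only has to map enhancement (plus ancillary variables such as temperature and humidity) to CH<sub>4</sub> mole fraction, rather than also absorbing the sensor's long-term drift.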


2021 ◽  
Vol 231 ◽  
pp. 359-366
Author(s):  
Banan Hudaib ◽  
Ali F. Al-Shawabkeh ◽  
Waid Omar ◽  
Habis Al-Zoubi ◽  
Rund Abu-Zurayk

2000 ◽  
Author(s):  
Salvatore Torquato ◽  
Thomas S. Deisboeck

Abstract Intensive medical research over the last fifty years has left the prognosis for patients diagnosed with malignant brain tumors nearly unchanged. This suggests that a new perspective on the problem may offer important insight. We have undertaken an interdisciplinary research program seeking to study brain tumors as complex systems. This research aims to develop computational models, coupled with experimental assays, to investigate the hypothesis of self-organizing behavior in tumor systems. Preliminary assays have revealed behavior consistent with this hypothesis. A cellular-automaton model has been developed to study the growth of the tumor core. This model has proven successful in reproducing macroscopic tumor growth from a limited parameter set. Further, it has been applied to investigate the importance of heterogeneity in determining a clinical prognosis, demonstrating that understanding a tumor's clonal composition is important for an accurate prognosis.
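
A toy cellular-automaton sketch conveys the core mechanism of such growth models: occupied cells divide into empty neighboring sites with some probability, so a roughly circular mass expands from a single seed. The grid size, division probability, and single-cell-type rule are illustrative assumptions; the published model additionally distinguishes proliferative, quiescent, and necrotic regions.

```python
# 2D growth CA: each occupied cell attempts to place a daughter into each
# empty von Neumann neighbor with probability p_divide per step.

import random

def grow(steps, p_divide=0.6, size=61, seed=1):
    rng = random.Random(seed)
    grid = [[0] * size for _ in range(size)]
    grid[size // 2][size // 2] = 1  # single initial malignant cell
    for _ in range(steps):
        births = []
        for y in range(size):
            for x in range(size):
                if grid[y][x]:
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < size and 0 <= nx < size
                                and not grid[ny][nx]
                                and rng.random() < p_divide):
                            births.append((ny, nx))
        for ny, nx in births:       # synchronous update
            grid[ny][nx] = 1
    return sum(map(sum, grid))      # total tumor cell count

early, late = grow(5), grow(15)
```

Because only cells with empty neighbors can divide, growth is confined to the rim while the interior saturates, which is what lets even this minimal rule reproduce the sub-exponential, macroscopically compact growth such models describe.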


Author(s):  
Joseph Brenner

The conjunction of the disciplines of computing and philosophy implies that discussion of computational models and approaches should include explicit statements of their underlying worldview, given the fact that reality includes both computational and non-computational domains. As outlined at ECAP08, both domains of reality can be characterized by the different logics applicable to them. A new “Logic in Reality” (LIR) was proposed as best describing the dynamics of real, non-computable processes. The LIR process view of the real macroscopic world is compared here with recent computational and information-theoretic models. Proposals that the universe can be described as a mathematical structure equivalent to a computer or by simple cellular automata are deflated. A new interpretation of quantum superposition as supporting a concept of paraconsistent parallelism in quantum computing and an appropriate ontological commitment for computational modeling are discussed.

