Smart containers for reagents delivery and release in bio-chemical computational systems

2014 ◽  
Vol 1 (1) ◽  
Author(s):  
Felix Evert ◽  
Svetlana Erokhina ◽  
Laura Pastorino


2012 ◽  
Vol 9 (1) ◽  
pp. 142-146
Author(s):  
O.A. Solnyshkina

In this work, the 3D dynamics of two immiscible liquids in an unbounded domain at low Reynolds numbers is considered. The numerical method is based on the boundary element method, which is very efficient for simulating three-dimensional problems in infinite domains. To accelerate the calculations and increase the problem size, a heterogeneous approach to parallelizing the computations on central (CPU) and graphics (GPU) processors is applied. To accelerate the iterative solver (GMRES) and overcome the limitations associated with the memory size of the computation system, the software component of the matrix-vector product
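As a rough illustration of the approach described above (a sketch under assumptions, not the authors' CPU/GPU implementation), the snippet below runs GMRES with a matrix-free matrix-vector product so that the dense kernel matrix is never stored; the row-block kernel evaluation is the part that would typically be offloaded to the GPU:

```python
# Minimal matrix-free GMRES sketch: the dense n x n kernel matrix is never formed,
# only its action on a vector is evaluated (in row blocks) inside the solver.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

n = 1000                                   # toy number of boundary elements
rng = np.random.default_rng(0)
sources = rng.random((n, 3))               # stand-in for boundary collocation points

def matvec(x):
    # Apply (I + K/n) x without storing K; this O(n^2) kernel sum is the
    # part a heterogeneous CPU/GPU implementation would accelerate.
    y = x.copy()
    block = 128
    for i0 in range(0, n, block):
        d = np.linalg.norm(sources[i0:i0 + block, None, :] - sources[None, :, :], axis=-1)
        d[d == 0.0] = np.inf               # drop self-interaction on the diagonal
        y[i0:i0 + block] += (1.0 / d) @ x / n
    return y

A = LinearOperator((n, n), matvec=matvec)
b = rng.random(n)
x, info = gmres(A, b, atol=1e-8)
print("converged" if info == 0 else f"GMRES stopped with info = {info}")
```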


2020 ◽  
Vol 26 (18) ◽  
pp. 2109-2115 ◽  
Author(s):  
Mikhail A. Panteleev ◽  
Anna A. Andreeva ◽  
Alexey I. Lobanov

Discovery and selection of potential targets are among the important issues in pharmacology. Even when all the reactions and proteins in a biological network are known, how does one choose the optimal target? Here, we review and discuss the application of computational methods to this problem, using the blood coagulation cascade as an example. The problem of correct antithrombotic targeting is critical for this system because, although several anticoagulants are currently available, all of them are associated with bleeding risks. The advantages and drawbacks of different sensitivity analysis strategies are considered, focusing on approaches that emphasize: 1) the functional modularity and multi-tasking nature of this biological network; and 2) the need to normalize hemostasis during anticoagulation therapy rather than completely suppress it. To illustrate this point, we show the possibility of differential regulation of the lag time and the endogenous thrombin potential in thrombin generation. These methods make it possible to identify the elements of the blood coagulation cascade that may serve as targets for the differential regulation of this system.
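The following toy sketch (a hypothetical activation/inhibition scheme, not the detailed coagulation models discussed in the review) illustrates the kind of local sensitivity analysis involved: each rate constant is perturbed by 1% and the responses of the lag time and the endogenous thrombin potential (area under the thrombin curve) are compared, which is what differential regulation of the two metrics means in practice:

```python
# Toy thrombin-generation model and local sensitivity of two metrics (lag time, ETP).
import numpy as np
from scipy.integrate import solve_ivp

def simulate(k_trig, k_act, k_inh, t_end=2000.0):
    # X -> Xa (trigger), Xa + P -> Xa + T (prothrombin activation), T -> 0 (inhibition)
    def rhs(t, y):
        X, Xa, P, T = y
        return [-k_trig * X,
                 k_trig * X,
                -k_act * Xa * P,
                 k_act * Xa * P - k_inh * T]
    sol = solve_ivp(rhs, (0.0, t_end), [1.0, 0.0, 1400.0, 0.0], max_step=1.0)
    t, T = sol.t, sol.y[3]
    lag = t[np.argmax(T > 0.05 * T.max())]                  # time to 5% of peak thrombin
    etp = float(np.sum(0.5 * (T[1:] + T[:-1]) * np.diff(t)))  # area under the thrombin curve
    return lag, etp

base = dict(k_trig=0.01, k_act=1e-4, k_inh=0.01)            # hypothetical rate constants
lag0, etp0 = simulate(**base)
for name in base:
    pert = dict(base)
    pert[name] *= 1.01                                      # +1% parameter perturbation
    lag, etp = simulate(**pert)
    print(f"{name}: d(lag)/lag = {100*(lag-lag0)/lag0:+.2f}%, "
          f"d(ETP)/ETP = {100*(etp-etp0)/etp0:+.2f}%")
```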


2021 ◽  
Vol 11 (6) ◽  
pp. 2458
Author(s):  
Ronald Roberts ◽  
Laura Inzerillo ◽  
Gaetano Di Mino

Road networks are critical infrastructure within any region, and it is imperative to maintain their condition for the safe and effective movement of goods and services. Road management therefore plays a key role in ensuring consistent, efficient operation. However, significant resources are required to perform the maintenance activities needed to achieve and maintain high levels of service. Pavement maintenance is typically very expensive, and decisions are needed about how to plan and prioritize interventions. Data are key to enabling adequate maintenance planning, but in many instances limited information is available, especially in small or under-resourced urban road authorities. This study develops a roadmap to help these authorities by using flexible data analysis and deep learning computational systems to highlight important factors within road networks, which are then used to construct models that can help predict future intervention timelines. A case study in Palermo, Italy was developed to demonstrate how the techniques can be applied to perform appropriate feature selection and build prediction models based on limited data sources. The workflow provides a pathway towards more effective pavement maintenance management practices using techniques that can be readily adapted to different environments. This takes another step towards automating these practices within the pavement management system.
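A minimal sketch of such a workflow is shown below, with hypothetical feature names and a random stand-in dataset (the study's actual variables and data are not reproduced here): a random forest ranks candidate features and predicts the time to the next intervention:

```python
# Feature ranking and intervention-time prediction on a small, synthetic road-section dataset.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 200                                           # deliberately small, as in limited-data settings
df = pd.DataFrame({
    "age_years":     rng.uniform(0, 30, n),
    "traffic_aadt":  rng.uniform(500, 20000, n),
    "crack_density": rng.uniform(0, 1, n),
    "rut_depth_mm":  rng.uniform(0, 25, n),
})
# Synthetic target: older, more cracked, heavily trafficked sections need work sooner.
df["years_to_intervention"] = np.clip(
    10 - 0.2 * df["age_years"] - 5 * df["crack_density"]
    - df["traffic_aadt"] / 10000 + rng.normal(0, 1, n), 0, None)

features = ["age_years", "traffic_aadt", "crack_density", "rut_depth_mm"]
X_tr, X_te, y_tr, y_te = train_test_split(df[features], df["years_to_intervention"],
                                          test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Feature selection: keep only the most informative variables for later, heavier models.
ranked = sorted(zip(features, model.feature_importances_), key=lambda p: -p[1])
print("feature importances:", ranked)
print("MAE on held-out sections:", mean_absolute_error(y_te, model.predict(X_te)))
```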


1999 ◽  
Vol 10 (07) ◽  
pp. 1205-1228 ◽  
Author(s):  
E. V. KRISHNAMURTHY

The requirements important for the success of quantum computation are stated. These requirements involve coherence-preserving Hamiltonians as well as exact integrability of the corresponding Feynman path integrals. We also explain the role of metric entropy in dynamical evolutionary systems and outline some of the open problems in the design of quantum computational systems. Finally, we observe that unless we understand quantum nondemolition measurements, quantum integrability, quantum chaos and the direction of the arrow of time, the quantum control and computational paradigms will remain elusive, and the design of systems based on quantum dynamical evolution may not be feasible.


Sensors ◽  
2021 ◽  
Vol 21 (8) ◽  
pp. 2642
Author(s):  
Godwin Asaamoning ◽  
Paulo Mendes ◽  
Denis Rosário ◽  
Eduardo Cerqueira

The study of multi-agent systems such as drone swarms has been intensified due to their cooperative behavior. Nonetheless, automating the control of a swarm is challenging as each drone operates under fluctuating wireless, networking and environment constraints. To tackle these challenges, we consider drone swarms as Networked Control Systems (NCS), where the control of the overall system is done enclosed within a wireless communication network. This is based on a tight interconnection between the networking and computational systems, aiming to efficiently support the basic control functionality, namely data collection and exchanging, decision-making, and the distribution of actuation commands. Based on a literature analysis, we do not find revision papers about design of drone swarms as NCS. In this review, we introduce an overview of how to develop self-organized drone swarms as NCS via the integration of a networking system and a computational system. In this sense, we describe the properties of the proposed components of a drone swarm as an NCS in terms of networking and computational systems. We also analyze their integration to increase the performance of a drone swarm. Finally, we identify a potential design choice, and a set of open research challenges for the integration of network and computing in a drone swarm as an NCS.
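The sketch below is a toy illustration of the NCS view, not material from the paper: each drone's controller only uses neighbor states that survive a lossy wireless broadcast, so the consensus control loop is literally closed over the network:

```python
# Position-consensus loop for a small swarm with packet loss between drones.
import numpy as np

rng = np.random.default_rng(1)
n_drones, steps, dt = 8, 300, 0.1
loss_prob = 0.3                              # probability that a state packet is dropped
gain = 0.5                                   # proportional control gain

pos = rng.uniform(-10, 10, size=(n_drones, 2))
# Each drone keeps its latest received estimate of every other drone's position.
last_known = np.repeat(pos[None, :, :], n_drones, axis=0).copy()

for _ in range(steps):
    # Networking system: every drone broadcasts its state; some packets are lost this cycle.
    delivered = rng.random((n_drones, n_drones)) > loss_prob
    np.fill_diagonal(delivered, True)        # a drone always knows its own state
    for i in range(n_drones):
        last_known[i, delivered[i]] = pos[delivered[i]]
    # Computational system: each drone steers toward the centroid of its latest estimates.
    target = last_known.mean(axis=1)
    pos = pos + dt * gain * (target - pos)

print("final spread around the rendezvous point:",
      np.linalg.norm(pos - pos.mean(axis=0), axis=1).max())
```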


Cancers ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 35
Author(s):  
Sahar Aghakhani ◽  
Naouel Zerrouk ◽  
Anna Niarakis

Fibroblasts, the most abundant cells in connective tissue, are key modulators of extracellular matrix (ECM) composition. These spindle-shaped cells are capable of synthesizing various extracellular matrix proteins and collagen. They also provide the structural framework (stroma) for tissues and play a pivotal role in the wound healing process. While they maintain ECM turnover and regulate several physiological processes, they can also undergo transformations in response to certain stimuli and display aggressive phenotypes that contribute to disease pathophysiology. In this review, we focus on the metabolic pathways of glucose and highlight metabolic reprogramming as a critical event that contributes to the transition of fibroblasts from quiescent to activated and aggressive cells. We also cover the emerging evidence that allows us to draw parallels between fibroblasts in autoimmune disorders, more specifically rheumatoid arthritis, and cancer. We link the metabolic changes of fibroblasts to the toxic environment created by the disease condition and discuss how targeting metabolic reprogramming could be employed in the treatment of such diseases. Lastly, we discuss Systems Biology approaches, and more specifically computational modeling, as a means to elucidate pathogenetic mechanisms and accelerate the identification of novel therapeutic targets.
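As a loose illustration of the computational modeling mentioned above, the toy Boolean network below (node names and rules are hypothetical, not a validated disease model) links an inflammatory stimulus to a glycolytic switch and fibroblast activation:

```python
# Tiny synchronous Boolean network: stimulus -> HIF1A -> glycolysis -> activated phenotype.
rules = {
    "Stimulus":   lambda s: s["Stimulus"],                 # external input, held fixed
    "HIF1A":      lambda s: s["Stimulus"],
    "Glycolysis": lambda s: s["HIF1A"],
    "OxPhos":     lambda s: not s["Glycolysis"],           # metabolic switch away from OxPhos
    "Activated":  lambda s: s["Glycolysis"] and s["Stimulus"],
}

def run(state, steps=10):
    # Synchronous update until (in this small network) a fixed point is reached.
    for _ in range(steps):
        state = {node: bool(rule(state)) for node, rule in rules.items()}
    return state

quiescent = {n: False for n in rules}
stimulated = dict(quiescent, Stimulus=True)
print("no stimulus :", run(quiescent))
print("stimulus    :", run(stimulated))
```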


Author(s):  
Teije de Jong

In this series of papers I attempt to provide an answer to the question of how the Babylonian scholars arrived at their mathematical theory of planetary motion. Papers I and II were devoted to system A theory of the outer planets and of the planet Venus. In this third and last paper I will study system A theory of the planet Mercury. Our knowledge of the Babylonian theory of Mercury is at present based on twelve Ephemerides and seven Procedure Texts. Three computational systems of Mercury are known, all of system A. System A1 is represented by nine Ephemerides covering the years 190 BC to 100 BC and system A2 by two Ephemerides covering the years 310 to 290 BC. System A3 is known from a Procedure Text and from Text M, an Ephemeris of the last evening visibility of Mercury for the years 424 to 403 BC. From an analysis of the Babylonian observations of Mercury preserved in the Astronomical Diaries and Planetary Texts we find: (1) that dates on which Mercury reaches its stationary points are not recorded, (2) that Normal Star observations on or near dates of first and last appearance of Mercury are rare (about once every twenty observations), and (3) that about one out of every seven pairs of first and last appearances is recorded as “omitted” when Mercury remains invisible due to a combination of the low inclination of its orbit to the horizon and the attenuation by atmospheric extinction. To be able to study the way in which the Babylonian scholars constructed their system A models of Mercury from the available observational material, I have created a database of synthetic observations by computing the dates and zodiacal longitudes of all first and last appearances and of all stationary points of Mercury in Babylon between 450 and 50 BC. Of the data required for the construction of an ephemeris, synodic time intervals Δt can be directly derived from observed dates, but zodiacal longitudes and synodic arcs Δλ must be determined in some other way. Because for Mercury positions with respect to Normal Stars can only rarely be determined at its first or last appearance, I propose that the Babylonian scholars used the relation Δλ = Δt − 3;39,40, which follows from the period relations, to compute synodic arcs of Mercury from the observed synodic time intervals. An additional difficulty in the construction of System A step functions is that most amplitudes are larger than the associated zone lengths, so that in the computation of the longitudes of the synodic phases of Mercury quite often two zone boundaries are crossed. This complication makes it difficult to understand how the Babylonian scholars managed to construct System A models for Mercury that fitted the observations so well, because it requires an excessive amount of computational effort to find the best possible step function in a complicated trial-and-error fitting process with four or five free parameters. To circumvent this difficulty I propose that the Babylonian scholars used an alternative, more direct method to fit System A-type models to the observational data of Mercury. This alternative method is based on the fact that after three synodic intervals Mercury returns to a position in the sky which is on average only 17.4° less in longitude. Using reduced amplitudes of about 14°–25° but keeping the same zone boundaries, the computation of what I will call 3-synarc system A models of Mercury is significantly simplified.
A full ephemeris of a synodic phase of Mercury can then be composed by combining three columns of longitudes computed with 3-synarc step functions, each column starting with a longitude of Mercury one synodic event apart. Confirmation that this method was indeed used by the Babylonian astronomers comes from Text M (BM 36551+), a very early ephemeris of the last appearances in the evening of Mercury from 424 to 403 BC, computed in three columns according to System A3. Based on an analysis of Text M, I suggest that around 400 BC the initial approach in system A modelling of Mercury may have been directed towards choosing “nice” sexagesimal numbers for the amplitudes of the system A step functions, while in the later final models, dating from around 300 BC onwards, more emphasis was put on selecting numerical values for the amplitudes such that they were related by simple ratios. The fact that different ephemeris periods were used for each of the four synodic phases of Mercury in the later models may be related to the selection of a best-fitting set of System A step function amplitudes for each synodic phase.
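The sketch below illustrates, with made-up zone boundaries and amplitudes rather than the actual Babylonian parameters for Mercury, how a System A-type step function advances a longitude by a zone-dependent synodic arc and rescales the remainder when one or more zone boundaries are crossed; it also encodes the quoted relation Δλ = Δt − 3;39,40:

```python
# System A-style step function with hypothetical zones (start longitude, amplitude in degrees).
zones = [
    (0.0, 106.0),
    (120.0, 94.0),
    (250.0, 114.0),
]

def zone_index(lon):
    # Find the zone containing longitude lon, allowing the last zone to wrap past 360 deg.
    for i, (lo, _) in enumerate(zones):
        hi = zones[(i + 1) % len(zones)][0]
        if (lo < hi and lo <= lon < hi) or (lo > hi and (lon >= lo or lon < hi)):
            return i
    raise ValueError(lon)

def next_longitude(lon):
    """Advance one synodic event, handling one or more zone-boundary crossings."""
    arc = zones[zone_index(lon)][1]              # full synodic arc of the starting zone
    while True:
        i = zone_index(lon)
        nxt = (i + 1) % len(zones)
        room = (zones[nxt][0] - lon) % 360.0     # distance to the next zone boundary
        if arc <= room:
            return (lon + arc) % 360.0
        # Boundary crossed: the leftover arc is rescaled by the ratio of the amplitudes.
        arc = (arc - room) * zones[nxt][1] / zones[i][1]
        lon = zones[nxt][0]

def synodic_arc_from_time(dt_tithis):
    """The relation Δλ = Δt − 3;39,40 quoted above, with 3;39,40 read as sexagesimal."""
    return dt_tithis - (3 + 39 / 60 + 40 / 3600)

lon = 10.0
for k in range(5):
    lon = next_longitude(lon)
    print(f"event {k + 1}: longitude {lon:7.2f} deg")
```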


1994 ◽  
Vol 05 (02) ◽  
pp. 313-315 ◽  
Author(s):  
CHARLES H. ANDERSON

This paper explores some basic representational and implementation issues arising from the premise that cortical circuits operate on probability density functions to reason about analog quantities. Some insight is provided into why neurobiological systems can appear messy while at the same time providing a rich and robust computational environment.
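A minimal sketch of this premise (illustrative only, not the paper's neural implementation): represent an analog quantity by a discretized probability density for each of two noisy cues and combine them by multiplying densities, the kind of computation a population code would need to support:

```python
# Combine two noisy estimates of an analog quantity by multiplying their densities.
import numpy as np

x = np.linspace(-5, 5, 1001)                     # discretized analog variable

def gaussian(mu, sigma):
    p = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return p / p.sum()

cue_a = gaussian(mu=-0.5, sigma=1.0)             # e.g. a visual estimate
cue_b = gaussian(mu=1.0, sigma=0.5)              # e.g. a proprioceptive estimate
posterior = cue_a * cue_b                        # pointwise product of densities
posterior /= posterior.sum()

mean = (x * posterior).sum()
var = ((x - mean) ** 2 * posterior).sum()
print(f"combined estimate: mean = {mean:.3f}, std = {np.sqrt(var):.3f}")  # tighter than either cue
```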


2011 ◽  
Vol 21 (2) ◽  
pp. 323-336 ◽  
Author(s):  
Nicola Angius ◽  
Guglielmo Tamburrini
