Biology by Design: From Top to Bottom and Back

2010 ◽  
Vol 2010 ◽  
pp. 1-11 ◽  
Author(s):  
Brian R. Fritz ◽  
Laura E. Timmerman ◽  
Nichole M. Daringer ◽  
Joshua N. Leonard ◽  
Michael C. Jewett

Synthetic biology is a nascent technical discipline that seeks to enable the design and construction of novel biological systems to meet pressing societal needs. However, engineering biology still requires much trial and error because we lack effective approaches for connecting basic “parts” into higher-order networks that behave as predicted. Developing strategies for improving the performance and sophistication of our designs is informed by two overarching perspectives: “bottom-up” and “top-down” considerations. Using this framework, we describe a conceptual model for developing novel biological systems that function and interact with existing biological components in a predictable fashion. We discuss this model in the context of three topical areas: biochemical transformations, cellular devices and therapeutics, and approaches that expand the chemistry of life. Ten years after the construction of synthetic biology's first devices, the drive to look beyond what does exist to what can exist is ushering in an era of biology by design.

Science ◽  
2011 ◽  
Vol 333 (6047) ◽  
pp. 1252-1254 ◽  
Author(s):  
Petra Schwille

How synthetic can “synthetic biology” be? A literal interpretation of the name of this new life science discipline invokes expectations of the systematic construction of biological systems with cells being built module by module—from the bottom up. But can this possibly be achieved, taking into account the enormous complexity and redundancy of living systems, which distinguish them quite remarkably from design features that characterize human inventions? There are several recent developments in biology, in tight conjunction with quantitative disciplines, that may bring this literal perspective into the realm of the possible. However, such bottom-up engineering requires tools that were originally designed by nature’s greatest tinkerer: evolution.


2010 ◽  
Vol 16 (1) ◽  
pp. 89-97 ◽  
Author(s):  
Mark A. Bedau ◽  
John S. McCaskill ◽  
Norman H. Packard ◽  
Steen Rasmussen

The concept of living technology—that is, technology that is based on the powerful core features of life—is explained and illustrated with examples from artificial life software, reconfigurable and evolvable hardware, autonomously self-reproducing robots, chemical protocells, and hybrid electronic-chemical systems. We define living technology as primary or secondary according to whether its key material components and core systems are not, or are, derived from living organisms. Primary living technology is currently emerging, distinctive, and potentially powerful, motivating this review. We trace living technology's connections with artificial life (soft, hard, and wet), synthetic biology (top-down and bottom-up), and the convergence of nano-, bio-, information, and cognitive (NBIC) technologies. We end with a brief look at the social and ethical questions generated by the prospect of living technology.


2019 ◽  
Author(s):  
Graham Keenan ◽  
Daniel Salley ◽  
Sergio Martin ◽  
Jonathan Grizou ◽  
Abhishek Sharma ◽  
...  

The fabrication of nanomaterials from the top down gives precise structures but is costly, whereas bottom-up assembly methods are found by trial and error. In nature, materials discovery evolves autonomously: blueprints are refined and transmitted through DNA mutations. Genetically inspired optimisation has been used in a range of applications, from catalysis to light-emitting materials, but these approaches are not autonomous and do not use physical mutations. Here we present an autonomously driven materials-evolution robotic platform that can run for many cycles, reliably discovering the conditions to produce gold nanoparticles and finding entirely new systems using their opto-electronic properties as the driver. Not only can we reliably discover a digitally encoded method to synthesise these materials, we can also seed in materials from preceding generations to engineer more sophisticated architectures. Over three cycles of evolution we show that the seeds from each generation can produce spherical nanoparticles, rods, and highly anisotropic arrow-faceted nanoparticles.
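At its core, the platform described above runs an evolutionary loop: candidate reaction conditions are scored by an opto-electronic (spectral) fitness, the best performers are mutated, and the cycle repeats, with later generations optionally seeded by earlier products. The Python sketch below illustrates that kind of loop under stated assumptions; the parameter names, ranges, mutation scheme, and fitness function are illustrative placeholders, not the authors' actual implementation.

```python
import random

# Hypothetical reaction parameters (names and ranges are illustrative only).
PARAM_RANGES = {
    "reductant_vol_ul": (10.0, 200.0),
    "gold_salt_vol_ul": (10.0, 200.0),
    "seed_vol_ul": (0.0, 100.0),
}

def random_candidate():
    """Draw one set of reaction conditions uniformly from the allowed ranges."""
    return {k: random.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}

def mutate(candidate, rate=0.2):
    """Perturb each parameter with small Gaussian noise, clipped to its range."""
    child = {}
    for k, (lo, hi) in PARAM_RANGES.items():
        value = candidate[k] + random.gauss(0.0, rate * (hi - lo))
        child[k] = min(hi, max(lo, value))
    return child

def fitness(candidate):
    """Placeholder for the opto-electronic score.

    On the real platform this would be derived from the measured UV-Vis
    spectrum of the synthesised particles (e.g. peak position and sharpness);
    here it is a synthetic function so the sketch runs stand-alone.
    """
    target = {"reductant_vol_ul": 120.0, "gold_salt_vol_ul": 60.0, "seed_vol_ul": 30.0}
    return -sum((candidate[k] - target[k]) ** 2 for k in target)

def evolve(generations=20, population_size=12, elite=3):
    """Run a minimal (mu + lambda)-style evolutionary search."""
    population = [random_candidate() for _ in range(population_size)]
    for gen in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:elite]          # best-scoring conditions survive
        children = [mutate(random.choice(parents))
                    for _ in range(population_size - elite)]
        population = parents + children
        print(f"gen {gen:2d}  best fitness {fitness(population[0]):.1f}")
    return population[0]

if __name__ == "__main__":
    best = evolve()
    print("best conditions found:", best)
```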


1984 ◽  
Vol 28 (6) ◽  
pp. 510-510
Author(s):  
Virginia A. Rappold ◽  
John L. Sibert

The purpose of this case study was to document and evaluate the application of a top-down design methodology (Foley & van Dam, 1982) to a pre-existing computer system, to test the methodology's usefulness as well as to gain insights into the design process itself. System experts advocate designing a system “top-down” instead of “bottom-up” as a way to sequentially examine the complex task of interface design while allowing re-examination of previous steps in that design (Foley, 1981). The study involved a menu-based, mini-computer system designed at Goddard Space Flight Center called the Mission Planning Terminal (MPT). The MPT will be used at Goddard for planning and scheduling of satellite activities through the NASA Network Control Center (NCC). The scheduler/analyst's task includes submitting a schedule of activities for his mission, transmitting it to the NCC, and then modifying the returned schedule, if necessary, using the MPT. The top-down design process is divided into four distinct phases: conceptual, semantic, syntactic, and lexical (Foley, 1981). The first phase, conceptual design, consists of defining the key application concepts needed by the user. The semantic phase involves defining meanings, such as the information needed in order to use an object. The syntactic design defines sequences of inputs (similar to English grammar rules) and outputs (the two- and three-dimensional organization of the display). The last step, lexical design, describes how words in the input/output sequence are formed from the existing hardware input (Foley & van Dam, 1982). The top-down methodology was applied using MPT documentation and interviews with the designers. During this process, it became clear that although a conceptual model of the MPT existed somewhere, it was never recorded. This led to numerous attempts to extract the main conceptual components of the system from the software operations documents, which were constantly changing and often incomplete. Finally, based on preliminary screen designs, state diagrams were constructed to map out components of the system. Characterizing the MPT in this way (using state diagrams) produced a clearer picture that finally led to an understanding of the conceptual model. Once the conceptual model was extracted, redesign of the system, using the top-down method, quickly followed. This case study clearly emphasizes the need for a complete and accurate conceptual model if a top-down approach is to be applied. When redesigning an existing system, it frequently becomes necessary to “extract” this model in a bottom-up manner, as was the case here.
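The state-diagram step described above amounts to reconstructing the system as an explicit state machine: screens become states and user inputs become transitions, which can then be inspected for completeness. A minimal Python sketch of that representation follows; the screen and input names are hypothetical, not drawn from the MPT documentation.

```python
from collections import deque

# A hypothetical fragment of a menu-based interface, reconstructed as a state
# machine: each screen (state) maps user inputs to the screen they lead to.
TRANSITIONS = {
    "main_menu":       {"1": "build_schedule", "2": "review_schedule", "q": "exit"},
    "build_schedule":  {"submit": "transmit_to_ncc", "back": "main_menu"},
    "transmit_to_ncc": {"ack": "review_schedule", "error": "build_schedule"},
    "review_schedule": {"modify": "build_schedule", "accept": "exit"},
    "exit":            {},
}

def reachable_states(start="main_menu"):
    """Breadth-first walk over the transition table.

    Listing the states reachable from the entry screen is one concrete way to
    check that the extracted model covers every documented screen and has no
    orphaned components.
    """
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        for nxt in TRANSITIONS.get(state, {}).values():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

if __name__ == "__main__":
    missing = set(TRANSITIONS) - reachable_states()
    print("unreachable screens:", missing or "none")
```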


2010 ◽  
Vol 18 (3) ◽  
pp. 481-495
Author(s):  
Jonathan Cole

This paper introduces the background to the debate addressed by the papers of this Special Issue of Pragmatics & Cognition. Starting with a definition of consciousness, it traces some ways in which the term is applied, from clinical medicine, where it relates somewhat crudely to responsiveness to external stimuli, to more cognitive and philosophical aspects such as higher-order consciousness and its content. It then discusses the relation of consciousness to brain anatomy, the neural correlates of consciousness, and its possible evolution. Christof Koch also made important contributions at the meeting that forms the basis for Frith’s core paper; these are précised here. The paper closes with a discussion of the origins of consciousness in relation to the top-down and bottom-up models brought to the fore.


2020 ◽  
Vol 16 (1) ◽  
pp. 1-24 ◽  
Author(s):  
Thomas M. Achenbach

Bottom-up paradigms prioritize empirical data from which to derive conceptualizations of psychopathology. These paradigms use multivariate statistics to identify syndromes of problems that tend to co-occur, plus higher-order groupings such as those designated internalizing and externalizing. Bottom-up assessment instruments obtain self-ratings and collateral ratings of behavioral, emotional, social, and thought problems and strengths for ages 1½–90+. Ratings of population samples provide norms for syndrome and higher-order scales for each gender, at different ages, as rated by different informants, in relation to multicultural norms. The normed assessment instruments operationalize the empirically derived syndromes and higher-order groupings for applications to clinical services, research, and training. Because cross-informant agreement is modest and no single informant provides comprehensive assessment data, software compares ratings by different informants. Top-down paradigms prioritize conceptual representations of the nature and structure of psychopathology, as exemplified by the psychodynamic, DSM/ICD, and HiTOP paradigms. Although these paradigms originated with observations, they tend to prioritize conceptual representations over empirical data.
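Because cross-informant agreement is modest, the comparison step mentioned above is essentially a matter of correlating scale scores from different raters and flagging large discrepancies. The short Python sketch below illustrates this with made-up ratings; the scores, the 10-point discrepancy cut-off, and the scale itself are illustrative assumptions, not values from any published instrument.

```python
from statistics import correlation  # requires Python 3.10+

# Hypothetical T-scores on one empirically derived syndrome scale, rated by
# two informants for the same ten children (values are illustrative only).
parent_ratings  = [62, 55, 70, 48, 66, 59, 73, 51, 64, 57]
teacher_ratings = [58, 50, 65, 52, 61, 63, 68, 47, 60, 55]

# Cross-informant agreement is typically summarised by a correlation of this
# kind; because such correlations are modest, software compares the ratings
# side by side rather than averaging them away.
r = correlation(parent_ratings, teacher_ratings)
print(f"parent-teacher agreement r = {r:.2f}")

# A simple side-by-side comparison flags children whose ratings diverge by
# more than a chosen threshold (here 10 T-score points, an arbitrary cut).
for i, (p, t) in enumerate(zip(parent_ratings, teacher_ratings), start=1):
    if abs(p - t) > 10:
        print(f"child {i}: parent {p} vs teacher {t} -- review discrepancy")
```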


2019 ◽  
Author(s):  
Christopher A. Brown ◽  
Ingrid Scholtes ◽  
Nicholas Shenker ◽  
Michael C. Lee

In Complex Regional Pain Syndrome (CRPS), tactile sensory deficits have motivated the therapeutic use of sensory discrimination training. However, the hierarchical organisation of the brain is such that low-level sensory processing can be dynamically influenced by higher-level knowledge, e.g. knowledge learnt from statistical regularities in the environment. It is unknown whether the learning of such statistical regularities is impaired in CRPS. Here, we employed a hierarchical Bayesian model of predictive coding to investigate statistical learning of tactile-spatial predictions in CRPS. Using a sensory change-detection task, we manipulated bottom-up (spatial displacement of a tactile stimulus) and top-down (probabilistic structure of occurrence) factors to estimate hierarchies of prediction and prediction-error signals, as well as their respective precisions (reliability). Behavioural responses to spatial changes were influenced by both the magnitude of spatial displacement (bottom-up) and the learnt probabilities of change (top-down). The Bayesian model revealed that patients’ predictions of spatial displacements were less precise, deviating further from statistical optimality than those of healthy controls. This imprecision was less context-dependent, i.e. more enduring across changes in probabilistic context and less finely tuned to the statistics of the environment. The result was greater relative precision on prediction errors, so that predictions were driven more by momentary spatial changes and less by the history of spatial changes. These results suggest inefficiencies in higher-order statistical learning in CRPS. This may have implications for therapies based on sensory re-training, whose effects may be more short-lived if their success depends on higher-order learning.
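The core mechanism at issue, precision-weighted updating of predictions by prediction errors, can be illustrated in a few lines of code. The Python sketch below is a deliberately simplified, single-level version of that idea, not the hierarchical model fitted in the study; the observation series and precision values are arbitrary.

```python
# A minimal precision-weighted updating scheme: the belief about stimulus
# location is revised by the prediction error, scaled by how reliable the
# sensory input is relative to the current prediction.

def update_belief(belief, belief_precision, observation, sensory_precision):
    """One step of precision-weighted prediction-error updating (Gaussian case)."""
    prediction_error = observation - belief
    learning_rate = sensory_precision / (sensory_precision + belief_precision)
    new_belief = belief + learning_rate * prediction_error
    new_precision = belief_precision + sensory_precision
    return new_belief, new_precision

# Toy comparison: an imprecise prior (low belief precision) yields updates
# dominated by each new observation, i.e. driven by momentary changes rather
# than by the history of changes -- the general pattern reported for patients.
observations = [0.0, 0.1, -0.1, 2.0, 1.9, 2.1]   # a spatial displacement series

for label, prior_precision in [("precise prior", 10.0), ("imprecise prior", 0.5)]:
    belief, precision = 0.0, prior_precision
    for obs in observations:
        belief, precision = update_belief(belief, precision, obs, sensory_precision=1.0)
    print(f"{label}: final belief {belief:.2f}")
```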


2018 ◽  
Vol 8 (5) ◽  
pp. 20180024 ◽  
Author(s):  
Tatiana Trantidou ◽  
Linda Dekker ◽  
Karen Polizzi ◽  
Oscar Ces ◽  
Yuval Elani

The design of vesicle microsystems as artificial cells (bottom-up synthetic biology) has traditionally relied on the incorporation of molecular components to impart functionality. These cell mimics have reduced capabilities compared with their engineered biological counterparts (top-down synthetic biology), as they lack the powerful metabolic and regulatory pathways associated with living systems. There is increasing scope for using whole, intact cellular components as functional modules within artificial cells, as a route to increasing their capabilities. In this feasibility study, we design and embed genetically engineered microbes (Escherichia coli) in a vesicle-based cell mimic and use them as biosensing modules for real-time monitoring of lactate in the external environment. Using this conceptual framework, the functionality of other microbial devices can be conferred into vesicle microsystems in the future, bridging the gap between bottom-up and top-down synthetic biology.
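Turning the encapsulated cells' output into a lactate reading ultimately requires a calibration curve relating reporter signal to external concentration. The Python sketch below shows one common way to express this with a Hill-type dose-response model; the parameter values and readouts are illustrative assumptions, not measurements from the study.

```python
# Hypothetical calibration of a whole-cell biosensor: fluorescence output of
# the encapsulated cells as a function of external lactate, modelled with a
# Hill-type dose-response curve. All parameter values are placeholders.
F_MIN, F_MAX = 100.0, 1200.0   # baseline and saturating fluorescence (a.u.)
K_HALF = 2.0                   # lactate concentration at half-maximal response (mM)
HILL_N = 1.5                   # cooperativity of the response

def fluorescence(lactate_mM):
    """Forward model: expected readout at a given lactate concentration."""
    x = lactate_mM ** HILL_N
    return F_MIN + (F_MAX - F_MIN) * x / (K_HALF ** HILL_N + x)

def estimate_lactate(readout):
    """Invert the calibration curve to report lactate from a measured readout."""
    frac = (readout - F_MIN) / (F_MAX - F_MIN)
    frac = min(max(frac, 1e-6), 1 - 1e-6)       # keep the inversion numerically safe
    return K_HALF * (frac / (1 - frac)) ** (1 / HILL_N)

if __name__ == "__main__":
    # Real-time monitoring in miniature: each readout is mapped back to a
    # lactate estimate as it arrives.
    for reading in [150.0, 400.0, 800.0, 1100.0]:
        print(f"readout {reading:6.1f} a.u. -> ~{estimate_lactate(reading):.2f} mM lactate")
```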


Synlett ◽  
2018 ◽  
Vol 29 (14) ◽  
pp. 1823-1835 ◽  
Author(s):  
Seth Herzon

Emergence is the phenomenon by which novel properties arise from the combination of simpler fragments that lack those properties at their given levels of hierarchical complexity. Emergence is a centuries-old concept that is commonly invoked in biological systems. However, the penetration of this idea into chemistry, and into studies of natural products in particular, has been more limited. In this article I will describe how the perspective of emergence provided a framework to elucidate the complex properties of two classes of natural products, the diazofluorene antitumor agent lomaiviticin A and the genotoxic bacterial metabolites known as colibactins, and how it sets the stage for a third class of molecules, the antibiotics derived from the fungal metabolite pleuromutilin. Embracing the idea of emergence helped us to connect the aggregate reactivities of the colibactins and lomaiviticin A with their biological phenotypes. Emergence is a top-down approach to natural products and complements the classical bottom-up analysis of functional-group structure and reactivity. It is a useful intellectual framework for studying the complex evolved properties of natural products.
1 Introduction
2 Diazofluorenes
3 Precolibactins and Colibactins
4 Pleuromutilins
5 Discussion and Conclusion

