Generating minimal living systems from non-living materials and increasing their evolutionary abilities

2016 ◽  
Vol 371 (1701) ◽  
pp. 20150440 ◽  
Author(s):  
Steen Rasmussen ◽  
Adi Constantinescu ◽  
Carsten Svaneborg

We review lessons learned about evolutionary transitions from a bottom-up construction of minimal life. We use a particular systemic protocell design process as a starting point for exploring two fundamental questions: (i) how may minimal living systems emerge from non-living materials? and (ii) how may minimal living systems support increasing evolutionary richness? Under (i), we present what has been accomplished so far and discuss the remaining open challenges and their possible solutions. Under (ii), we present a design principle we have used successfully in both our computational and experimental protocellular investigations, and we conjecture how this design principle can be extended to enhance the evolutionary potential of a wide range of systems. This article is part of the themed issue ‘The major synthetic evolutionary transitions’.

2021 ◽  
Author(s):  
Farhan Ali ◽  

Thinking creatively is a necessary condition of the design process: it transforms ideas into novel solutions and breaks barriers to creativity. Although there are many techniques for stimulating designers' creative thinking, this research paper adopts SCAMPER, an acronym for Substitute, Combine, Adapt, Modify or Magnify, Put to another use, Eliminate, and Reverse or Rearrange, to integrate sustainability concepts within the architectural design process. Many creative artifacts have been designed consciously or unconsciously following SCAMPER strategies, such as rehabilitation and reuse projects that improve the functional performance or aesthetic sense of an existing building. SCAMPER is recognized as a divergent-thinking tool used during the initial ideation stage; it aims to depart from the usual way of thinking in order to generate a wide range of new ideas that lead to new insights, original ideas, and creative solutions to problems. The research focuses on applying this method in architectural design, where it has rarely been studied, by reviewing seven examples that were designed consciously or unconsciously using SCAMPER techniques. The paper aims to establish a starting point for further research into SCAMPER's potential for solving architectural design problems.


Author(s):  
Wayne Walter ◽  
Edward Hensel

During academic year 2006–07, a family of four closely related multi-disciplinary senior design projects was initiated. Each project team consisted of eight undergraduate students. The family of projects continued during academic year 2007–08 with three additional design projects comprising 19 students. The intent of the family of design projects is two-fold. The first objective is to introduce students to the concept of designing a product within the context of a family of closely related products, similar to the approach a corporation may use in its strategic approach to the marketplace. The second objective is to provide an open-source, open-architecture, modular, and scalable robotic vehicle platform usable by a wide range of researchers within the Kate Gleason College of Engineering looking for a vehicle to position cameras and networked sensors, and for other data-gathering tasks. Students were given the challenge of designing and manufacturing a platform based on a single design, scalable across four payload sizes from 1 kg to 1,000 kg. The 10 kg and 100 kg variants were studied in AY2006–07, and the 1 kg variant was introduced in AY2007–08. The largest, at 1,000 kg, planned for the future, will be about the size of a Honda Civic, so safety and fail-safe engineering are important. Each project in the family is expected to build on the technology used and lessons learned from prior and concurrent projects, much like the “next model year” in the auto industry, with information-sharing requirements among concurrent engineering teams. Hardware, software, and design methods are reused whenever possible, and students are expected to develop their subsystem in the context of an evolutionary platform design. In this manner, the end-product of one design group becomes the starting point for another team. Responsibilities overlap, so teams must work cooperatively, which mimics the industrial environment.
Starting times on the various projects may be staggered, and students must deal with documentation-sharing issues and the preservation of design intent across multi-project teams and academic terms. The paper discusses the current status of the program, the lessons learned to date, and future plans for the program.


2021 ◽  
Author(s):  
Gijs Jan Molenaar

The preparation for the construction of the Square Kilometre Array, and the introduction of its operational precursors such as LOFAR and MeerKAT, mark the beginning of an exciting era for astronomy. Impressive new data containing valuable science just waiting for discovery are already being generated, and these instruments will produce far more data than has ever been collected before. However, with every new instrument, data rates grow to unprecedented levels, requiring novel data-processing tools. In addition, creating science-grade data from the raw data still requires significant expert knowledge. The software used is often developed by scientists who lack formal training in software development, so it rarely progresses beyond prototype quality. In the first chapter, we explore various organisational and technical approaches to addressing these issues by providing a historical overview of the development of radio-astronomy pipelines since the inception of the field in the 1940s, investigating the steps required to create a radio image. We used the lessons learned to identify patterns in the challenges experienced and in the solutions created to address them over the years. The second chapter describes the mathematical foundations that are essential for radio imaging. In the third chapter, we discuss the production of the KERN Linux distribution, a set of software packages containing most radio-astronomy software currently in use. Considerable effort was put into making sure that the contained software installs properly, with all items coexisting on the same system. Where required and possible, bugs and portability problems were fixed and reported to the upstream maintainers. The KERN project also has a website and issue tracker, where users can report bugs and maintainers can coordinate the packaging effort and new releases.
The software packages can be used inside Docker and Singularity containers, enabling their installation on a wide variety of platforms. In the fourth and fifth chapters, we discuss methods and frameworks for combining the available data-reduction tools into recomposable pipelines and introduce the Kliko specification and software. This framework was created to enable end-user astronomers to chain and containerise operations of software in KERN packages. Next, we discuss the Common Workflow Language (CommonWL), a similar but more advanced and mature pipeline framework invented by bioinformatics scientists. CommonWL is already supported by a wide range of tools, among them schedulers, visualisers, and editors; consequently, a pipeline written in CommonWL can be deployed and manipulated with a wide range of tools. In the final chapter, we attempt something unconventional: applying a generative adversarial network based on deep-learning techniques to the task of sky-brightness reconstruction. Since deep-learning methods often require a large number of training samples, we constructed a CommonWL simulation pipeline for creating dirty images and corresponding sky models. This simulated dataset has been made publicly available as the ASTRODECONV2019 dataset. It is shown that this method performs the restoration and matches the performance of a single clean cycle. In addition, we incorporated domain knowledge by adding the point spread function to the network and by utilising a custom loss function during training. Although it was not possible to improve on the cleaning performance of commonly used existing tools, the computational time of the approach looks very promising. We suggest that a smaller scope should be the starting point for further studies and that optimising the training of the neural network could produce the desired results.
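To illustrate the kind of pipeline description CommonWL enables, the following is a minimal, assumed sketch of a two-step workflow chaining a calibration step into an imaging step; the tool wrappers `calibrate.cwl` and `image.cwl` are hypothetical placeholders, not tools from the thesis:

```yaml
# Hypothetical two-step CommonWL workflow: calibrate a measurement set,
# then image the calibrated visibilities.
cwlVersion: v1.0
class: Workflow

inputs:
  ms_file: File          # raw measurement set

outputs:
  dirty_image:
    type: File
    outputSource: imaging/image

steps:
  calibrate:
    run: calibrate.cwl   # hypothetical CommandLineTool wrapper
    in:
      vis: ms_file
    out: [calibrated]

  imaging:
    run: image.cwl       # hypothetical wrapper around an imager
    in:
      vis: calibrate/calibrated
    out: [image]
```

Because each step references a self-contained tool description, a CommonWL runner can execute the steps in containers and schedule them on any supported platform, which is what makes pipelines written this way deployable with such a wide range of tools.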


2020 ◽  
Author(s):  
Eleonora Diamanti ◽  
Inda Setyawati ◽  
Spyridon Bousis ◽  
Leticia Mojas ◽  
Lotteke Swier ◽  
...  

Here, we report on the virtual screening, design, synthesis, and structure–activity relationships (SARs) of the first class of selective antibacterial agents against the energy-coupling factor (ECF) transporters. The ECF transporters are a family of transmembrane proteins involved in the uptake of vitamins in a wide range of bacteria. Inhibition of the activity of these proteins could reduce the viability of pathogens that depend on vitamin uptake. Because of their central role in the metabolism of bacteria and their absence in humans, ECF transporters are novel potential antimicrobial targets to tackle infection. The hit compound’s metabolic and plasma stability, its potency (compound 20, MIC against Streptococcus pneumoniae = 2 µg/mL), the absence of cytotoxicity, and the lack of resistance development under the conditions tested here suggest that this scaffold may represent a promising starting point for the development of novel antimicrobial agents with an unprecedented mechanism of action.


2021 ◽  
Vol 13 (3) ◽  
pp. 1589
Author(s):  
Juan Sánchez-Fernández ◽  
Luis-Alberto Casado-Aranda ◽  
Ana-Belén Bastidas-Manzano

The limitations of self-report techniques (i.e., questionnaires or surveys) in measuring consumer response to advertising stimuli have necessitated more objective and accurate tools from the fields of neuroscience and psychology for the study of consumer behavior, resulting in the creation of consumer neuroscience. This recent marketing sub-field draws on a wide range of disciplines and applies multiple types of techniques to diverse advertising subdomains (e.g., advertising constructs, media elements, or prediction strategies). Due to its complex nature and continuous growth, this area of research calls for a clear understanding of its evolution, current scope, and potential domains in the field of advertising. Thus, this research is among the first to apply a bibliometric approach to clarify the main research streams analyzing advertising persuasion using neuroimaging. In particular, this paper combines a comprehensive review with performance-analysis tools applied to 203 papers published between 1986 and 2019 in outlets indexed by the ISI Web of Science database. Our findings describe the research tools, journals, and themes that are worth considering in future research. The study also provides an agenda for future research and therefore constitutes a starting point for advertising academics and professionals intending to use neuroimaging techniques.


Author(s):  
David Callaway ◽  
Jeff Runge ◽  
Lucia Mullen ◽  
Lisa Rentz ◽  
Kevin Staley ◽  
...  

Abstract The United States Centers for Disease Control and Prevention and the World Health Organization broadly categorize mass gathering events as high risk for amplification of coronavirus disease 2019 (COVID-19) spread in a community due to the nature of respiratory diseases and the transmission dynamics. However, various measures and modifications can be put in place to limit or reduce the risk of further spread of COVID-19 for the mass gathering. During this pandemic, the Johns Hopkins University Center for Health Security produced a risk assessment and mitigation tool for decision-makers to assess SARS-CoV-2 transmission risks that may arise as organizations and businesses hold mass gatherings or increase business operations: The JHU Operational Toolkit for Businesses Considering Reopening or Expanding Operations in COVID-19 (Toolkit). This article describes the deployment of a data-informed, risk-reduction strategy that protects local communities, preserves local health-care capacity, and supports democratic processes through the safe execution of the Republican National Convention in Charlotte, North Carolina. The successful use of the Toolkit and the lessons learned from this experience are applicable in a wide range of public health settings, including school reopening, expansion of public services, and even resumption of health-care delivery.


2021 ◽  
Vol 13 (8) ◽  
pp. 4492
Author(s):  
Janka Saderova ◽  
Andrea Rosova ◽  
Marian Sofranko ◽  
Peter Kacmary

The warehouse process, as one of many logistics processes, currently holds an irreplaceable position in logistics systems in companies and in the supply chain. The proper functioning of warehouse operations depends on, among other things, the type of technology used and its utilization. The research in this article is focused on the design of a warehouse system. The selection of a suitable warehouse system is a current research topic, as the warehouse system affects warehouse capacity and utilization as well as the speed of storage activities. The paper presents a warehouse-system design methodology based on the logistics principle of a systematic (system) approach. The starting point for designing a warehouse system is the process of designing logistics systems. The design process consists of several phases: project identification, design-process paradigm selection, system analysis, synthesis, and project evaluation. This article's contribution is the proposed methodology and the design of a warehouse system for the specified conditions. The methodology was implemented for the design of a warehouse system in a cold box, which is part of a distribution warehouse. Pallet racking was chosen as the storage technology, with pallets stored and retrieved by forklifts. For the specified conditions, the warehouse system was designed in two alternative racking assemblies served by forklifts: Alternative 1, a standard pallet rack with wide aisles, and Alternative 2, a pallet dynamic flow rack. The proposed systems were compared on the basis of selected indicators: capacity (the number of pallet places in the system), the percentage of the box area used for storage, the percentage of the box area used for handling aisles, forklift access to individual pallets, and investment cost per pallet space in EUR.
Based on the multicriteria evaluation, Alternative 2 was chosen as the acceptable design of the warehouse system, with a storage capacity of 720 pallet units. The system needs only two handling aisles. Loading and unloading processes are separated from each other, which means that there are no collisions between forklifts. The pallets with goods are handled on the FIFO (first in, first out) principle, which facilitates control of the shelf life of batches or series of products. The methodology is a suitable tool for decision-making in selecting and designing a warehouse system.
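A multicriteria evaluation of this kind can be sketched as a simple weighted-sum score over normalised criterion values. The weights and per-alternative scores below are illustrative assumptions only, not the paper's data; they merely show the mechanics of the comparison:

```python
# Hypothetical weighted-sum scoring of the two racking alternatives.
# Weights and normalised criterion scores (0..1) are assumed for
# illustration; the paper's actual evaluation data are not reproduced here.
criteria_weights = {
    "capacity": 0.30,       # pallet places in the system
    "storage_ratio": 0.20,  # share of box area used for storage
    "aisle_ratio": 0.20,    # share of box area lost to aisles (less is better)
    "pallet_access": 0.15,  # direct forklift access to individual pallets
    "cost": 0.15,           # investment cost per pallet space (less is better)
}

scores = {
    "Alt1_wide_aisle": {"capacity": 0.7, "storage_ratio": 0.6,
                        "aisle_ratio": 0.5, "pallet_access": 1.0, "cost": 0.8},
    "Alt2_flow_rack":  {"capacity": 1.0, "storage_ratio": 0.9,
                        "aisle_ratio": 0.9, "pallet_access": 0.4, "cost": 0.6},
}

def weighted_score(alt: str) -> float:
    """Sum of weight * normalised score across all criteria."""
    return sum(criteria_weights[c] * scores[alt][c] for c in criteria_weights)

best = max(scores, key=weighted_score)
```

With these assumed numbers the flow-rack alternative scores higher, matching the paper's conclusion; in practice the weights would be set by the decision-makers and the scores normalised from the measured indicators.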


2019 ◽  
Vol 35 (8) ◽  
pp. 879-915 ◽  
Author(s):  
Bona Lu ◽  
Yan Niu ◽  
Feiguo Chen ◽  
Nouman Ahmad ◽  
Wei Wang ◽  
...  

Abstract Gas-solid fluidization is intrinsically dynamic and manifests mesoscale structures spanning a wide range of length and time scales. When reactions are involved, more complex phenomena emerge, posing bigger challenges for modeling. As the mesoscale is critical to understanding multiphase reactive flows, which the conventional two-fluid model without mesoscale modeling may be inadequate to resolve even on extremely fine grids, this review attempts to demonstrate that the energy-minimization multiscale (EMMS) model can be a starting point for developing such mesoscale modeling. EMMS-based mesoscale modeling is then discussed, with emphasis on the formulation of drag coefficients for different fluidization regimes, modification of the mass-transfer coefficient, and other extensions aimed at resolving the emerging challenges. Applications to the development of novel fluid catalytic cracking and methanol-to-olefins processes show that mesoscale modeling plays a remarkable role in improving predictions of hydrodynamic behavior and overall reaction rate. However, the product content depends primarily on the chemical kinetic model itself, suggesting the need for an effective coupling between chemical kinetics and flow characteristics. Mesoscale modeling can be expected to accelerate the traditional experiment-based scale-up process at much lower cost in the future.
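As a hedged sketch of the generic form such drag corrections take in the EMMS literature (notation assumed here, not reproduced from this review), the homogeneous drag coefficient is scaled by a structure-dependent heterogeneity index:

```latex
\beta = H_d \, \beta_{0}, \qquad H_d = H_d(\varepsilon_g,\, Re) \le 1
```

where \(\beta_0\) is a homogeneous drag correlation (e.g., Wen–Yu), \(\varepsilon_g\) the local voidage, and \(H_d\) lumps the effect of mesoscale (cluster) structures, typically reducing the effective drag in heterogeneous fluidization regimes relative to the homogeneous prediction.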


2002 ◽  
Vol 11 (3) ◽  
pp. 096369350201100
Author(s):  
E.M. Gravel ◽  
T.D. Papathanasiou

Dual-porosity fibrous media are important in a number of applications, ranging from bioreactor design and transport in living systems to composites manufacturing. In the present study we are concerned with the development of predictive models for the hydraulic permeability (Kp) of various arrays of fibre bundles. For this we carry out extensive computations of viscous flow through arrays of fibre bundles using the Boundary Element Method (BEM) implemented on a multi-processor computer. Up to 350 individual filaments, arranged in square or hexagonal packing within bundles that are themselves arranged in square or hexagonal packing, are included in each simulation. These are simple but not trivial models of the fibrous preforms used in composites manufacturing: dual-porosity systems characterised by different inter- and intra-tow porosities. How these porosities affect the hydraulic permeability of such media is currently unknown and is elucidated through our simulations. Following numerical solution of the governing equations, Kp is calculated from the computed flowrate through Darcy's law and is expressed as a function of the inter- and intra-tow porosities (φi, φt) and of the filament radius (Rf). Numerical results are also compared to analytical models. The latter form the starting point in the development of a dimensionless correlation for the permeability of such dual-porosity media. It is found that the numerically computed permeabilities follow that correlation for a wide range of φi, φt and Rf.
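The extraction of permeability from a computed flowrate follows directly from Darcy's law, Kp = Q μ L / (A ΔP). A minimal sketch of that back-calculation, with purely illustrative values (not results from the paper):

```python
# Recovering hydraulic permeability from a computed flowrate via Darcy's law:
#   Q = Kp * A * dP / (mu * L)   =>   Kp = Q * mu * L / (A * dP)
# All quantities in SI units; Kp comes out in m^2.
def permeability(flowrate, viscosity, length, area, pressure_drop):
    """Darcy's law rearranged for permeability."""
    return flowrate * viscosity * length / (area * pressure_drop)

# Illustrative numbers only (not from the paper):
# Q = 1e-6 m^3/s, mu = 0.1 Pa.s, L = 0.01 m, A = 1e-4 m^2, dP = 1e4 Pa
Kp = permeability(flowrate=1e-6, viscosity=0.1, length=0.01,
                  area=1e-4, pressure_drop=1e4)
```

In a BEM computation of this kind, Q would be the flowrate integrated over the outflow boundary for an imposed pressure drop ΔP across the unit cell, so each simulation yields one Kp value for a given (φi, φt, Rf) combination.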


2015 ◽  
Vol 14 (4) ◽  
pp. 118-123 ◽  
Author(s):  
Lauren Trees

Purpose – The purpose of this paper is to present enterprise social networking and gamification as two potential tools to help organizations engage Millennial employees in collaboration and learning. Design/methodology/approach – The research provides general descriptions of enterprise social networking and gamification approaches, shares data on adoption of these approaches from APQC’s “2015 Knowledge Management Priorities Data Report” (based on a January 2015 survey of 524 knowledge management professionals) and includes four company examples adapted from APQC’s Connecting People to Content and Transferring and Applying Critical Knowledge best practices studies. The methodology for APQC’s best practices studies involves screening 50 or more organizations with potential best practices in a given research scope area and identifying five or six with proven best practices. APQC then conducts detailed site visits with the selected organizations and publishes case studies based on those site visits. Findings – Enterprise social networking platforms are in place at 50 per cent of organizations, with another 25 per cent planning to implement them by the end of 2015. By providing near-immediate access to information and answers, enterprise social networking helps Millennials learn the ropes at their new workplaces, gives them direct access to more knowledgeable colleagues who can assist and mentor them, and helps them improve their business outcomes by reusing knowledge and lessons learned across projects. Younger workers can also harness the power of social networking to create a sense of belonging and build their reputations in large, dispersed firms, where it is particularly difficult for them to gain visibility. A recent APQC survey indicates that 54 per cent of organizations either currently employ gamification to encourage collaboration or expect to implement it within the next three years. 
The rush to gamify the enterprise is, at least in part, a reflection of employers' desire to satisfy Millennials and make them feel connected to a community of co-workers. Although games appeal to a wide range of age groups, Millennials grew up with digital interaction and tend to prefer environments that emphasize teamwork, social learning and frequent feedback, all of which can be delivered through gamification. Originality/value – This paper introduces enterprise social networking and gamification platforms, and the relationship between them, to human resource (HR) professionals looking to increase engagement and retention rates for Millennial employees.

