Nothing in evolution makes sense except in the light of parasitism: evolution of complex replication strategies

2021 · Vol 8 (8) · pp. 210441
Author(s): Simon J. Hickinbotham, Susan Stepney, Paulien Hogeweg

Parasitism emerges readily in models and laboratory experiments of the RNA world and would lead to extinction unless prevented by compartmentalization or spatial patterning. Modelling replication as an active computational process opens up many degrees of freedom that are exploited to meet environmental challenges, and to modify the evolutionary process itself. Here, we use automata chemistry models and spatial RNA-world models to study the emergence of parasitism and the complexity that evolves in response. The system is initialized with a hand-designed replicator that copies other replicators with a small chance of point mutation. Almost immediately, short parasites arise; these are copied more quickly, and so have an evolutionary advantage. The replicators also become shorter, and so are replicated faster; they evolve a mechanism to slow down replication, which reduces the difference in replication rate between replicators and parasites. They also evolve explicit mechanisms to discriminate copies of self from parasites; these mechanisms become increasingly complex. New parasite species continually arise from mutated replicators, rather than from evolving parasite lineages. Evolution itself evolves, e.g. by effectively increasing point mutation rates and by generating novel emergent mutational operators. Thus, parasitism drives the evolution of complex replicators and complex ecosystems.

2021
Author(s): Simon J. Hickinbotham, Susan Stepney, Paulien Hogeweg

Abstract: The emergence of parasites in evolving replicating systems appears to be inevitable. Parasites emerge readily in models and laboratory experiments of the hypothesised earliest replicating systems: the RNA world. Phylogenetic reconstructions also suggest very early evolution of viruses and other parasitic mobile genetic elements in our biosphere. The evolution of such parasites would lead to extinction unless prevented by compartmentalisation or spatial pattern formation, and the emergence of multilevel selection. Today, and apparently since the earliest times, many intricate defence and counter-defence strategies have evolved. Here we bring together for the first time automata chemistry models and spatial RNA world models, to study the emergence of parasites and the complexity that evolves to cope with them. Our system is initialised with a hand-designed program string that copies other program strings one character at a time, with a small chance of point mutation. Almost immediately, short parasites arise; these are copied more quickly, and so have an evolutionary advantage. Spatial pattern formation, in the form of chaotic waves of replicators followed by parasites, can prevent extinction. The replicators also become shorter, and so are replicated faster. They evolve a mechanism to slow down replication, which reduces the difference in replication rate between replicators and parasites. They also evolve explicit mechanisms to discriminate copies of self from parasites; these mechanisms become increasingly complex. Replicators speciate into lineages and can become longer, despite the fitness cost that entails. We do not see a classical co-evolutionary arms race between a replicator lineage and a parasite lineage: instead, new parasite species continually arise from mutated replicators, rather than from evolving parasite lineages.
Finally, we note that evolution itself evolves, for example by effectively increasing point mutation rates and by generating novel emergent mutational operators. The inevitable emergence of parasites in replicator systems drives the evolution of complex replicators and complex ecosystems with high population density. Even in the absence of parasites, the evolved replicators outperform the initial replicator and the early short replicators. Modelling replication as an active computational process opens up many degrees of freedom that are exploited not only to meet environmental challenges, but also to modify the evolutionary process itself.
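The copy-time advantage of short parasites described above can be illustrated with a toy simulation. This is not the authors' automata chemistry: the string lengths, copy-rate rule, carrying capacity, and the omission of mutation are all illustrative assumptions. Replicators copy a randomly chosen template, and shorter templates finish copying sooner, so parasites are enriched over time.

```python
import random

random.seed(0)

REPLICATOR = "R" * 40   # long functional replicator (hypothetical length)
PARASITE   = "P" * 10   # short parasite: copied faster, cannot copy others

def step(pop, capacity=200):
    """One copy event: a replicator copies a randomly chosen template.
    The per-event copy probability is inversely proportional to template
    length, so short parasites enjoy a replication-rate advantage.
    (Mutation is omitted for brevity.)"""
    copiers = [s for s in pop if s[0] == "R"]
    if not copiers or len(pop) >= capacity:
        return
    template = random.choice(pop)
    if random.random() < 10 / len(template):
        pop.append(template)

pop = [REPLICATOR] * 20 + [PARASITE] * 2
for _ in range(2000):
    step(pop)

frac_parasite = sum(s[0] == "P" for s in pop) / len(pop)
print(f"parasite fraction at capacity: {frac_parasite:.2f}")
```

Starting from under 10% of the population, the parasites' roughly fourfold per-event advantage makes them the majority by the time the capacity is reached, mirroring the "short parasites arise almost immediately" observation.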


2019
Author(s): Riccardo Spezia, Hichem Dammak

In the present work we have investigated the possibility of using the Quantum Thermal Bath (QTB) method in molecular simulations of unimolecular dissociation processes. Notably, QTB aims to introduce quantum nuclear effects at a computational cost that is essentially the same as in Newtonian simulations. To this end we have considered the model fragmentation of CH4, for which an analytical function is available in the literature. Moreover, based on the same model, a microcanonical algorithm that monitors the zero-point energy of products, and eventually modifies trajectories, was recently proposed. We have thus compared classical and quantum rate constants with these different models. QTB seems to correctly reproduce some quantum features, in particular the difference between classical and quantum activation energies, making it a promising method to study the unimolecular fragmentation of more complex systems with molecular simulations. The role of the QTB thermostat on rotational degrees of freedom is also analyzed and discussed.


Author(s): Massimiliano Di Ventra

This chapter expands on the previous one's discussion of the role of experiments in science. It explains the difference between observations of phenomena and controlled laboratory experiments.


2015 · Vol 35 (4) · pp. 341-347
Author(s): E. Rouhani, M. J. Nategh

Purpose – The purpose of this paper is to study the workspace and dexterity of a microhexapod, a 6-degrees-of-freedom (DOF) parallel compliant manipulator, and to investigate its dimensional synthesis so as to maximize the workspace and the global dexterity index at the same time. Design/methodology/approach – Microassembly is essential in current industry for manufacturing complicated structures. Most micromanipulators suffer from a restricted workspace compared to conventional manipulators because they use flexure joints. In addition, the controllability of micromanipulators throughout the whole workspace is vital. Thus, it is very important to select the design parameters in a way that maximizes not only the workspace but also the global dexterity index. Findings – It has been shown that the proposed procedure for the workspace calculation can considerably speed up the required calculations. The optimization results show that a converged-diverged configuration of pods and an increase in the difference between the moving and the stationary platforms' radii cause the global dexterity index to increase and the workspace to decrease. Originality/value – The proposed algorithm for the workspace analysis is very important, especially when it is an objective function of an optimization problem based on a search method. In addition, screw theory makes construction of the homogeneous Jacobian matrix simple. The proposed methodology can be used for any other micromanipulator.
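A global dexterity index of the kind maximized here is commonly computed as a workspace average of the Jacobian's inverse condition number. A minimal sketch, with randomly perturbed Jacobians standing in for poses sampled from an actual workspace (the matrices are illustrative, not the microhexapod's Jacobians):

```python
import numpy as np

def local_dexterity(J):
    """Inverse condition number of the manipulator Jacobian: 1 means an
    isotropic (best-conditioned) pose, values near 0 a near-singular pose."""
    s = np.linalg.svd(J, compute_uv=False)
    return s.min() / s.max()

# hypothetical 6x6 Jacobians at 100 sampled workspace poses
rng = np.random.default_rng(0)
samples = [np.eye(6) + 0.1 * rng.normal(size=(6, 6)) for _ in range(100)]

# global dexterity index: average local dexterity over the sampled workspace
gdi = np.mean([local_dexterity(J) for J in samples])
print(f"global dexterity index: {gdi:.3f}")
```

In a dimensional synthesis, this average would be re-evaluated for each candidate set of design parameters alongside the workspace volume.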


1989 · Vol 67 (8) · pp. 2078-2080
Author(s): Robert Poulin, Gerard J. FitzGerald

Females of the ectoparasitic crustacean Argulus canadensis must leave their fish hosts at least temporarily to deposit their eggs on the substrate. To test the hypothesis that this difference in reproductive behaviour between the two sexes could result in male-biased sex ratios on their stickleback hosts, we sampled sticklebacks in tide pools of a Quebec salt marsh from early July to early September 1986. During this period, fish harboured significantly more male than female A. canadensis. Laboratory experiments were done to test two alternative hypotheses offered to explain this biased sex ratio. The first hypothesis was that male A. canadensis were more successful than females in attacking their stickleback hosts; however, we found no difference in attack success between the two parasite sexes. The second hypothesis was that sticklebacks ate more female than male A. canadensis. Although males tended to be less vulnerable to fish predation than females, the difference was not significant. We conclude that sexual differences in reproductive behaviour, i.e., the egg deposition behaviour of females, can account for the male-biased sex ratio of A. canadensis on sticklebacks.


Actuators · 2021 · Vol 10 (1) · pp. 9
Author(s): Taehoon Lee, Inwoo Kim, Yoon Su Baek

Lower limb exoskeleton robots assist walking by applying mechanical force in response to the wearer's identified walking intention. When the exoskeleton robot is lightweight and comfortable to wear, walking stability increases and energy can be used efficiently. However, because the complex anatomical movements of the human body are difficult to implement, most designs are kept simple. As a result, misalignment between human and robot movement makes the wearer uncomfortable and reduces walking stability. In this paper, we developed a two-degrees-of-freedom (2DoF) ankle exoskeleton robot with a subtalar joint and a talocrural joint, applying a four-bar linkage to realize the anatomical movement that the simple 1DoF structure mainly used for ankles cannot. However, bidirectional tendon-driven actuators (BTDAs) do not consider the difference in length change between the two cables during dorsiflexion (DF) and plantar flexion (PF) while walking, causing misalignment. To solve this problem, a BTDA was developed that considers the length change of both cables. Cable-driven actuators and exoskeleton robot systems introduce uncertainty. Accordingly, adaptive control was performed with a proportional-integral-derivative neural network (PIDNN) controller to minimize system uncertainty.
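A PIDNN combines PID-structured neurons with online weight adaptation. The following is a minimal sketch in that spirit, not the authors' controller: a PID law whose three gains (the P, I, D "neurons") are adapted by a gradient step on the squared tracking error, driving an illustrative first-order plant. The initial gains, learning rate, and plant are all assumptions.

```python
class AdaptivePID:
    """PID controller whose gains are adapted online by a gradient step
    on the squared tracking error (PIDNN-style single-layer adaptation)."""
    def __init__(self, kp=0.5, ki=0.1, kd=0.05, lr=1e-4):
        self.k = [kp, ki, kd]
        self.lr = lr
        self.integ = 0.0
        self.prev_e = None

    def update(self, e, dt):
        self.integ += e * dt
        de = 0.0 if self.prev_e is None else (e - self.prev_e) / dt
        self.prev_e = e
        feats = [e, self.integ, de]            # P, I, D neuron inputs
        u = sum(k * f for k, f in zip(self.k, feats))
        # gradient step on the squared error, assuming positive plant gain
        self.k = [k + self.lr * e * f for k, f in zip(self.k, feats)]
        return u

# illustrative first-order plant x' = -x + u tracking a setpoint of 1.0
ctrl, x, dt = AdaptivePID(), 0.0, 0.01
for _ in range(5000):
    e = 1.0 - x
    x += (-x + ctrl.update(e, dt)) * dt
```

The integral neuron removes steady-state error while the adaptation law gradually raises the gains as long as the error persists; a real PIDNN would adapt a full layer of weights rather than three scalar gains.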


Author(s): Abigail Niesen, Anna L Garverick, Maury Hull

Abstract: Maximum total point motion (MTPM), the migration of the point on a baseplate that moves the most, has been used to assess the risk of tibial baseplate loosening in radiostereometric analysis (RSA). Two methods for determining MTPM for model-based RSA are to use either 5 points distributed around the perimeter of the baseplate or all points on the 3D model. The objectives were to quantify the mean difference in MTPM using 5 points vs. all points, to compute the percent error relative to the 6-month stability limit for groups of patients, and to determine whether differences in MTPM depend on baseplate size and shape. A dataset of 10,000 migration values was generated using the mean and standard deviation of migration in six degrees of freedom at 6 months from an RSA study. The dataset was used to simulate migration of 3D models (two baseplate shapes and two baseplate sizes) and to calculate both the difference in MTPM using 5 virtual points vs. all points and the percent error (i.e. difference in MTPM/stability limit) relative to the 6-month stability limit. The difference in MTPM was about 0.02 mm, or about 4% relative to the 6-month stability limit, which is not clinically important. Furthermore, the results were not affected by baseplate shape or size. Researchers can therefore decide whether to use 5 points or all points when computing MTPM for model-based RSA. The authors recommend using 5 points, to maintain consistency with marker-based RSA.
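The 5-points-vs-all-points comparison can be sketched as follows, assuming a hypothetical circular baseplate and an illustrative 6-DOF migration (three translations plus three small rotations); the geometry and migration values are not the study's data. Note that the 5-point estimate can never exceed the all-points value, since it maximizes over a subset.

```python
import numpy as np

def rigid_displacements(points, translation, rotation_deg):
    """Displacement magnitude of each point under a small rigid-body
    migration described by six degrees of freedom."""
    rx, ry, rz = np.deg2rad(rotation_deg)
    Rx = np.array([[1, 0, 0], [0, np.cos(rx), -np.sin(rx)], [0, np.sin(rx), np.cos(rx)]])
    Ry = np.array([[np.cos(ry), 0, np.sin(ry)], [0, 1, 0], [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0], [np.sin(rz), np.cos(rz), 0], [0, 0, 1]])
    moved = points @ (Rz @ Ry @ Rx).T + np.asarray(translation)
    return np.linalg.norm(moved - points, axis=1)

def mtpm(points, translation, rotation_deg):
    """Maximum total point motion: the largest displacement over the points."""
    return rigid_displacements(points, translation, rotation_deg).max()

# hypothetical circular baseplate rim, radius 25 mm, sampled densely
theta = np.linspace(0, 2 * np.pi, 720, endpoint=False)
all_pts = np.c_[25 * np.cos(theta), 25 * np.sin(theta), np.zeros_like(theta)]
five_pts = all_pts[::144]           # 5 points spread around the perimeter

t = (0.05, -0.03, 0.10)             # mm, illustrative migration
r = (0.10, 0.05, -0.08)             # degrees, illustrative rotations

m_all, m_five = mtpm(all_pts, t, r), mtpm(five_pts, t, r)
print(f"all points: {m_all:.4f} mm, 5 points: {m_five:.4f} mm")
```

The study's finding is that for realistic migrations this subset underestimate is on the order of 0.02 mm, i.e. negligible against the stability limit.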


Author(s): Vincent Delos, Santiago Arroyave-Tobón, Denis Teissandier

In mechanical design, tolerance zones and contact gaps can be represented by sets of geometric constraints. To compute the accumulation of possible manufacturing defects, these sets have to be summed and/or intersected according to the assembly architecture. The advantage of this approach is its robustness in treating even over-constrained mechanisms, i.e. mechanisms in which some degrees of freedom are suppressed in a redundant way. However, the sum of constraints, which must be computed when simulating the accumulation of defects in serial joints, is a very time-consuming operation. In this work, we compare three methods for summing sets of constraints using polyhedral objects. The difference between them lies in the way the degrees of freedom (DOFs) (or invariance) of joints and features are treated. The first method virtually limits the DOFs of the toleranced features and joints to turn the polyhedra into polytopes and avoid manipulating unbounded objects. Even though this approach makes the sums computable, it also introduces bounding or cap facets which increase the complexity of the operand sets. This complexity grows after each operation until it becomes far too significant. The second method addresses this problem by cleaning the calculated polytope after each sum, to keep the effects of the propagation of the DOFs under control. The third method is new and is based on identifying the sub-space in which the projections of the operands are bounded sets. Calculating the sum in this sub-space significantly reduces the complexity of the operands and consequently the computational time. After presenting the geometric properties on which the approaches rely, we demonstrate them on an industrial case. We then compare the computation times and verify that all three methods give equal results.
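The sum of constraint sets referred to here is a Minkowski sum. On vertex representations it can be sketched as the convex hull of all pairwise vertex sums; the example below uses two illustrative axis-aligned deviation boxes, not an industrial tolerance model.

```python
import numpy as np

def minkowski_sum_vertices(P, Q):
    """All pairwise vertex sums of two polytopes given by vertex arrays;
    the Minkowski sum polytope is the convex hull of these candidates."""
    return np.array([p + q for p in P for q in Q])

# two illustrative deviation boxes (e.g. bounds on two small displacements
# of toleranced features in series), as 2D vertex arrays
box = np.array([[-1, -1], [1, -1], [1, 1], [-1, 1]], float)
P, Q = 0.02 * box, 0.01 * box

S = minkowski_sum_vertices(P, Q)
print("summed box half-width:", S[:, 0].max())
```

For axis-aligned boxes the sum is simply the box of summed half-widths; the cost of the general operation, and of pruning the redundant interior candidates, is what the three methods in the paper trade off.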


Author(s): Hanqing Lu, Xinwen Hou, Cheng-Lin Liu, Xiaolin Chen

Insect recognition is a hard problem because the differences in appearance between insect species are so small that only expert entomologists can distinguish them. In addition, insects are often composed of several parts (multiple views), which introduces more degrees of freedom. This chapter proposes several discriminative coding approaches and one decision fusion scheme over heterogeneous class sets for insect recognition. The three discriminative coding methods use class-specific concatenated vectors instead of traditional global coding vectors for insect image patches. The decision fusion scheme uses an allocation matrix for classifier selection and a weight matrix for classifier fusion, which makes it suitable for combining classifiers with heterogeneous class sets in multi-view insect image recognition. Experimental results on a Tephritidae dataset show that the three proposed discriminative coding methods perform well in insect recognition, and that the proposed fusion scheme improves recognition accuracy significantly.
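The allocation-matrix idea can be sketched as follows (the classes, scores, and uniform weights are illustrative placeholders, not the chapter's data): each classifier votes only on the classes it was trained on, and fused scores are averaged over the classifiers allocated to each class.

```python
import numpy as np

# two view-specific classifiers trained on different, overlapping class sets
classes = ["A", "B", "C", "D"]
clf_classes = [["A", "B", "C"], ["B", "C", "D"]]
scores = [np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.3, 0.6])]

# allocation matrix: which classifier is allowed to vote on which class
alloc = np.array([[c in cs for c in classes] for cs in clf_classes], float)
# weight matrix: per-classifier, per-class fusion weights (uniform here)
weights = np.ones_like(alloc)

fused = np.zeros(len(classes))
for a_row, w_row, cs, s in zip(alloc, weights, clf_classes, scores):
    for cls, score in zip(cs, s):
        j = classes.index(cls)
        fused[j] += a_row[j] * w_row[j] * score
fused /= alloc.sum(axis=0)   # average over classifiers allocated to each class

pred = classes[int(fused.argmax())]
print("fused scores:", fused, "->", pred)
```

Classes covered by only one classifier keep that classifier's score unchanged, while shared classes are averaged; learned (non-uniform) weight matrices would let more reliable classifiers dominate.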


2019 · Vol 11 (7) · pp. 746
Author(s): Feng Xu, David Diner, Oleg Dubovik, Yoav Schechner

Aerosol retrieval algorithms used in conjunction with remote sensing are subject to ill-posedness. To mitigate non-uniqueness, extra constraints (in addition to observations) are valuable for stabilizing the inversion process. This paper focuses on the imposition of an empirical correlation constraint on the retrieved aerosol parameters. This constraint reflects the empirical dependency between different aerosol parameters, thereby reducing the number of degrees of freedom and enabling accelerated computation of the radiation fields associated with neighboring pixels. A cross-pixel constraint that capitalizes on the smooth spatial variations of aerosol properties was built into the original multi-pixel inversion approach. Here, the spatial smoothness condition is imposed on principal components (PCs) of the aerosol model, and on the corresponding PC weights, where the PCs are used to characterize departures from the mean. Mutual orthogonality and unit length of the PC vectors, as well as zero sum of the PC weights also impose stabilizing constraints on the retrieval. Capitalizing on the dependencies among aerosol parameters and the mutual orthogonality of PCs, a perturbation-based radiative transfer computation scheme is developed. It uses a few dominant PCs to capture the difference in the radiation fields across an imaged area. The approach is tested using 27 observations acquired by the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) during multiple NASA field campaigns and validated using collocated AERONET observations. In particular, aerosol optical depth, single scattering albedo, aerosol size, and refractive index are compared with AERONET aerosol reference data. Retrieval uncertainty is formulated by accounting for both instrumental errors and the effects of multiple types of constraints.
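The PC-based constraints can be sketched as follows (the data are synthetic placeholders, not AirMSPI retrievals): departures from the mean aerosol state are expressed in a few dominant principal components, which are orthonormal by construction and whose per-pixel weights sum to approximately zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical per-pixel aerosol parameter vectors (pixels x parameters),
# standing in for optical depth, single scattering albedo, size, etc.
states = 1.0 + 0.1 * rng.normal(size=(50, 8)) @ rng.normal(size=(8, 8))

mean = states.mean(axis=0)
departures = states - mean          # the PCs characterize these departures

# dominant principal components via SVD of the departure matrix
U, s, Vt = np.linalg.svd(departures, full_matrices=False)
k = 3
pcs = Vt[:k]                        # mutually orthogonal, unit-length PCs
weights = departures @ pcs.T        # per-pixel PC weights

# the stabilizing constraints noted in the abstract:
ortho_err = np.abs(pcs @ pcs.T - np.eye(k)).max()   # orthonormality of PCs
weight_sum = np.abs(weights.sum(axis=0)).max()      # weights sum to ~0

approx = mean + weights @ pcs       # low-dimensional reconstruction
```

Keeping only a few dominant PCs is what enables the perturbation-based radiative transfer scheme: the radiation field across the imaged area is expanded around the mean state in those few directions instead of per-pixel.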

