Resilient Self-Reproducing Systems

Author(s):  
Amor A. Menezes ◽  
Pierre T. Kabamba

This paper is motivated by the need to minimize the payload mass required to establish an extraterrestrial robotic colony. One approach to this minimization is to deploy a colony consisting of individual robots capable of self-reproducing. An important consideration once such a colony is established is its resiliency to large-scale environmental or state variations. Previous approaches to learning and adaptation in self-reconfigurable robots have utilized reinforcement learning, cellular automata, and distributed control schemes to achieve robust handling of failure modes at the modular level. This work considers self-reconfigurability at the system level, where each constituent robot is endowed with a self-reproductive capacity. Rather than focus on individual dynamics, the hypothesis is that resiliency in a collective may be achieved if: 1) individual robots are free to explore all options in their decision space, including self-reproduction, and 2) they dwell preferentially on the most favorable options. Through simulations, we demonstrate that a colony operating in accordance with this hypothesis is able to adapt to changes in the external environment, respond rapidly to applied disturbances and disruptions to the internal system states, and operate in the presence of uncertainty.
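The two-part hypothesis above (explore every option, dwell preferentially on the most favorable ones) maps naturally onto Boltzmann (softmax) action selection. The sketch below is an illustrative reading, not the authors' published algorithm; the action names and fitness values are invented:

```python
import random
import math

def choose_action(values, temperature=1.0):
    """Boltzmann (softmax) selection: every option keeps a nonzero
    probability (free exploration of the decision space), but the
    most favorable options are dwelt on preferentially."""
    weights = [math.exp(v / temperature) for v in values]
    r = random.uniform(0.0, sum(weights))
    cumulative = 0.0
    for action, w in enumerate(weights):
        cumulative += w
        if r <= cumulative:
            return action
    return len(weights) - 1

# Toy decision space for one robot (hypothetical actions and values):
values = {"gather": 1.0, "repair": 0.2, "reproduce": 2.0}
actions = list(values)
counts = {a: 0 for a in actions}
random.seed(0)
for _ in range(10_000):
    counts[actions[choose_action([values[a] for a in actions])]] += 1
# "reproduce" has the highest value, so it dominates the counts,
# yet the less favorable options are still sampled occasionally.
```

Because no option's probability ever reaches zero, a colony of such agents can rediscover an action (e.g. self-reproduction after a disruption) whenever the environment makes it favorable again.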

Author(s):  
Andrew Reid ◽  
Julie Ballantyne

In an ideal world, assessment should be synonymous with effective learning and reflect the intricacies of the subject area. It should also be aligned with the ideals of education: to provide equitable opportunities for all students to achieve and to allow both appropriate differentiation for varied contexts and students and comparability across various contexts and students. This challenge is made more difficult in circumstances in which the contexts are highly heterogeneous, for example in the state of Queensland, Australia. Assessment in music challenges schooling systems in unique ways because teaching and learning in music are often naturally differentiated and diverse, yet assessment often calls for standardization. While each student and teacher has individual, evolving musical pathways in life, the syllabus and the system require consistency and uniformity. The challenge, then, is to provide diverse, equitable, and quality opportunities for all children to learn and achieve to the best of their abilities. This chapter discusses the designing and implementation of large-scale curriculum as experienced in secondary schools in Queensland, Australia. The experiences detailed explore the possibilities offered through externally moderated school-based assessment. Also discussed is the centrality of system-level clarity of purpose, principles and processes, and the provision of supportive networks and mechanisms to foster autonomy for a diverse range of music educators and contexts. Implications for education systems that desire diversity, equity, and quality are discussed, and the conclusion provokes further conceptualization and action on behalf of students, teachers, and the subject area of music.


Author(s):  
Miguel Ángel Hernández-Rodríguez ◽  
Ermengol Sempere-Verdú ◽  
Caterina Vicens-Caldentey ◽  
Francisca González-Rubio ◽  
Félix Miguel-García ◽  
...  

We aimed to identify and compare medication profiles in populations with polypharmacy between 2005 and 2015. We conducted a cross-sectional study using information from the Computerized Database for Pharmacoepidemiologic Studies in Primary Care (BIFAP, Spain). We estimated the prevalence of therapeutic subgroups in all individuals 15 years of age and older with polypharmacy (≥5 drugs during ≥6 months) using the Anatomical Therapeutic Chemical classification system at level 4, by sex and age group, for both calendar years. The most prescribed drugs were proton-pump inhibitors (PPIs), statins, antiplatelet agents, benzodiazepine derivatives, and angiotensin-converting enzyme inhibitors. The greatest increases between 2005 and 2015 were observed in PPIs, statins, other antidepressants, and β-blockers, while the prevalence of antiepileptics almost tripled. We observed increases in psychotropic drugs in women and in cardiovascular medications in men. By age group, there were notable increases in antipsychotics, antidepressants, and antiepileptics (15–44 years); antidepressants, PPIs, and selective β-blockers (45–64 years); selective β-blockers, biguanides, PPIs, and statins (65–79 years); and statins, selective β-blockers, and PPIs (80 years and older). Our results revealed important increases in the use of specific therapeutic subgroups, such as PPIs, statins, and psychotropic drugs, highlighting opportunities to design and implement strategies to assess the appropriateness of such prescriptions.
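The prevalence estimate described above (the share of polypharmacy patients exposed to each ATC level-4 subgroup) can be sketched in a few lines. The records and ATC codes below are illustrative placeholders, not BIFAP data:

```python
from collections import defaultdict

# Hypothetical prescription records (patient_id, ATC level-4 code):
records = [
    (1, "A02BC"), (1, "C10AA"),   # patient 1: PPI + statin
    (2, "A02BC"), (2, "N06AX"),   # patient 2: PPI + other antidepressant
    (3, "C10AA"), (3, "A02BC"),
    (4, "N06AX"),
]

patients_per_subgroup = defaultdict(set)
all_patients = set()
for pid, atc4 in records:
    patients_per_subgroup[atc4].add(pid)
    all_patients.add(pid)

# Prevalence = share of patients with >=1 drug in the subgroup;
# sets make repeat prescriptions of the same subgroup count once.
prevalence = {atc4: len(pids) / len(all_patients)
              for atc4, pids in patients_per_subgroup.items()}
```

In a real analysis the same computation would simply be stratified further by sex, age group, and calendar year.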


2021 ◽  
Vol 20 (1) ◽  
Author(s):  
Jingru Zhou ◽  
Yingping Zhuang ◽  
Jianye Xia

Abstract Background The genome-scale metabolic model (GSMM) is a powerful tool for studying cellular metabolic characteristics. With the development of multi-omics measurement techniques in recent years, new methods that integrate multi-omics data into the GSMM have shown promising effects on prediction. Such integration not only improves the accuracy of phenotype prediction but also enhances the reliability of the model for simulating complex biochemical phenomena, which can promote theoretical breakthroughs in identifying specific gene targets or better understanding cell metabolism at the system level. Results Based on the basic GSMM iHL1210 of Aspergillus niger, we integrated large-scale enzyme kinetics and proteomics data to establish a GSMM with enzymatic constraints, termed a GEM with Enzymatic Constraints using Kinetic and Omics data (GECKO). The results show that enzyme constraints effectively improve the model's phenotype prediction ability and extend its potential to guide target gene identification by predicting metabolic phenotype changes of A. niger under simulated gene knockouts. In addition, enzyme constraints significantly reduced the solution space of the model: flux variability was significantly reduced for over 40.10% of metabolic reactions. The new model also showed versatility in other respects, such as estimating large-scale $k_{cat}$ values and predicting the differential expression of enzymes under different growth conditions. Conclusions This study shows that incorporating enzyme abundance information into a GSMM is very effective for improving model performance for A. niger. The enzyme-constrained model can serve as a powerful tool for predicting the metabolic phenotype of A. niger by incorporating proteome data. In the foreseeable future, with the fast development of measurement techniques and more precise and rich quantitative proteomics data for A. niger, the enzyme-constrained GSMM will find broader application at the system level.
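The core of an enzyme-constrained model is the per-reaction flux ceiling v_i ≤ k_cat,i · [E_i]. The toy sketch below (made-up numbers, not the iHL1210/GECKO model, and a two-reaction network in place of a real LP solve) shows how such ceilings shrink the feasible solution space relative to an uptake-limited optimum:

```python
# GECKO-style enzyme constraint: each flux v_i is capped by kcat_i * E_i.
# All numbers are invented for illustration.
kcat = {"rxn1": 100.0, "rxn2": 40.0}    # turnover numbers, 1/s
enzyme = {"rxn1": 0.02, "rxn2": 0.05}   # measured abundances, mmol/gDW
uptake_limit = 5.0                       # substrate uptake bound

# Enzyme-imposed flux ceilings, v_i <= kcat_i * E_i:
vmax = {r: kcat[r] * enzyme[r] for r in kcat}   # rxn1: 2.0, rxn2: 2.0

# Maximize total product flux v1 + v2 subject to v1 + v2 <= uptake_limit
# and the ceilings; for this two-reaction toy the optimum is just the
# ceilings truncated by the uptake bound (a real GSMM needs an LP solver).
v1 = min(vmax["rxn1"], uptake_limit)
v2 = min(vmax["rxn2"], uptake_limit - v1)
objective = v1 + v2
# Without enzyme constraints the optimum would hit the full uptake
# limit (5.0); with them it is capped at 4.0, i.e. the solution
# space has genuinely shrunk.
```

In the full model the same principle is applied across thousands of reactions, which is what reduces flux variability in so many of them.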


2012 ◽  
Vol 446-449 ◽  
pp. 2554-2559 ◽  
Author(s):  
Jian Jun Cai ◽  
Feng Zhang ◽  
Wei Cui ◽  
Shou Shan Chen ◽  
Pu Lun Liu

In order to effectively assess concrete strength and deformation properties under a seawater erosion environment, the concrete stress-strain curve was investigated after 0, 10, 20, 30, 40, 50, and 60 wet-dry cycles using a large-scale static and dynamic stiffness servo test system. The stress-strain curves of concrete were tested at lateral pressures of 10.8 MPa, 14.4 MPa, and 18.8 MPa for the different numbers of dry-wet cycles, and the failure modes and surface cracking characteristics of the specimens at each cycle count are reported. The elastic modulus and compressive strength of the concrete were also examined. Based on concrete mechanics theory and the classic Kupfer-Gerstle strength criterion, nonlinear regression was performed on the multivariate data from a large number of test samples, and a biaxial concrete strength criterion was established that takes into account the stress ratio and the number of dry-wet cycles.
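The regression step can be illustrated with a one-variable degradation fit. The data below are synthetic placeholders, and the model is a linear stand-in for the paper's nonlinear Kupfer-Gerstle-type criterion, which also includes the lateral stress ratio:

```python
# Ordinary least squares for a degradation law f_c(N) = b0 + b1 * N,
# compressive strength versus number of dry-wet cycles N.
# Strength values are synthetic, not the paper's test results.
cycles = [0, 10, 20, 30, 40, 50, 60]
strength = [48.0, 46.5, 44.8, 43.1, 41.6, 39.9, 38.2]   # MPa

n = len(cycles)
mean_x = sum(cycles) / n
mean_y = sum(strength) / n
# Closed-form OLS slope and intercept (normal equations for one regressor):
b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(cycles, strength))
      / sum((x - mean_x) ** 2 for x in cycles))
b0 = mean_y - b1 * mean_x
# b1 < 0: strength decreases as erosion cycles accumulate.
```

The actual criterion adds the biaxial stress ratio as a second regressor and a nonlinear functional form, but the fitting principle (minimize squared residuals over the test data) is the same.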


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 598
Author(s):  
Jean-François Pratte ◽  
Frédéric Nolet ◽  
Samuel Parent ◽  
Frédéric Vachon ◽  
Nicolas Roy ◽  
...  

Analog and digital SiPMs have revolutionized the field of radiation instrumentation by replacing both avalanche photodiodes and photomultiplier tubes in many applications. However, multiple applications require greater performance than current SiPMs can deliver, for example timing resolution for time-of-flight positron emission tomography and time-of-flight computed tomography, and mitigation of the large output capacitance of SiPM arrays for large-scale time projection chambers in liquid argon and liquid xenon experiments. In this contribution, the case is made that 3D photon-to-digital converters, also known as 3D digital SiPMs, can outperform analog and 2D digital SiPMs. A review of 3D photon-to-digital converters is presented along with various applications where they can make a difference, such as time-of-flight medical imaging systems and low-background experiments in noble liquids. Finally, the key design choices that must be made to obtain an optimized 3D photon-to-digital converter for radiation instrumentation, specifically the single-photon avalanche diode array, the CMOS technology, the quenching circuit, the time-to-digital converter, the digital signal processing, and the system-level integration, are discussed in detail.


2018 ◽  
Vol 44 (5) ◽  
pp. 354-358 ◽  
Author(s):  
Amy Paul ◽  
Maria W Merritt ◽  
Jeremy Sugarman

Ethics guidance increasingly recognises that researchers and sponsors have obligations to consider provisions for post-trial access (PTA) to interventions that are found to be beneficial in research. Yet, there is little information regarding whether and how such plans can actually be implemented. Understanding practical experiences of developing and implementing these plans is critical to both optimising their implementation and informing conceptual work related to PTA. This viewpoint is informed by experiences with developing and implementing PTA plans for six large-scale multicentre HIV prevention trials supported by the HIV Prevention Trials Network. These experiences suggest that planning and implementing PTA often involve challenges of planning under uncertainty and confronting practical barriers to accessing healthcare systems. Even in relatively favourable circumstances where a tested intervention medication is approved and available in the local healthcare system, system-level barriers can threaten the viability of PTA plans. The aggregate experience across these HIV prevention trials suggests that simply referring participants to local healthcare systems for PTA will not necessarily result in continued access to beneficial interventions for trial participants. Serious commitments to PTA will require additional efforts to learn from future approaches, measuring the success of PTA plans with dedicated follow-up and further developing normative guidance to help research stakeholders navigate the complex practical challenges of realising PTA.


2021 ◽  
Author(s):  
Hyeyoung Koh ◽  
Hannah Beth Blum

This study presents a machine learning-based approach for sensitivity analysis to examine how parameters affect a given structural response while accounting for uncertainty. Reliability-based sensitivity analysis involves repeated evaluations of the performance function incorporating uncertainties to estimate the influence of a model parameter, which can lead to prohibitive computational costs. This challenge is exacerbated for large-scale engineering problems, which often carry a large number of uncertain parameters. The proposed approach is based on feature selection algorithms that rank feature importance and remove redundant predictors during model development, thereby improving model generality and training performance by focusing only on the significant features. The approach enables sensitivity analysis of structural systems by providing feature rankings at reduced computational effort. It is demonstrated on two designs of a two-bay, two-story planar steel frame with different failure modes: inelastic instability of a single member and progressive yielding. The feature variables in the data are uncertainties including material yield strength, Young's modulus, frame sway imperfection, and residual stress. The Monte Carlo sampling method is utilized to generate random realizations of the frames from published distributions of the feature parameters, and the response variable is the frame ultimate strength obtained from finite element analyses. Decision trees are trained to identify important features, and feature rankings are derived with four feature selection techniques: impurity-based importance, permutation importance, SHAP, and Spearman's correlation. The predictive performance of models built on the important features is assessed using the Matthews correlation coefficient, an evaluation metric suited to imbalanced datasets. Finally, the results are compared with those from reliability-based sensitivity analysis on the same example frames to validate the feature selection approach. As the proposed machine learning-based approach produces the same results as the reliability-based sensitivity analysis with improved computational efficiency and accuracy, it could be extended to other structural systems.
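Two of the ingredients above, a correlation-based feature ranking and the Matthews correlation coefficient, are simple enough to sketch in plain Python. The data are synthetic, and the tie-free Spearman formula below is a simplification, not the paper's full pipeline:

```python
import math

def rank(xs):
    """Ranks starting at 1; assumes no ties (enough for this sketch)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    for position, idx in enumerate(order):
        ranks[idx] = position + 1
    return ranks

def spearman(x, y):
    """Tie-free Spearman rank correlation via the 6*sum(d^2) formula."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def matthews_corrcoef(y_true, y_pred):
    """MCC for binary labels; a robust summary on imbalanced data."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

# Synthetic stand-ins: yield strength drives ultimate strength,
# sway imperfection barely does (all values invented).
fy   = [235, 250, 260, 275, 300, 320]    # yield strength, MPa
sway = [0.3, 0.1, 0.4, 0.2, 0.5, 0.15]   # sway imperfection (scaled)
Pu   = [410, 440, 455, 480, 520, 555]    # frame ultimate strength, kN
scores = {"fy": abs(spearman(fy, Pu)), "sway": abs(spearman(sway, Pu))}
ranking = sorted(scores, key=scores.get, reverse=True)
# ranking puts "fy" first: the strength-driving feature outranks the
# weakly related imperfection.
```

The impurity-based, permutation, and SHAP rankings require a trained model (e.g. scikit-learn decision trees), but they are consumed the same way: as a sorted importance list that decides which uncertain parameters deserve a full reliability analysis.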


2019 ◽  
Author(s):  
Alvin Vista

Cheating detection is an important issue in standardized testing, especially in large-scale settings. Statistical approaches are often computationally intensive and require specialised software to conduct. We present a two-stage approach that quickly filters suspected groups using statistical testing on an IRT-based answer-copying index. We also present an approach to mitigate data contamination and improve the performance of the index. The computation of the index was implemented through a modified version of an open source R package, thus enabling wider access to the method. Using data from PIRLS 2011 (N=64,232) we conduct a simulation to demonstrate our approach. Type I error was well-controlled and no control group was falsely flagged for cheating, while 16 (combined n=12,569) of the 18 (combined n=14,149) simulated groups were detected. Implications for system-level cheating detection and further improvements of the approach are discussed.
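The first-stage filter can be caricatured as a one-sided proportion test on group-level answer-match rates. This is a simplified stand-in for the paper's IRT-based answer-copying index, with made-up numbers rather than PIRLS data:

```python
import math

def flag_groups(group_match_rates, baseline_rate, n_pairs, z_crit=3.09):
    """Stage-1 screen: flag groups whose mean pairwise answer-match
    rate is improbably high under a normal approximation to the
    honest baseline (one-sided test; z_crit = 3.09 ~ alpha = 0.001).
    A stand-in for the IRT index, which models each item's
    copy probability instead of a single pooled rate."""
    se = math.sqrt(baseline_rate * (1 - baseline_rate) / n_pairs)
    return [g for g, rate in sorted(group_match_rates.items())
            if (rate - baseline_rate) / se > z_crit]

# Synthetic group statistics (invented numbers):
baseline = 0.40                               # honest pairwise match rate
groups = {"A": 0.41, "B": 0.62, "C": 0.39}    # observed group means
flagged = flag_groups(groups, baseline, n_pairs=200)
# Only group B clears the threshold; groups surviving this cheap
# filter would then receive the full answer-copying index in stage 2.
```

Running the expensive index only on pre-filtered groups is what makes the two-stage design tractable at the scale of a national or international assessment.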

