The Concept of Cooperative Dynamics in Simulations of Soft Matter

2020 ◽  
Vol 8 ◽  
Author(s):  
Piotr Polanowski ◽  
Andrzej Sikorski

In this review we compiled recent advances concerning cooperative motion in crowded soft matter systems. We tried to answer the question of how to perform dynamic Monte Carlo simulations of dense macromolecular systems effectively. This problem is not simple because movement in such systems is strictly correlated, which leads to cooperative phenomena. The influence of crowding is especially interesting in two-dimensional cases, e.g., in membranes, where the presence of macromolecules, proteins, and the cytoskeleton often changes the mean-square displacement as a function of the lag time and anomalous diffusion appears. Simple models are frequently used to shed light on molecular transport in biological systems. Emphasis was given to the Dynamic Lattice Liquid model, which became the basis for a parallel algorithm that takes into account coincidences of elementary molecular motion attempts resulting in local cooperative structural transformations. Particular attention is paid to the influence of the model of molecular transport on diffusion. A comparison to alternative approaches such as the single-agent model was also carried out.
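As a hedged illustration of the quantities discussed above, the following sketch estimates the mean-square displacement as a function of lag time and the anomalous diffusion exponent from simulated trajectories; the array layout and function names are illustrative and not taken from the Dynamic Lattice Liquid implementation.

```python
import numpy as np

def msd(trajectories, max_lag):
    """Time-averaged mean-square displacement <r^2(t)> for an
    ensemble of 2D trajectories, shape (n_particles, n_steps, 2)."""
    lags = np.arange(1, max_lag + 1)
    out = np.empty(len(lags))
    for i, lag in enumerate(lags):
        disp = trajectories[:, lag:, :] - trajectories[:, :-lag, :]
        out[i] = np.mean(np.sum(disp**2, axis=-1))
    return lags, out

# Anomalous diffusion: <r^2(t)> ~ t^alpha, with alpha < 1 for subdiffusion.
# Fit alpha as the slope of log(MSD) versus log(lag).
rng = np.random.default_rng(0)
traj = np.cumsum(rng.normal(size=(100, 1000, 2)), axis=1)  # free random walks
lags, m = msd(traj, 100)
alpha = np.polyfit(np.log(lags), np.log(m), 1)[0]  # ~1 for normal diffusion
print(f"estimated alpha = {alpha:.2f}")
```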

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Nikita P. Kryuchkov ◽  
Nikita A. Dmitryuk ◽  
Wei Li ◽  
Pavel V. Ovcharov ◽  
Yilong Han ◽  
...  

Abstract: Melting is one of the most studied phase transitions, important for atomic, molecular, colloidal, and protein systems. However, there is currently no microscopic, experimentally accessible criterion that can reliably track a system's evolution across the transition while providing insights into melting nucleation and melting front evolution. To address this, we developed a theoretical mean-field framework with the normalised mean-square displacement between particles in neighbouring Voronoi cells serving as the local order parameter, measurable experimentally. We tested the framework in a number of colloidal and in silico particle-resolved experiments on systems with significantly different (Brownian and Newtonian) dynamic regimes and found that it provides an excellent description of system evolution across the melting point. This new approach suggests a broad scope for application in diverse areas of science, from materials through to biology and beyond. Consequently, the results of this work provide new guidance for the nucleation theory of melting and are of broad interest in condensed matter, chemical physics, physical chemistry, materials science, and soft matter.
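One plausible reading of such a local order parameter can be sketched as follows: Voronoi neighbours are obtained from the dual Delaunay triangulation, and the mean-square relative displacement of neighbouring particles over the trajectory is normalised by the squared interparticle spacing. The normalisation and windowing choices here are illustrative assumptions, not the authors' exact definition.

```python
import numpy as np
from scipy.spatial import Delaunay

def neighbor_pairs(points):
    """Voronoi-neighbour pairs from the dual Delaunay triangulation (2D)."""
    tri = Delaunay(points)
    pairs = set()
    for simplex in tri.simplices:
        for i in range(3):
            for j in range(i + 1, 3):
                pairs.add(tuple(sorted((simplex[i], simplex[j]))))
    return np.array(sorted(pairs))

def local_order(frames, mean_spacing):
    """Per-particle order parameter: mean-square relative displacement of
    Voronoi neighbours over the trajectory, normalised by the squared mean
    interparticle spacing (illustrative normalisation).
    frames: array of shape (n_frames, n_particles, 2)."""
    pairs = neighbor_pairs(frames[0])
    rel = frames[:, pairs[:, 0]] - frames[:, pairs[:, 1]]  # relative vectors
    drel = rel - rel[0]                                    # relative displacement
    msd_pair = np.mean(np.sum(drel**2, axis=-1), axis=0)   # per neighbour pair
    order = np.zeros(frames.shape[1])
    counts = np.zeros(frames.shape[1])
    np.add.at(order, pairs[:, 0], msd_pair)                # accumulate onto
    np.add.at(order, pairs[:, 1], msd_pair)                # both particles
    np.add.at(counts, pairs[:, 0], 1)
    np.add.at(counts, pairs[:, 1], 1)
    return order / (counts * mean_spacing**2)
```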


2021 ◽  
Author(s):  
Adith S Arun ◽  
Sung-Cheol Kim ◽  
Mehmet Eren Ahsen ◽  
Gustavo A Stolovitzky

Identifying and characterizing the effect of combination cancer therapies is of paramount importance in cancer research. The benefit of a combination can be due to inherent heterogeneity in patient populations, to molecular synergy between the compounds given in combination (usually studied in cell culture), or both. To shed light on combinations and help characterize their enhanced benefits over single therapies, we introduce Correlated Drug Action (CDA) as a baseline additivity model. We formulate the CDA model as a closed-form expression, which makes it scalable and interpretable, both in the temporal domain (tCDA), to explain survival curves, and in the dose domain (dCDA), to explain dose-response curves. CDA can be used in clinical trials and cell culture experiments. At the level of clinical trials, we demonstrate tCDA's utility in explaining the benefit of clinical combinations, identifying non-additive combinations, and finding cases where biomarkers may be able to decouple the combination into monotherapies. At the level of cells in culture, dCDA naturally embodies null models such as Bliss additivity and the Highest Single Agent model as special cases, and can be extended to be sham-combination compliant. We demonstrate the applicability of dCDA in assessing non-additive combinations and doses. Additionally, we introduce a new synergy metric, Excess over CDA (EOCDA), that incorporates elements of Bliss additivity and dose-equivalence concepts in the same measure. CDA is a novel general framework for additivity at the cell line and patient population levels and provides a method to characterize and quantify the action of drug combinations.
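Since dCDA recovers classical null models as special cases, the following minimal sketch shows the two named above, Bliss additivity and Highest Single Agent, together with an excess-over-model synergy score on a 0-1 effect scale; the CDA closed form itself is given in the paper and not reproduced here.

```python
import numpy as np

def bliss(e_a, e_b):
    """Bliss additivity: expected fractional effect of a combination of
    independently acting drugs, with effects on a 0-1 scale."""
    return e_a + e_b - e_a * e_b

def hsa(e_a, e_b):
    """Highest Single Agent: the combination is expected to do no better
    than the more effective monotherapy."""
    return np.maximum(e_a, e_b)

# Excess over a null model quantifies synergy (positive) or antagonism
# (negative); EOCDA in the paper plays this role for the dCDA null model.
e_a, e_b, e_obs = 0.40, 0.30, 0.65
print("excess over Bliss:", e_obs - bliss(e_a, e_b))  # 0.65 - 0.58 = 0.07
print("excess over HSA:  ", e_obs - hsa(e_a, e_b))    # 0.65 - 0.40 = 0.25
```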


2021 ◽  
Vol 35 (3) ◽  
pp. 175-192
Author(s):  
Maximilian Kasy

A key challenge for interpreting published empirical research is the fact that published findings might be selected by researchers or by journals. Selection might be based on criteria such as significance, consistency with theory, or the surprisingness of findings or their plausibility. Selection leads to biased estimates, reduced coverage of confidence intervals, and distorted posterior beliefs. I review methods for detecting and quantifying selection based on the distribution of p-values, systematic replication studies, and meta-studies. I then discuss the conflicting recommendations regarding selection resulting from alternative objectives, in particular, the validity of inference versus the relevance of findings for decision-makers. Based on this discussion, I consider various reform proposals, such as deemphasizing significance, pre-analysis plans, journals for null results and replication studies, and a functionally differentiated publication system. In conclusion, I argue that we need alternative foundations of statistics that go beyond the single-agent model of decision theory.
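One simple example of a p-value-based diagnostic of selection is a caliper test: compare the number of published p-values just below a significance threshold with the number just above it, since under no selection the split should be roughly even. The threshold and window below are illustrative choices, not parameters from the review.

```python
from scipy.stats import binomtest

def caliper_test(p_values, threshold=0.05, window=0.01):
    """Compare counts of published p-values just below vs just above a
    significance threshold; under no selection the split should be ~50/50."""
    below = sum(threshold - window <= p < threshold for p in p_values)
    above = sum(threshold <= p < threshold + window for p in p_values)
    return binomtest(below, below + above, 0.5)

# Example: 30 published p-values just under .05 against 10 just over it.
result = caliper_test([0.045] * 30 + [0.055] * 10)
print(result.pvalue)  # a small p-value suggests selection at the threshold
```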


Polymers ◽  
2021 ◽  
Vol 14 (1) ◽  
pp. 57
Author(s):  
Cristina Pérez-Fernández ◽  
Pilar Valles ◽  
Elena González-Toril ◽  
Eva Mateo-Martí ◽  
José Luis de la Fuente ◽  
...  

A systematic study is presented to explore NH4CN polymerization induced by microwave (MW) radiation, motivated by the recent growing interest in these polymers in materials science. A first approach through two series, varying the reaction times and the temperatures between 130 and 205 °C, was conducted. As a relevant outcome, under particular reaction conditions, polymer conversions similar to those obtained by conventional thermal methods were achieved, with the advantage of a very significant reduction in reaction times. The structural properties of the end products were evaluated using compositional data, spectroscopic measurements, simultaneous thermal analysis (STA), X-ray diffraction (XRD), and scanning electron microscopy (SEM). Based on principal component analysis (PCA) of the main experimental results collected, practically only the crystallographic features and the morphologies at the nanoscale were affected by the MW-driven polymerization conditions with respect to those obtained by classical syntheses. Therefore, MW radiation allows us to tune the morphology, size, and shape of the particles formed from the two-dimensional C=N networks characteristic of NH4CN polymers by an easy, fast, low-cost, and green-solvent production route. These new insights make these macromolecular systems attractive for exploration in current soft-matter science.
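As a hedged illustration of the PCA step, the sketch below standardizes a small matrix of sample descriptors and projects it onto two components; the feature names and values are hypothetical placeholders for the compositional, spectroscopic, STA, and XRD descriptors the authors actually used.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows: samples from the two MW series; columns: hypothetical descriptors
# standing in for composition, spectroscopy, STA, and XRD results.
X = np.array([
    # conversion, C/N ratio, crystallite size (nm), particle size (nm)
    [0.72, 0.95, 4.1, 120.0],
    [0.75, 0.97, 5.3, 150.0],
    [0.80, 0.96, 6.8, 210.0],
    [0.78, 0.98, 7.5, 260.0],
])
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
print(scores)  # samples that separate along PC1/PC2 differ mainly in the
               # descriptors with the largest loadings on those components
```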


2019 ◽  
Author(s):  
Maximilian Kasy

This essay argues that different justifiable objectives for scientific institutions lead to contradictory recommendations. We need to be explicit about our objectives in order to discuss the tradeoffs between them. Replicability and the validity of conventional statistical inference constitute one such objective, and they indeed require that publication decisions do not depend on findings. This is what motivates much of current reform efforts. Validity of inference is presumably not the only objective, however; it could easily be achieved by estimates derived from a random number generator. Relevance of findings might be another objective. If our goal is to inform decision makers or to maximize social learning, there is a strong rationale to selectively publish surprising findings. A third objective could be the plausibility of published findings. If there is some uncertainty about the quality of studies and we want to avoid publishing incorrect results, we might want to selectively publish unsurprising findings. How can we resolve the tension between these contradictory recommendations? I will outline one possibility below, proposing a functionally differentiated publication system, with different outlets focusing on different objectives. Measures that are promoted by current reformers, such as pre-analysis plans and registered reports, would have to play a crucial role in such a system. Following these policy proposals, I will take a step back and argue that these debates raise some fundamental questions for statistical theory. In order to coherently discuss these issues, statistical theory needs a model of the work of empirical research that goes beyond the single-agent model of statistical decision theory. We should understand statistics (quantitative empirical research) as a social process of communication and collective learning that involves many different actors with differences in knowledge and expertise, different objectives, and constraints on their attention and time, who engage in strategic behavior.


2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 10564-10564
Author(s):  
C. Desmedt ◽  
F. André ◽  
E. Azambuja ◽  
B. Haibe-Kains ◽  
D. Larsimont ◽  
...  

10564 Background: Breast cancers show variable sensitivity to anthracycline (A)-based therapy. Here we aimed to identify gene expression profiles associated with pCR to this treatment. As it has repeatedly and consistently been shown that ER is the most dominant factor influencing the molecular composition of breast cancer, defining different types of BC disease, and because we wanted to eliminate the confounding effect of indirect ovarian suppression in ER+ BC, we focused in this study on ER-negative patients only. Methods: We analyzed Affymetrix gene expression profiles generated from 132 ER- pre-treatment samples, constituting the largest series of ER- preoperatively A-treated BC (n=132/35 pCR). Sixty-two samples were derived from the prospective multicentric TOP trial (epirubicin single-agent), 41 from Institut G. Roussy (retrospective selection/FEC) and 27 from MD Anderson (prospective study/FAC). Results: A Student's t-test analysis on the combined population of A-treated pts was performed, identifying 102 genes significantly associated with pCR (p<.01). These genes were mainly involved in cell death, DNA replication and recombination, molecular transport, cell cycle and morphology. Interestingly, 14 of these genes were located on the topoIIa amplicon. Of interest, none of these 14 genes seems to carry any prognostic value in untreated ER- pts (N=161). When we considered gene expression indices for specific A-targets such as topoIIa and helicase, we found that both were associated with pCR. However, subgroup analysis revealed that the topoIIa index was predictive in the ERBB2+ but not in the ERBB2- subgroup. None of the genes from the adriamycin predictor (Potti et al.) or the p53 signature (Miller et al.) were significantly associated with pCR. Conclusions: This study suggests that a group of genes associated with topoIIa can identify ER-negative BC pts likely to respond to A-based therapy. These promising results are currently being validated in a larger series. No significant financial relationships to disclose.
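A minimal sketch of the per-gene screen described in the Results, assuming an expression matrix and pCR labels of the study's dimensions; a real analysis would add multiple-testing control, a need the toy run below makes plain.

```python
import numpy as np
from scipy.stats import ttest_ind

def screen_genes(expr, pcr_labels, alpha=0.01):
    """Two-sample t-test per gene between pCR and non-pCR samples.
    expr: (n_genes, n_samples) expression matrix; pcr_labels: bool array."""
    pcr = expr[:, pcr_labels]
    no_pcr = expr[:, ~pcr_labels]
    _, pvals = ttest_ind(pcr, no_pcr, axis=1)
    return np.where(pvals < alpha)[0]  # indices of genes associated with pCR

# Toy null data: 500 genes, 132 samples, 35 with pCR (as in the study design).
rng = np.random.default_rng(1)
expr = rng.normal(size=(500, 132))
labels = np.zeros(132, dtype=bool)
labels[:35] = True
print(len(screen_genes(expr, labels)), "genes pass p < .01 by chance alone")
```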


Nano Research ◽  
2014 ◽  
Vol 7 (3) ◽  
pp. 434-442 ◽  
Author(s):  
Jiantao Feng ◽  
Fang Wang ◽  
Xinxiao Han ◽  
Zhuo Ao ◽  
Quanmei Sun ◽  
...  

2020 ◽  
Vol 34 (02) ◽  
pp. 1774-1781 ◽  
Author(s):  
Tal Alon ◽  
Magdalen Dobson ◽  
Ariel Procaccia ◽  
Inbal Talgam-Cohen ◽  
Jamie Tucker-Foltz

We consider settings where agents are evaluated based on observed features, and assume they seek to achieve feature values that bring about good evaluations. Our goal is to craft evaluation mechanisms that incentivize the agents to invest effort in desirable actions; a notable application is the design of course grading schemes. Previous work has studied this problem in the case of a single agent. By contrast, we investigate the general, multi-agent model, and provide a complete characterization of its computational complexity.


Author(s):  
Thomas Spooner ◽  
Rahul Savani

We show that adversarial reinforcement learning (ARL) can be used to produce market-making agents that are robust to adversarial and adaptively chosen market conditions. To apply ARL, we turn the well-studied single-agent model of Avellaneda and Stoikov [2008] into a discrete-time zero-sum game between a market maker and an adversary. The adversary acts as a proxy for other market participants that would like to profit at the market maker's expense. We empirically compare two conventional single-agent RL agents with ARL and show that our ARL approach leads to: 1) the emergence of risk-averse behaviour without constraints or domain-specific penalties; 2) significant improvements in performance across a set of standard metrics, evaluated with or without an adversary in the test environment; and 3) improved robustness to model uncertainty. We empirically demonstrate that our ARL method consistently converges, and we prove for several special cases that the profiles we converge to correspond to Nash equilibria in a simplified single-stage game.
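For context, the single-agent baseline that the paper turns into a zero-sum game has a closed form; the sketch below computes the Avellaneda-Stoikov reservation price and optimal spread with illustrative parameter values.

```python
import math

def as_quotes(s, q, gamma, sigma, k, t_remaining):
    """Reservation price and optimal half-spread in the Avellaneda-Stoikov
    (2008) single-agent market-making model.
    s: mid-price, q: signed inventory, gamma: risk aversion,
    sigma: mid-price volatility, k: order-arrival decay, t_remaining: T - t."""
    r = s - q * gamma * sigma**2 * t_remaining  # inventory-skewed price
    spread = gamma * sigma**2 * t_remaining + (2 / gamma) * math.log(1 + gamma / k)
    return r - spread / 2, r + spread / 2       # bid, ask

bid, ask = as_quotes(s=100.0, q=5, gamma=0.1, sigma=2.0, k=1.5, t_remaining=0.5)
print(f"bid={bid:.3f} ask={ask:.3f}")  # long inventory skews both quotes down
```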


2009 ◽  
Vol 9 (1) ◽  
Author(s):  
Bernd Theilen

The relationship between competition and performance-related pay has been analyzed in single-principal, single-agent models. While this approach yields good predictions for managerial pay schemes, the predictions fail to apply to employees at lower tiers of a firm's hierarchy. This paper describes a single-principal, multi-agent model of incentive pay that analyzes the effect of changes in the competitiveness of markets on lower-tier incentive payment schemes. The results explain why the payment schemes of agents located at low and mid tiers are less sensitive to changes in competition when aggregated firm data are used.

