Modelling assumptions
Recently Published Documents

TOTAL DOCUMENTS: 179 (five years: 67)
H-INDEX: 17 (five years: 5)

Author(s):  
Alessandro Gasparini ◽  
Tim P. Morris ◽  
Michael J. Crowther

Simulation studies allow us to explore the properties of statistical methods. They provide a powerful tool with a multiplicity of aims, among others: evaluating and comparing new or existing statistical methods, assessing violations of modelling assumptions, helping with the understanding of statistical concepts, and supporting the design of clinical trials. The increased availability of powerful computational tools and usable software has contributed to the rise of simulation studies in the current literature. However, simulation studies involve increasingly complex designs, making it difficult to provide all relevant results clearly. Dissemination of results plays a focal role in simulation studies: it can drive applied analysts to use methods that have been shown to perform well in their settings, guide researchers to develop new methods in a promising direction, and provide insights into less established methods. It is crucial that we can digest relevant results of simulation studies. Therefore, we developed INTEREST: an INteractive Tool for Exploring REsults from Simulation sTudies. The tool has been developed using the Shiny framework in R and is available as a web app or as a standalone package. It requires uploading a tidy-format dataset with the results of a simulation study in R, Stata, SAS, SPSS, or comma-separated format. A variety of performance measures are estimated automatically along with Monte Carlo standard errors; results and performance summaries are displayed both in tabular and graphical fashion, with a wide variety of available plots. Consequently, the reader can focus on the simulation parameters and estimands of most interest. In conclusion, INTEREST can facilitate the investigation of results from simulation studies and supplement the reporting of results, allowing researchers to share detailed results from their simulations and readers to explore them freely.
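
The kind of performance summary the tool automates can be sketched in a few lines of base R. The example below, which is only an illustration and not the INTEREST interface, computes the bias of a point estimator and its Monte Carlo standard error from a tidy results data set with one row per repetition; the column names and the simulated data are assumptions made for the sketch.

# Minimal base-R sketch of a simulation performance summary (not the INTEREST API).
set.seed(1)
nsim <- 1000
true <- 0.5
res  <- data.frame(rep = seq_len(nsim),
                   theta_hat = rnorm(nsim, mean = 0.5, sd = 0.1))  # illustrative results

bias        <- mean(res$theta_hat) - true
bias_mcse   <- sqrt(var(res$theta_hat) / nsim)    # Monte Carlo SE of the bias
emp_se      <- sd(res$theta_hat)                  # empirical standard error
emp_se_mcse <- emp_se / sqrt(2 * (nsim - 1))      # its Monte Carlo SE

round(c(bias = bias, bias_mcse = bias_mcse,
        emp_se = emp_se, emp_se_mcse = emp_se_mcse), 4)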


2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Hao Wang

Many current methods for estimating the effects of experimental conditions on a given biological target require either strong modelling assumptions or separate screens. Traditionally, covering many conditions and targets without performing all possible experiments has been attempted through data-driven experimentation or mathematical methods, especially conventional machine learning methods. However, these methods still cannot completely avoid or replace manual labelling. This paper presents a meta-active machine learning method to address this problem. Nine traditional machine learning methods are compared in terms of accuracy and running time, and the meta-active machine learning method (MAML) is analysed against a classical screening method and progressive experiments. The results show that the proposed method yields the best experimental results on the current dataset.
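
The general idea of reducing manual labels by choosing which experiment to run next can be illustrated with a generic uncertainty-sampling active-learning loop. The sketch below is not the meta-active method of the paper; all data, models, and thresholds are synthetic assumptions chosen only to show the loop structure.

# Generic uncertainty-sampling active-learning loop in base R (illustrative only).
set.seed(42)
n <- 500
x <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
y <- rbinom(n, 1, plogis(1.5 * x$x1 - x$x2))       # "true" labels held by an oracle

labelled <- sample(n, 20)                          # small initial labelled set
for (step in 1:30) {
  fit  <- glm(y[labelled] ~ x1 + x2, data = x[labelled, ], family = binomial)
  pool <- setdiff(seq_len(n), labelled)
  p    <- predict(fit, newdata = x[pool, ], type = "response")
  pick <- pool[which.min(abs(p - 0.5))]            # most uncertain candidate
  labelled <- c(labelled, pick)                    # "run the experiment": query its label
}
mean((predict(fit, newdata = x, type = "response") > 0.5) == y)  # final accuracy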


2021 ◽  
Author(s):  
Manuel Chevalier

Abstract. Statistical climate reconstruction techniques are practical tools to study past climate variability from fossil proxy data. In particular, the methods based on probability density functions (PDFs) are powerful at producing robust results from various environments and proxies. However, accessing and curating the necessary calibration data, as well as the complexity of interpreting probabilistic results, often limit their use in palaeoclimatological studies. To address these problems, I present a new R package (crestr) to apply the CREST method (Climate REconstruction SofTware) on diverse palaeoecological datasets. crestr includes a globally curated calibration dataset for six common climate proxies (i.e. plants, beetles, chironomids, rodents, foraminifera, and dinoflagellate cysts) that enables its use in most terrestrial and marine regions. The package can also be used with private data collections instead of, or in combination with, the provided dataset. It also includes a suite of graphical diagnostic tools to represent the data at each step of the reconstruction process and provide insights into the effect of the different modelling assumptions and external factors that underlie a reconstruction. With this R package, the CREST method can now be used in a scriptable environment, thus simplifying its use and integration in existing workflows. It is hoped that crestr will contribute to producing the much-needed quantified records from the many regions where climate reconstructions are currently lacking, despite the existence of suitable fossil records.
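
The PDF principle that CREST builds on can be illustrated with a toy calculation: each taxon's modern occurrences define a climate response density, and for a fossil sample the densities of the taxa present are multiplied and the resulting curve summarised. The sketch below uses base R and entirely synthetic data; it is not the crestr interface and omits the weighting and calibration details of the actual method.

# Toy PDF-based reconstruction (illustrative only, not the crestr API).
set.seed(7)
grid <- seq(0, 30, length.out = 301)               # candidate mean annual temperatures
taxa <- list(a = rnorm(200, 12, 3),                # modern temperatures where each taxon occurs
             b = rnorm(150, 16, 4),
             c = rnorm(100, 14, 2))

pdf_on_grid <- function(occ) {
  d <- density(occ, from = min(grid), to = max(grid), n = length(grid))
  d$y / sum(d$y)                                   # normalise on the grid
}

present <- c("a", "c")                             # taxa observed in the fossil sample
post    <- Reduce(`*`, lapply(taxa[present], pdf_on_grid))
post    <- post / sum(post)
grid[which.max(post)]                              # point reconstruction (posterior mode)
sum(grid * post)                                   # posterior mean temperature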


2021 ◽  
Author(s):  
Alastair D Jamieson-Lane ◽  
Alexander Friedrich ◽  
Bernd Blasius

Clinicians prescribing antibiotics in a hospital context follow one of several possible "treatment protocols": heuristic rules designed to balance the immediate needs of patients against the long-term threat posed by the evolution of antibiotic resistance and multi-resistant bacteria. Several criteria have been proposed for assessing these protocols; unfortunately, these criteria frequently conflict with one another, each providing a different recommendation as to which treatment protocol is best. Here we review and compare these optimization criteria. We demonstrate that criteria focused primarily on slowing the evolution of resistance are directly antagonistic to patient health in both the short and the long term. We propose a new optimization criterion of our own, intended to balance the needs of the present and the future more meaningfully. Asymptotic methods allow us to evaluate this criterion and provide insights not readily available through the numerical methods used previously in the literature. When cycling antibiotics, we find an antibiotic switching time that proves close to optimal across a wide range of modelling assumptions.
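
A cartoon of antibiotic cycling, and of how the switching time can be scanned, is sketched below. This is a deliberately simplified toy model written under assumptions of my own (two strains, each resistant to one of two drugs; the strain resistant to the drug currently in use expands while the other decays), not the model or the asymptotic analysis of the paper; all rates are placeholders.

# Toy antibiotic-cycling simulation in base R (illustrative assumptions only).
simulate_cycling <- function(tau, horizon = 360, dt = 0.1,
                             grow = 0.05, decay = 0.10) {
  times  <- seq(0, horizon, by = dt)
  x      <- c(r1 = 0.05, r2 = 0.05)                # initial resistant prevalences
  burden <- numeric(length(times))
  for (i in seq_along(times)) {
    drug <- if ((times[i] %/% tau) %% 2 == 0) 1 else 2
    dx <- c(0, 0)
    dx[drug]     <- grow  * x[drug] * (1 - x[drug])  # resistance to the drug in use expands
    dx[3 - drug] <- -decay * x[3 - drug]             # resistance to the unused drug decays
    x <- pmin(pmax(x + dt * dx, 0), 1)
    burden[i] <- sum(x)
  }
  mean(burden)                                     # average resistant burden over the horizon
}
taus <- c(5, 10, 20, 40, 80)                       # candidate switching times (days)
setNames(sapply(taus, simulate_cycling), paste0("tau=", taus))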


2021 ◽  
Author(s):  
Charles D L Mullon ◽  
Laurent Lehmann

From protists to primates, intergroup aggression and warfare over resources have been observed in several taxa whose populations typically consist of groups connected by limited genetic mixing. Here, we model the co-evolution between four traits relevant to this setting: (i) investment into common-pool resource production within groups ('helping'); (ii) proclivity to raid other groups to appropriate their resources ('belligerence'); and investments into (iii) defense and (iv) offense in group contests ('defensive and offensive bravery'). We show that when these traits co-evolve, the population often experiences disruptive selection favouring two morphs: 'Hawks', who express high levels of both belligerence and offensive bravery; and 'Doves', who express neither. This social polymorphism involves further among-trait associations when the fitness costs of helping and bravery interact. In particular, if helping is antagonistic with both forms of bravery, co-evolution leads to the coexistence of individuals that either (i) do not participate in common-pool resource production but only in its defense and appropriation ('Scrounger Hawks'), or (ii) only invest into common-pool resource production ('Producer Doves'). Provided groups are not randomly mixed, these findings are robust to several modelling assumptions. This suggests that intergroup aggression is a potent mechanism favouring within-group social diversity and behavioural syndromes.


Author(s):  
G. W. RICHARDSON ◽  
J. M. FOSTER ◽  
R. RANOM ◽  
C. P. PLEASE ◽  
A. M. RAMOS

This paper presents the current state of mathematical modelling of the electrochemical behaviour of lithium-ion batteries (LIBs) as they are charged and discharged. It reviews the models developed by Newman and co-workers, both in the cases of dilute and moderately concentrated electrolytes and indicates the modelling assumptions required for their development. Particular attention is paid to the interface conditions imposed between the electrolyte and the active electrode material; necessary conditions are derived for one of these, the Butler–Volmer relation, in order to ensure physically realistic solutions. Insight into the origin of the differences between various models found in the literature is revealed by considering formulations obtained by using different measures of the electric potential. Materials commonly used for electrodes in LIBs are considered and the various mathematical models used to describe lithium transport in them discussed. The problem of upscaling from models of behaviour at the single electrode particle scale to the cell scale is addressed using homogenisation techniques resulting in the pseudo-2D model commonly used to describe charge transport and discharge behaviour in lithium-ion cells. Numerical solution to this model is discussed and illustrative results for a common device are computed.
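
For reference, the Butler-Volmer relation discussed in the review is commonly written in the following textbook form (symbols as conventionally defined in the lithium-ion literature; the specific interface conditions derived in the paper are not reproduced here):

j = j_0 \left[ \exp\!\left(\frac{\alpha_a F \eta}{R T}\right) - \exp\!\left(-\frac{\alpha_c F \eta}{R T}\right) \right], \qquad \eta = \Phi_s - \Phi_e - U_{\mathrm{ocp}}(c_s),

where j_0 is the exchange current density (itself a function of the lithium concentrations on either side of the interface), \alpha_a and \alpha_c are the anodic and cathodic transfer coefficients, F is Faraday's constant, R the gas constant, T the temperature, and the overpotential \eta is defined from the solid-phase potential \Phi_s, the electrolyte potential \Phi_e, and the open-circuit potential U_{\mathrm{ocp}} evaluated at the surface lithium concentration c_s.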


Energies ◽  
2021 ◽  
Vol 14 (20) ◽  
pp. 6841
Author(s):  
Giovanna De Luca ◽  
Franz Bianco Mauthe Degerfeld ◽  
Ilaria Ballarini ◽  
Vincenzo Corrado

The recently issued EN ISO 52016-1 technical standard provides a new simplified dynamic method for building energy performance assessment. Since an extensive validation of the EN ISO 52016-1 hourly method is still missing, the present work investigates the effect of the main modelling assumptions, related to the heat balance on the outdoor and indoor envelope surfaces, on the building thermal behaviour. The model validation was carried out by assessing the variation in accuracy that follows from applying the EN ISO 52016-1 modelling assumptions to a detailed dynamic calculation tool (EnergyPlus). To ensure the general validity of the outcomes, two buildings, two levels of thermal insulation, and two Italian climatic zones were considered, for a total of eight case studies. To explore different applications of the standard method, the analysis was performed both under a free-floating condition, to evaluate the accuracy of the model in predicting indoor operative temperatures, and in the assessment of the annual energy needs for space heating and cooling. Results show that the assumptions related to the treatment of external convective heat transfer and of shortwave (solar) radiation heat transfer lead to non-negligible inaccuracies in the EN ISO 52016-1 hourly model.


Methodology ◽  
2021 ◽  
Vol 17 (3) ◽  
pp. 205-230
Author(s):  
Kristian Kleinke ◽  
Markus Fritsch ◽  
Mark Stemmler ◽  
Jost Reinecke ◽  
Friedrich Lösel

Quantile regression (QR) is a valuable tool for data analysis and multiple imputation (MI) of missing values – especially when standard parametric modelling assumptions are violated. Yet, Monte Carlo simulations that systematically evaluate QR-based MI in a variety of different practically relevant settings are still scarce. In this paper, we evaluate the method regarding the imputation of ordinal data and compare the results with other standard and robust imputation methods. We then apply QR-based MI to an empirical dataset, where we seek to identify risk factors for corporal punishment of children by their fathers. We compare the modelling results with previously published findings based on complete cases. Our Monte Carlo results highlight the advantages of QR-based MI over fully parametric imputation models: QR-based MI yields unbiased statistical inferences across large parts of the conditional distribution, when parametric modelling assumptions, such as normal and homoscedastic error terms, are violated. Regarding risk factors for corporal punishment, our MI results support previously published findings based on complete cases. Our empirical results indicate that the identified “missing at random” processes in the investigated dataset are negligible.
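
The general QR-based imputation idea can be sketched as follows: for each missing value, draw a random quantile level, fit a quantile regression on the observed cases, and impute the predicted conditional quantile. The code below is a minimal illustration under assumptions of my own (continuous outcome, a single covariate, synthetic heteroscedastic data); it is not the authors' implementation and omits the adaptations needed for ordinal data.

# Minimal sketch of quantile-regression imputation using quantreg (illustrative only).
library(quantreg)

set.seed(123)
n <- 300
dat   <- data.frame(x = rnorm(n))
dat$y <- 1 + 2 * dat$x + rnorm(n, sd = 0.5 + 0.5 * abs(dat$x))  # heteroscedastic errors
miss  <- sample(n, 60)                                          # make some y values missing
dat$y[miss] <- NA

obs <- !is.na(dat$y)
for (i in which(!obs)) {
  tau_i <- runif(1)                                             # random quantile level
  fit_i <- rq(y ~ x, tau = tau_i, data = dat[obs, ])            # fit on observed cases
  dat$y[i] <- predict(fit_i, newdata = dat[i, , drop = FALSE])  # impute conditional quantile
}
summary(dat$y)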


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Shubham Srivastava ◽  
Abhishek Srivastava ◽  
Sanya Jain ◽  
Nandan Kumar ◽  
Chandra Shekhar Malvi

Purpose
This study aims to analyse the variation of thermal comfort inside a building space when different curtains are used.
Design/methodology/approach
Phase change materials (PCMs), namely wax, sand, and a mixture of sand and wax, were combined with a cotton curtain, and their performance under constant heat exposure was compared with that of a plain cotton curtain. Heat exposure was provided by a halogen lamp to simulate solar radiation. A simulation was also performed in ANSYS, and the experimental results were compared with the simulation results. In addition, the results were analysed for optimized performance by calculating the root mean square error.
Findings
The curtains incorporating PCMs performed better than the plain curtain. Furthermore, the sand curtain proved to be the best, and the sand-and-wax mixture curtain could replace it where weight is a limitation; the error between the experimental and simulation results was also smaller for the sand curtain than for the other curtains.
Research limitations/implications
Layers of the different PCMs were placed in front of the cotton curtain, and the model relied on simplifying assumptions such as one-dimensional heat transfer and uniform thermal conductivity.
Originality/value
To the best of the authors' knowledge, no such study has been performed before.
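
The one-dimensional, uniform-conductivity assumption mentioned in the limitations can be illustrated with an explicit finite-difference solution of transient conduction through a thin layer. The sketch below is a generic illustration, not the study's ANSYS model: all material properties are placeholders rather than the PCM data of the paper, and the latent heat of the phase change is not modelled.

# Explicit finite-difference sketch of 1D transient heat conduction (illustrative only).
alpha <- 1e-7                        # thermal diffusivity (m^2/s), placeholder value
L     <- 0.01                        # layer thickness (m)
nx    <- 51
dx    <- L / (nx - 1)
dt    <- 0.4 * dx^2 / alpha          # respects the explicit stability limit dt <= dx^2 / (2 * alpha)

T_in  <- 25                          # indoor-side temperature (deg C)
T_out <- 60                          # irradiated-side temperature (deg C), halogen exposure
temp  <- rep(T_in, nx)

for (step in 1:2000) {
  temp[1]  <- T_out                  # fixed boundary temperatures
  temp[nx] <- T_in
  interior <- 2:(nx - 1)
  temp[interior] <- temp[interior] +
    alpha * dt / dx^2 * (temp[interior + 1] - 2 * temp[interior] + temp[interior - 1])
}
round(temp[c(1, 13, 26, 39, 51)], 2) # temperature profile across the layer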


PLoS ONE ◽  
2021 ◽  
Vol 16 (9) ◽  
pp. e0257455
Author(s):  
Simon N. Wood ◽  
Ernst C. Wit

Detail is a double-edged sword in epidemiological modelling. The inclusion of mechanistic detail in models of highly complex systems has the potential to increase realism, but it also increases the number of modelling assumptions, which become harder to check as their possible interactions multiply. In a major study of the Covid-19 epidemic in England, Knock et al. (2020) fit an age-structured SEIR model with added health service compartments to data on deaths, hospitalization and test results from Covid-19 in seven English regions for the period March to December 2020. The simplest version of the model has 684 states per region. One main conclusion is that only full lockdowns brought the pathogen reproduction number, R, below one, with R ≫ 1 in all regions on the eve of the March 2020 lockdown. We critically evaluate the Knock et al. epidemiological model, and the semi-causal conclusions made using it, based on an independent reimplementation of the model designed to allow relaxation of some of its strong assumptions. In particular, Knock et al. model the effect on transmission of both non-pharmaceutical interventions and other effects, such as weather, using a piecewise linear function, b(t), with 12 breakpoints at selected government announcement or intervention dates. We replace this representation by a smoothing spline with time-varying smoothness, thereby allowing the form of b(t) to be substantially more data-driven, and we check that the corresponding smoothness assumption is not driving our results. We also reset the mean incubation time and time from first symptoms to hospitalisation, used in the model, to values implied by the papers cited by Knock et al. as the source of these quantities. We conclude that there is no sound basis for using the Knock et al. model and their analysis to make counterfactual statements about the number of deaths that would have occurred with different lockdown timings. However, if fits of this epidemiological model structure are viewed as a reasonable basis for inference about the time course of incidence and R, then without very strong modelling assumptions, the pathogen reproduction number was probably below one, and incidence in substantial decline, some days before either of the first two English national lockdowns. This result coincides with that obtained by more direct attempts to reconstruct incidence. Of course it does not imply that lockdowns had no effect, but it does suggest that other non-pharmaceutical interventions (NPIs) may have been much more effective than Knock et al. imply, and that full lockdowns were probably not the cause of R dropping below one.
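
A smooth with time-varying smoothness of the kind used to replace the piecewise-linear b(t) can be illustrated, in a generic way, with the adaptive smooth basis in mgcv. The toy fit below uses synthetic data with a sharp change followed by slow drift; it is not the reimplementation of the Knock et al. transmission model, and all data and tuning choices are assumptions made for the sketch.

# Toy adaptive-smoothness fit with mgcv (illustrative only).
library(mgcv)

set.seed(2020)
t <- 1:300                                             # days
f <- c(rep(0, 80),
       -1.5 * plogis((81:160 - 120) / 6),              # sharp change around an intervention date
       rep(-1.5, 140)) + 0.2 * sin(t / 25)             # plus slow seasonal drift
y <- f + rnorm(300, sd = 0.15)

fit <- gam(y ~ s(t, bs = "ad", k = 40), method = "REML")  # adaptive (time-varying) smoothness
plot(fit, residuals = TRUE, shade = TRUE)                 # fitted b(t)-like curve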

