An Ensemble-Based Statistical Methodology to Detect Differences in Weather and Climate Model Executables

2021 ◽  
Author(s):  
Christian Zeman ◽  
Christoph Schär

Abstract. Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are subject to continuous change, thanks to advances in computer systems, numerical methods, more and better observations, and the ever-increasing knowledge about Earth's atmosphere. Many of the changes in today's models relate to seemingly innocuous modifications associated with minor code rearrangements, changes in hardware infrastructure, or software updates. Such changes are not supposed to significantly affect the model. However, this is difficult to verify, because our atmosphere is a chaotic system in which even a tiny change can have a big impact on individual simulations. Overall, this represents a serious challenge to a consistent model development and maintenance framework. Here we propose a new methodology for quantifying and verifying the impacts of minor changes to an atmospheric model, or to its underlying hardware/software system, by using a set of simulations with slightly different initial conditions in combination with a statistical hypothesis test. The methodology can assess the effects of model changes on almost any output variable over time and can be used with different underlying statistical hypothesis tests. We present first applications of the methodology with a regional weather and climate model, including the verification of a major system update of the underlying supercomputer. While providing very robust results, the methodology shows great sensitivity even to tiny changes. Results show that changes are often only detectable during the first hours, which suggests that short-term simulations (days to months) are best suited for the methodology, even when addressing long-term climate simulations. We also show that the choice of the underlying statistical hypothesis test has little influence on the results and that the methodology already works well at coarse resolutions, making it computationally inexpensive and therefore an ideal candidate for automated testing.
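The core of the described methodology can be sketched as follows: two ensembles of runs with slightly perturbed initial conditions, one for each model or system version, are compared variable by variable at each output time with a two-sample hypothesis test. The sketch below is a minimal illustration on synthetic numbers; the ensemble sizes, the injected shift, and the Mann-Whitney U test are all assumptions for demonstration (the abstract notes the underlying test is interchangeable).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_members, n_steps = 50, 10

# Two ensembles of one output variable (e.g. surface temperature), each member
# started from slightly perturbed initial conditions: "ref" before the system
# change, "mod" after it. Values here are synthetic stand-ins.
ref = rng.normal(loc=280.0, scale=0.5, size=(n_members, n_steps))
mod = rng.normal(loc=280.0, scale=0.5, size=(n_members, n_steps))
mod[:, :3] += 0.6  # pretend the change is only detectable in the first hours

# At each output time, test H0: both ensembles come from the same distribution.
# Mann-Whitney U is one possible choice of two-sample test.
p_values = [
    stats.mannwhitneyu(ref[:, t], mod[:, t], alternative="two-sided").pvalue
    for t in range(n_steps)
]
rejected = [p < 0.05 for p in p_values]
```

A rejection at any output time flags the two executables as statistically distinguishable; in an automated setting this check would run over many output variables, with a correction for multiple testing.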

2021 ◽  
Author(s):  
Christian Zeman ◽  
Christoph Schär

Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are subject to constant change, thanks to advances in computer systems, numerical methods, and the ever-increasing knowledge about Earth's atmosphere. Many of the changes in today's models relate to seemingly innocuous modifications associated with minor code rearrangements, changes in hardware infrastructure, or software upgrades. Such changes are meant to preserve the model formulation, yet their verification is challenged by the chaotic nature of our atmosphere: any small change, even a rounding error, can have a big impact on individual simulations. Overall, this represents a serious challenge to a consistent model development and maintenance framework.

Here we propose a new methodology for quantifying and verifying the impacts of minor changes to an atmospheric model, or to its underlying hardware/software system, by using ensemble simulations in combination with a statistical hypothesis test. The methodology can assess the effects of model changes on almost any output variable over time and can be used with different hypothesis tests.

We present first applications of the methodology with the regional weather and climate model COSMO. The changes considered include a major system upgrade of the supercomputer used, the change from double- to single-precision floating-point representation, changes in the update frequency of the lateral boundary conditions, and tiny changes to selected model parameters. While providing very robust results, the methodology also shows a large sensitivity to more significant model changes, making it a good candidate for an automated tool to guarantee model consistency in the development cycle.


2019 ◽  
Vol 19 (2) ◽  
pp. 134-140
Author(s):  
Baek-Ju Sung ◽  
Sung-kyu Lee ◽  
Mu-Seong Chang ◽  
Do-Sik Kim

2018 ◽  
Vol 11 (9) ◽  
pp. 3647-3657 ◽  
Author(s):  
Nathan Luke Abraham ◽  
Alexander T. Archibald ◽  
Paul Cresswell ◽  
Sam Cusworth ◽  
Mohit Dalvi ◽  
...  

Abstract. The Met Office Unified Model (UM) is a state-of-the-art weather and climate model that is used operationally worldwide. UKCA is the chemistry and aerosol sub-model of the UM that enables interactive composition and physical atmosphere interactions, but it adds an additional 120 000 lines of code to the model. Ensuring that the UM code and UM-UKCA (the UM running with interactive chemistry and aerosols) are well tested is thus essential. While a comprehensive test harness is in place at the Met Office and partner sites to aid development, it is not available to many UM users. Recently, the Met Office have made available a virtual machine environment that can be used to run the UM on a desktop or laptop PC. Here we describe the development of a UM-UKCA configuration that is able to run within this virtual machine while needing only 6 GB of memory, before discussing the applications of this system for model development, testing, and training.


2021 ◽  
Author(s):  
Abhijit Mahesh Chinchani ◽  
Mahesh Menon ◽  
Meighen Roes ◽  
Heungsun Hwang ◽  
Paul Allen ◽  
...  

Cognitive mechanisms hypothesized to underlie hallucinatory experiences (HEs) include dysfunctional source monitoring, heightened signal detection, and impaired attentional processes. HEs can be very pronounced in psychosis, but similar experiences also occur in nonclinical populations. Using data from an international multisite study on nonclinical subjects (N = 419), we described the overlap between two sets of variables, one measuring cognition and the other HEs, at the level of individual items, allowing extraction of item-specific signal which might be considered off-limits when summary scores are analyzed. This involved a statistical hypothesis test at the multivariate level, and variance constraints, dimension reduction, and split-half reliability checks at the level of individual items. The results showed that (1) modality-general HEs involving sensory distortions (hearing voices/sounds, being troubled by voices, everyday things looking abnormal, sensations of presence/movement) were associated with more liberal auditory signal detection, and (2) HEs involving experiences of sensory overload and vivid images/imagery (viz., HEs for faces and intense daydreams) were associated with other-ear distraction and reduced laterality in dichotic listening. Based on these results, we conclude that the overlap between HEs and cognition variables can be conceptualized as modality-general and bi-dimensional: one dimension involving distortions and the other involving overload or intensity.


Computation ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 59 ◽  
Author(s):  
Giovanni Delnevo ◽  
Silvia Mirri ◽  
Marco Roccetti

As we prepare to emerge from an extensive and unprecedented lockdown period, due to the COVID-19 infection that hit the northern regions of Italy with Europe's highest death toll, it becomes clear that what went wrong rests upon a combination of demographic, healthcare, political, business, organizational, and climatic factors that are outside our scientific scope. Nonetheless, looking at this problem from a patient's perspective, it is indisputable that risk factors associated with the development of the disease include older age, history of smoking, hypertension, and heart disease. Several studies have already shown that many of these conditions can also be favored by protracted exposure to air pollution, yet there has recently been a surge of negative commentary against authors who have correlated the fatal consequences of COVID-19 (also) with exposure to specific air pollutants. Well aware that understanding the real connection between the spread of this fatal virus and air pollutants would require many further investigations at a level appropriate to the scale of the phenomenon (e.g., biological, chemical, and physical), we present the results of a study in which time series of the daily values of PM2.5, PM10, and NO2 were considered, and the Granger causality statistical hypothesis test was used to determine the presence of a possible correlation with the series of new daily COVID-19 infections, in the period February–April 2020, in Emilia-Romagna. Results taken both before and after the governmental lockdown decisions show a clear correlation, although strictly in the Granger causality sense. Beyond the question of the real extent of such a correlation, our scientific efforts aim at reinvigorating the debate on a relevant case that should not remain unresolved or uninvestigated.


2015 ◽  
Vol 117 (2) ◽  
pp. 131-141 ◽  
Author(s):  
Michael Baltaxe ◽  
Peter Meer ◽  
Michael Lindenbaum

2021 ◽  
Vol 11 (3) ◽  
pp. 697-702
Author(s):  
S. Jayanthi ◽  
C. R. Rene Robin

In this study, DNA microarray data is analyzed from a signal processing perspective for cancer classification. An adaptive wavelet transform, the Empirical Wavelet Transform (EWT), is applied in a block-by-block procedure to characterize microarray data. The EWT wavelet basis depends on the input data rather than being predetermined as in conventional wavelets; thus, EWT gives sparser representations than wavelets. The characterization of microarray data is made by a block-by-block procedure with predefined block sizes in powers of 2, ranging from 128 to 2048. After characterization, a statistical hypothesis test is employed to select the informative EWT coefficients. Only the selected coefficients are used for Microarray Data Classification (MDC) by a Support Vector Machine (SVM). Computational experiments are conducted on five microarray datasets (colon, breast, leukemia, CNS, and ovarian) to test the developed cancer classification system. The obtained results demonstrate that EWT coefficients with SVM provide an effective approach, with no misclassification, for the MDC system.
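The selection-then-classification stage of such a pipeline can be sketched as follows. The EWT step itself is not shown; the matrix `X` below is a synthetic stand-in for transform coefficients, and the two-sample t-test, the 0.01 threshold, and the linear kernel are assumptions for illustration rather than the paper's exact choices.

```python
import numpy as np
from scipy import stats
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_samples, n_coeffs = 60, 256  # e.g. one block of transform coefficients

# Synthetic stand-in for EWT coefficients of microarray profiles: two classes,
# with the first 10 coefficients carrying class information.
X = rng.normal(size=(n_samples, n_coeffs))
y = np.repeat([0, 1], n_samples // 2)
X[y == 1, :10] += 1.5

# Statistical hypothesis test per coefficient: two-sample t-test between classes.
_, p = stats.ttest_ind(X[y == 0], X[y == 1], axis=0)
selected = p < 0.01  # keep only the informative coefficients

# Classify using only the selected coefficients.
clf = SVC(kernel="linear")
acc = cross_val_score(clf, X[:, selected], y, cv=5).mean()
```

Filtering coefficients by a hypothesis test before the SVM both reduces dimensionality (important when coefficients far outnumber samples, as in microarray data) and discards noise-only features that would otherwise dilute the margin.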

