iGen: a program for the automated generation of models and parameterisations

2011 ◽  
Vol 4 (2) ◽  
pp. 843-868 ◽  
Author(s):  
D. F. Tang ◽  
S. Dobbie

Abstract. Complex physical systems can often be simulated using very high-resolution models, but this is not always practical because of computational restrictions. In such cases the model must be simplified or parameterised, a notoriously difficult process that often requires the introduction of "model assumptions" that are hard or impossible to justify. Here we introduce a new approach to parameterising models. The approach makes use of a newly developed computer program, which we call iGen, that analyses the source code of a high-resolution model and formally derives a much faster parameterised model that closely approximates the original, reporting bounds on the error introduced by any approximations. These error bounds can be used to formally justify the use of the parameterised model in subsequent numerical experiments. Using increasingly complex physical systems as examples, we illustrate that iGen can produce parameterisations that typically run orders of magnitude faster than the underlying high-resolution models from which they are derived, and we show that iGen has the potential to become an important tool in model development.
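
The idea of reporting formal bounds on approximation error can be illustrated with interval arithmetic. The sketch below is purely illustrative and is not iGen's actual algorithm; the `Interval` class and the `slow_model`/`fast_model` functions are hypothetical stand-ins for an expensive high-resolution model and its cheap parameterisation.

```python
# Purely illustrative sketch of bounding approximation error with
# interval arithmetic; this is NOT iGen's actual algorithm.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

def slow_model(x: Interval) -> Interval:
    """Hypothetical stand-in for an expensive high-resolution model."""
    return x + x * x

def fast_model(x: Interval) -> Interval:
    """Cheap approximation that drops the quadratic term."""
    return x

domain = Interval(0.0, 0.1)  # input range the parameterisation must cover

# Naive interval evaluation gives a sound but loose error bound ...
loose = slow_model(domain) - fast_model(domain)   # [-0.1, 0.11]

# ... whereas analysing the expressions symbolically (slow - fast = x*x,
# in the spirit of source-code analysis) yields a much tighter bound.
tight = domain * domain                           # [0.0, 0.01]
print(loose, tight)
```

The gap between the two bounds suggests why a tool that works at the source-code level, rather than treating the model as a black box, can report tighter guarantees.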

2011 ◽  
Vol 4 (3) ◽  
pp. 785-795 ◽  
Author(s):  
D. F. Tang ◽  
S. Dobbie

Abstract. Complex physical systems can often be simulated using very high-resolution models, but this is not always practical because of computational restrictions. In such cases the model must be simplified or parameterised in order to make it computationally tractable. A parameterised model is created using an ad hoc selection of techniques, ranging from the formal to the purely intuitive, and as a result it is very difficult to objectively quantify the fidelity of the model to the physical system. It is rare that a parameterised model can be formally shown to simulate a physical system to within some bounded error. Here we introduce a new approach to parameterising models that allows the error to be formally bounded. The approach makes use of a newly developed computer program, which we call iGen, that analyses the source code of a high-resolution model and formally derives a much faster, parameterised model that closely approximates the original, reporting bounds on the error introduced by any approximations. These error bounds can be used to formally justify conclusions about a physical system based on observations of the model's behaviour. Using increasingly complex physical systems as examples, we illustrate that iGen can produce parameterisations that typically run orders of magnitude faster than the underlying high-resolution models from which they are derived.


2019 ◽  
Vol 147 (1) ◽  
pp. 329-344 ◽  
Author(s):  
Joël Stein ◽  
Fabien Stoop

Some specific scores use a neighborhood strategy in order to reduce double-penalty effects, which penalize high-resolution models compared to large-scale models. Contingency tables based on this strategy have already been proposed, but they can sometimes display undesirable behavior. A new method of populating contingency tables is proposed: pairs of missed events and false alarms located in the same local neighborhood compensate to give pairs of hits and correct rejections, as sketched below. The local tables are then summed to provide the final table for the whole verification domain. The resulting table keeps track of the forecast bias when neighborhoods are taken into account, and the scores computed from it depend on the distance between forecast and observed patterns. The method is applied to binary and multicategorical events in a simplified framework, in order to present the method and to compare the new tables with previous neighborhood-based contingency tables. The new tables are then used for the verification of two models operational at Météo-France: AROME, a high-resolution model, and ARPEGE, a large-scale global model. A comparison of several contingency scores shows that the importance of the double penalty decreases more for AROME than for ARPEGE as the neighborhood size increases. Scores designed for rare events are also applied to these neighborhood-based contingency tables.
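
A minimal sketch of the compensation idea, assuming binary fields on a regular grid. This illustrates the principle only, not the authors' exact bookkeeping: the helper `compensated_table` is hypothetical, window handling is simplified, and no normalization of the overlapping windows is applied.

```python
import numpy as np

def compensated_table(fcst, obs, half_width=1):
    """Sum local 2x2 contingency tables over sliding neighborhoods,
    letting each miss/false-alarm pair inside a neighborhood compensate
    into a hit/correct-rejection pair. Illustrative sketch only."""
    fcst, obs = fcst.astype(bool), obs.astype(bool)
    ny, nx = fcst.shape
    hits = false_alarms = misses = corr_rej = 0
    for j in range(ny):
        for i in range(nx):
            win = (slice(max(j - half_width, 0), min(j + half_width + 1, ny)),
                   slice(max(i - half_width, 0), min(i + half_width + 1, nx)))
            f, o = fcst[win], obs[win]
            a = int(np.sum(f & o))     # hits
            b = int(np.sum(f & ~o))    # false alarms
            c = int(np.sum(~f & o))    # misses
            d = int(np.sum(~f & ~o))   # correct rejections
            k = min(b, c)              # compensating pairs
            hits += a + k
            false_alarms += b - k
            misses += c - k
            corr_rej += d + k
    return hits, false_alarms, misses, corr_rej

# Toy example: a forecast displaced by one grid point scores a miss and a
# false alarm pointwise, but compensation turns the pair into hits and
# correct rejections within the neighborhood.
obs = np.zeros((5, 5), dtype=int); obs[2, 2] = 1
fcst = np.zeros((5, 5), dtype=int); fcst[2, 3] = 1
print(compensated_table(fcst, obs, half_width=1))
```

Because the windows overlap, each grid point is counted once per window here; the published method keeps the total population consistent, which this simplification does not attempt.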


2011 ◽  
Vol 4 (2) ◽  
pp. 971-995 ◽  
Author(s):  
D. F. Tang ◽  
S. Dobbie

Abstract. In a previous paper we described a new technique for automatically generating parameterisations using a program called iGen. iGen generates parameterisations by analysing the source code of a high-resolution model that resolves the physics to be parameterised. In order to demonstrate that this technique scales up to models of realistic complexity, we have used iGen to generate a parameterisation of entrainment in marine stratocumulus. We present details of our technique, in which iGen was used to analyse the source code of a cloud-resolving model and generate a parameterisation of the mean and standard deviation of entrainment velocity in marine stratocumulus in terms of the large-scale state of the boundary layer. The parameterisation was tested against results from the DYCOMS-II intercomparison of cloud-resolving models: iGen's parameterisation gave a mean entrainment velocity of 5.27 × 10⁻³ ± 0.62 × 10⁻³ m s⁻¹, compared to 5.2 × 10⁻³ ± 0.8 × 10⁻³ m s⁻¹ for the DYCOMS-II ensemble of cloud-resolving models.


2017 ◽  
Vol 45 (3) ◽  
pp. 652-663 ◽  
Author(s):  
Anna L. Carter ◽  
Michael R. Kearney ◽  
Stephen Hartley ◽  
Warren P. Porter ◽  
Nicola J. Nelson

2005 ◽  
Vol 12 (5) ◽  
pp. 755-765 ◽  
Author(s):  
I. Hoteit ◽  
G. Korres ◽  
G. Triantafyllou

Abstract. Kalman filters are widely used for data assimilation into ocean models. The aim of this study is to assess the relevance of these filters for high-resolution ocean models. This was investigated through the comparison of two advanced Kalman filters: the singular evolutive extended Kalman (SEEK) filter and its ensemble-based variant, the SEIK filter. The two filters were implemented with the Princeton Ocean Model (POM) in both a low-spatial-resolution configuration (a Mediterranean Sea model) and a very high-resolution one (a coastal model of the Pagasitikos Gulf). It is shown that the two filters perform reasonably well when applied with the low-resolution model. However, with the high-resolution model the behavior of the SEEK filter degrades seriously because of strong model nonlinearities, while the SEIK filter remains remarkably more stable. The linear analysis step of the latter, which rests on the assumption of Gaussian prior distributions, could nevertheless still be improved.
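
As a point of reference for the analysis step these filters perform, the following is a minimal sketch of a generic stochastic ensemble Kalman analysis, assuming a linear observation operator H and uncorrelated observation error of variance r_var. It is not the SEEK/SEIK implementation, which uses low-rank (square-root) error subspaces rather than perturbed observations; the function name `enkf_analysis` is hypothetical.

```python
import numpy as np

def enkf_analysis(ens, obs, H, r_var, rng):
    """One stochastic EnKF analysis step (illustrative sketch only;
    SEEK/SEIK use low-rank square-root formulations instead)."""
    n, m = ens.shape                         # state dim, ensemble size
    anoms = ens - ens.mean(axis=1, keepdims=True)
    HA = H @ anoms                           # anomalies in observation space
    PHt = anoms @ HA.T / (m - 1)             # sample estimate of P H^T
    S = HA @ HA.T / (m - 1) + r_var * np.eye(H.shape[0])
    K = PHt @ np.linalg.inv(S)               # Kalman gain
    obs_pert = obs[:, None] + rng.normal(0.0, np.sqrt(r_var),
                                         size=(H.shape[0], m))
    return ens + K @ (obs_pert - H @ ens)    # updated ensemble

# Toy usage: 3-variable state, 10 members, observe the first variable.
rng = np.random.default_rng(0)
ens = rng.normal(size=(3, 10))
H = np.array([[1.0, 0.0, 0.0]])
updated = enkf_analysis(ens, np.array([0.5]), H, r_var=0.1, rng=rng)
```

The Gaussian, linear nature of this update is exactly the limitation the abstract points to: when strong model nonlinearities make the forecast distribution non-Gaussian, a linear analysis step can only partially correct the state.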


2008 ◽  
Vol 136 (11) ◽  
pp. 4113-4129 ◽  
Author(s):  
Neill E. Bowler ◽  
Alberto Arribas ◽  
Kenneth R. Mylne

Abstract A new approach to probabilistic forecasting is proposed, based on the generation of an ensemble of equally likely analyses of the current state of the atmosphere. The rationale behind this approach is to mimic a poor man’s ensemble, which combines the deterministic forecasts from national meteorological services around the world. The multianalysis ensemble aims to generate a series of forecasts that are both as skillful as each other and the control forecast. This produces an ensemble mean forecast that is superior not only to the ensemble members, but to the control forecast in the short range even for slowly varying parameters, such as 500-hPa height. This is something that it is not possible with traditional ensemble methods, which perturb a central analysis. The results herein show that the multianalysis ensemble is more skillful than the Met Office’s high-resolution forecast by 4.5% over the first 3 days (on average as measured for RMSE). Similar results are found for different verification scores and various regions of the globe. In contrast, the ensemble mean for the ensemble currently run by the Met Office performs 1.5% worse than the high-resolution forecast (similar results are found for the ECMWF ensemble). It is argued that the multianalysis approach is therefore superior to current ensemble methods. The multianalysis results were achieved with a two-member ensemble: the forecast from a high-resolution model plus a low-resolution perturbed model. It may be possible to achieve greater improvements with a larger ensemble.
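
Why an ensemble mean of equally skillful members can beat the control follows from simple error averaging. The snippet below is a synthetic demonstration with made-up data, not the paper's verification: two forecasts with independent errors of equal variance have a mean whose RMSE is roughly 1/√2 of either member's. Real forecast errors are correlated, which is one reason the gain reported in the paper (4.5%) is far smaller.

```python
import numpy as np

# Synthetic illustration only: made-up data, not the paper's verification.
rng = np.random.default_rng(42)
truth = rng.normal(size=100_000)
control = truth + rng.normal(scale=1.0, size=truth.shape)  # member 1
member = truth + rng.normal(scale=1.0, size=truth.shape)   # equally skillful member 2
ens_mean = 0.5 * (control + member)

def rmse(forecast):
    return np.sqrt(np.mean((forecast - truth) ** 2))

gain = 100.0 * (1.0 - rmse(ens_mean) / rmse(control))
print(f"control: {rmse(control):.3f}  ensemble mean: {rmse(ens_mean):.3f} "
      f"({gain:.1f}% lower RMSE)")   # ~29% with fully independent errors
```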


2011 ◽  
Vol 4 (3) ◽  
pp. 797-807 ◽  
Author(s):  
D. F. Tang ◽  
S. Dobbie

Abstract. In a previous paper we described a new technique for automatically generating parameterisations using a program called iGen. iGen generates parameterisations by analysing the source code of a high-resolution model that resolves the physics to be parameterised. In order to demonstrate that this technique scales up to models of realistic complexity, we have used iGen to generate a parameterisation of entrainment in marine stratocumulus. We describe how iGen was used to analyse the source code of an eddy-resolving model (ERM) and generate a parameterisation of entrainment velocity in marine stratocumulus in terms of the large-scale state of the boundary layer. The parameterisation was tested against results from the DYCOMS-II intercomparison: iGen's parameterisation gave a mean entrainment velocity of 5.27 × 10⁻³ ± 0.62 × 10⁻³ m s⁻¹, compared to 5.2 × 10⁻³ ± 0.8 × 10⁻³ m s⁻¹ for the DYCOMS-II ensemble of large-eddy simulation (LES) models.

