An automatic and effective parameter optimization method for model tuning

2015 ◽  
Vol 8 (11) ◽  
pp. 3579-3591 ◽  
Author(s):  
T. Zhang ◽  
L. Li ◽  
Y. Lin ◽  
W. Xue ◽  
F. Xie ◽  
...  

Abstract. Physical parameterizations in general circulation models (GCMs) contain various uncertain parameters that greatly impact model performance and model climate sensitivity. Traditional manual and empirical tuning of these parameters is time-consuming and ineffective. In this study, a "three-step" methodology is proposed to automatically and effectively obtain the optimum combination of some key parameters in cloud and convective parameterizations according to a comprehensive set of objective evaluation metrics. Unlike traditional optimization methods, two extra steps, one determining the model's sensitivity to the parameters and the other choosing the optimum initial values for those sensitive parameters, are introduced before the downhill simplex method. This new method reduces the number of parameters to be tuned and accelerates the convergence of the downhill simplex method. Atmospheric GCM simulation results show that the optimum combination of these parameters determined using this method improves the model's overall performance by 9 %. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially the unavoidable comprehensive parameter tuning during the model development stage.
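The three-step workflow described above (sensitivity screening, initial-value selection, then the downhill simplex method) can be sketched roughly as follows. The objective function, bounds, and threshold here are illustrative stand-ins, not the paper's actual evaluation metric or GCM parameters:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical objective: a scalar model-performance metric to minimize,
# standing in for the paper's comprehensive evaluation metric.
def performance_metric(params):
    return np.sum((params - np.array([0.3, 0.7, 0.5])) ** 2)

def three_step_tune(objective, defaults, bounds, sens_threshold=1e-3):
    """Sketch of a 'three-step' tuning workflow:
    1) one-at-a-time sensitivity screening,
    2) coarse scan for a good starting value of each sensitive parameter,
    3) downhill simplex (Nelder-Mead) on the reduced parameter set."""
    defaults = np.asarray(defaults, dtype=float)

    # Step 1: perturb each parameter to its bounds; keep only those that
    # change the objective noticeably, shrinking the tuning problem.
    base = objective(defaults)
    sensitive = []
    for i, (lo, hi) in enumerate(bounds):
        for v in (lo, hi):
            trial = defaults.copy()
            trial[i] = v
            if abs(objective(trial) - base) > sens_threshold:
                sensitive.append(i)
                break

    # Step 2: coarse one-dimensional scan per sensitive parameter to pick
    # a good initial value, accelerating simplex convergence.
    start = defaults.copy()
    for i in sensitive:
        lo, hi = bounds[i]
        cand = np.linspace(lo, hi, 5)
        scores = []
        for v in cand:
            trial = start.copy()
            trial[i] = v
            scores.append(objective(trial))
        start[i] = cand[int(np.argmin(scores))]

    # Step 3: downhill simplex over the sensitive subset only.
    def reduced(x):
        full = start.copy()
        full[sensitive] = x
        return objective(full)

    res = minimize(reduced, start[sensitive], method="Nelder-Mead")
    tuned = start.copy()
    tuned[sensitive] = res.x
    return tuned, sensitive
```

In a real GCM setting each `objective` call is a full model integration, so the savings from steps 1 and 2 (fewer parameters, a better-placed initial simplex) dominate the cost of the screening runs.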


2017 ◽  
Vol 10 (7) ◽  
pp. 2547-2566 ◽  
Author(s):  
Keith D. Williams ◽  
Alejandro Bodas-Salcedo

Abstract. Most studies evaluating cloud in general circulation models present new diagnostic techniques or observational datasets, or apply a limited set of existing diagnostics to a number of models. In this study, we use a range of diagnostic techniques and observational datasets to provide a thorough evaluation of cloud, such as might be carried out during a model development process. The methodology is illustrated by analysing two configurations of the Met Office Unified Model – the configuration operational at the time of undertaking the study (Global Atmosphere 6, GA6) and the configuration which will underpin the United Kingdom's Earth System Model for CMIP6 (the Coupled Model Intercomparison Project 6), Global Atmosphere 7 (GA7). By undertaking a more comprehensive analysis which includes compositing techniques, comparing against a set of quite different observational instruments and evaluating the model across a range of timescales, the risks of drawing the wrong conclusions due to compensating model errors are minimized and a more accurate overall picture of model performance can be drawn. Overall, the two configurations analysed perform well, especially in terms of cloud amount. GA6 has excessive thin cirrus which is removed in GA7. The primary remaining errors in both configurations are the in-cloud albedos, which are too high in most Northern Hemisphere cloud types and sub-tropical stratocumulus, whilst the stratocumulus on the cold-air side of Southern Hemisphere cyclones has in-cloud albedos which are too low.


2021 ◽  
Author(s):  
Julie Deshayes

When comparing realistic simulations produced by two ocean general circulation models, differences may emerge from alternative choices in boundary conditions and forcings, which weakens our capacity to identify the actual differences between the two models (in the equations solved, the discretization schemes employed, and/or the parameterizations introduced). The use of idealised test cases (idealised configurations with analytical boundary conditions and forcings, solving a given set of equations) has proven efficient at revealing numerical bugs, determining the advantages and pitfalls of certain numerical choices, and highlighting remaining challenges. I propose to review the historical progress enabled by idealised test cases and to promote their use when assessing ocean dynamics as represented by an ocean model. For the latter, I will draw on examples from my own research using NEMO in various contexts. I also see idealised test cases as a promising training tool for inexperienced ocean modellers and an efficient way to broaden collaboration with experts in adjacent disciplines, such as mathematics, fluid dynamics and computer science.


2002 ◽  
Vol 39 (5) ◽  
pp. 1181-1192 ◽  
Author(s):  
Erick J Baziw

The seismic cone penetration test (SCPT) has proven to be a very valuable geotechnical tool in facilitating the determination of low-strain (<10⁻⁴ %) in situ compression (P) and shear (S) wave velocities. The P- and S-wave velocities are directly related to the soil elastic constants of Poisson's ratio, shear modulus, bulk modulus, and Young's modulus. The accurate determination of P- and S-wave velocities from the recorded seismic cone time series is of paramount importance to the evaluation of reliable elastic constants. Furthermore, since the shear and compression wave velocities are squared in deriving the elastic constants, small variations in the estimated velocities can cause appreciable errors. The standard techniques implemented in deriving SCPT interval velocities rely upon obtaining reference P- and S-wave arrival times as the probe is advanced into the soil profile. By assuming a straight ray travel path from the source to the SCPT seismic receiver and calculating the relative reference arrival time differences, interval SCPT velocities are obtained. The forward modeling – downhill simplex method (FMDSM) outlined in this paper offers distinct advantages over conventional SCPT velocity profile estimation methods. Some of these advantages consist of the allowance of ray path refraction, greater sophistication in interval velocity determination, incorporation of measurement weights, and meaningful interval velocity accuracy estimators.
Key words: seismic cone penetration testing (SCPT), downhill simplex method (DSM), forward modeling, Fermat's principle, weighted least squares (l2 norm), cost function.
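As an illustration of the forward-modelling/downhill-simplex idea (simplified here to straight rays only; the paper's FMDSM additionally allows ray refraction via Fermat's principle), the following sketch inverts synthetic arrival times for interval velocities by minimizing a weighted least-squares (l2) cost with the Nelder-Mead simplex. The geometry and velocity values are assumed for demonstration, not taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Assumed survey geometry for the sketch.
OFFSET = 0.6                                  # source horizontal offset (m)
DEPTHS = np.array([2.0, 3.0, 4.0, 5.0])       # receiver depths (m)
TOPS = np.concatenate(([0.0], DEPTHS[:-1]))   # layer top depths (m)

def forward_times(vels):
    """Straight-ray travel time from the surface source to each receiver,
    with the ray crossing each layer at that layer's interval velocity."""
    times = []
    for d in DEPTHS:
        slant = np.hypot(OFFSET, d)            # straight-ray path length
        t = 0.0
        for top, bot, v in zip(TOPS, DEPTHS, vels):
            dz = max(0.0, min(bot, d) - top)   # layer thickness crossed
            t += (slant * dz / d) / v          # segment length / velocity
        times.append(t)
    return np.array(times)

def invert(observed, weights, v0):
    """Minimize the weighted l2 misfit between modelled and observed
    arrival times with the downhill simplex (Nelder-Mead) method."""
    cost = lambda v: np.sum(weights * (forward_times(v) - observed) ** 2)
    res = minimize(cost, v0, method="Nelder-Mead",
                   options={"xatol": 1e-6, "fatol": 1e-12, "maxiter": 5000})
    return res.x
```

The measurement `weights` give the role that the paper assigns to measurement weighting: noisier arrival picks contribute less to the cost function, and the forward model can be made as sophisticated as desired (e.g. refracted ray paths) without changing the simplex machinery.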


2011 ◽  
Vol 50 (8) ◽  
pp. 1666-1675 ◽  
Author(s):  
Satoru Yokoi ◽  
Yukari N. Takayabu ◽  
Kazuaki Nishii ◽  
Hisashi Nakamura ◽  
Hirokazu Endo ◽  
...  

Abstract. The overall performance of general circulation models is often investigated on the basis of the synthesis of a number of scalar performance metrics of individual models that measure the reproducibility of diverse aspects of the climate. Because of physical and dynamic constraints governing the climate, a model’s performance in simulating a certain aspect of the climate is sometimes related closely to that in simulating another aspect, which results in significant intermodel correlation between performance metrics. Numerous metrics and intermodel correlations may cause a problem in understanding the evaluation and synthesizing the metrics. One possible way to alleviate this problem is to group the correlated metrics beforehand. This study attempts to use simple cluster analysis to group 43 performance metrics. Two clustering methods, the K-means and the Ward methods, yield considerably similar clustering results, and several aspects of the results are found to be physically and dynamically reasonable. Furthermore, the intermodel correlation between the cluster averages is considerably lower than that between the metrics. These results suggest that the cluster analysis is helpful in obtaining the appropriate grouping. Applications of the clustering results are also discussed.
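A minimal sketch of the grouping idea, using synthetic data in place of the study's 43 metrics: metric values built from shared underlying factors (mimicking physically linked aspects of the climate) are standardized across models and then clustered with both Ward's method and K-means, so that intermodel-correlated metrics land in the same cluster. All sizes, noise levels, and the number of clusters are assumptions of this sketch:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)

# Synthetic stand-in: rows = models, columns = performance metrics.
# Metrics within a group share a common factor, so they correlate
# strongly across models, as the abstract describes.
n_models, group_size, n_groups = 24, 5, 3
factors = rng.normal(size=(n_models, n_groups))   # underlying "aspects"
metrics = np.hstack([
    factors[:, [g]] + 0.3 * rng.normal(size=(n_models, group_size))
    for g in range(n_groups)
])                                                # shape (24, 15)

# Standardize each metric across models, then cluster the metric
# vectors (columns) rather than the models.
z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
cols = z.T                                        # one row per metric

# Ward's hierarchical method, cut at n_groups clusters.
ward_labels = fcluster(linkage(cols, method="ward"),
                       t=n_groups, criterion="maxclust")

# K-means on the same metric vectors, k-means++ initialization.
_, km_labels = kmeans2(cols, n_groups, seed=0, minit="++")
```

On data with planted group structure this strong, the two methods should agree with each other and with the planted grouping, echoing the paper's finding that K-means and Ward give considerably similar results on the real metrics.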

