How Much Should Climate Model Output Be Smoothed in Space?

2011 · Vol. 24 (3) · pp. 867-880
Author(s): Jouni Räisänen, Jussi S. Ylhäisi

Abstract. The general decrease in the quality of climate model output with decreasing scale suggests a need for spatial smoothing to suppress the most unreliable small-scale features. However, even if correctly simulated, a large-scale average retained by the smoothing may not be representative of the local conditions, which are of primary interest in many impact studies. Here, the authors study this trade-off using simulations of temperature and precipitation by 24 climate models within the Third Coupled Model Intercomparison Project, to find the scale of smoothing at which the mean-square difference between smoothed model output and gridbox-scale reality is minimized. This is done for present-day time mean climate, recent temperature trends, and projections of future climate change, using cross validation between the models for the latter. The optimal scale depends strongly on the number of models used, being much smaller for multimodel means than for individual model simulations. It also depends on the variable considered and, in the case of climate change projections, the time horizon. For multimodel-mean climate change projections for the late twenty-first century, only very slight smoothing appears to be beneficial, and the resulting potential improvement is negligible for practical purposes. The use of smoothing as a means to improve the sampling for probabilistic climate change projections is also briefly explored.
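The trade-off described here can be made concrete with a short numerical experiment. The sketch below (illustrative only, not the authors' code) smooths a synthetic model field at several scales and picks the scale that minimizes the mean-square difference to a gridbox-scale reference; the fields, grid size, and Gaussian-filter scales are all assumptions for the example.

```python
# A minimal sketch of the smoothing-scale trade-off: smooth a model field at
# increasing spatial scales and find the scale minimizing the mean-square
# difference to gridbox-scale "reality". All data here are synthetic.
import numpy as np
from scipy.ndimage import gaussian_filter

def optimal_smoothing_scale(model_field, reference_field, sigmas):
    """Return the smoothing scale (in grid boxes) that minimizes the MSE."""
    errors = []
    for sigma in sigmas:
        smoothed = gaussian_filter(model_field, sigma=sigma)
        errors.append(np.mean((smoothed - reference_field) ** 2))
    return sigmas[int(np.argmin(errors))], errors

rng = np.random.default_rng(0)
truth = rng.standard_normal((72, 144))           # "reality" on a hypothetical grid
model = truth + rng.standard_normal((72, 144))   # model = truth + small-scale noise
best_sigma, mse = optimal_smoothing_scale(model, truth, sigmas=[0, 1, 2, 4, 8])
print(f"MSE is minimized at sigma = {best_sigma} grid boxes")
```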

2021
Author(s): Michael Steininger, Daniel Abel, Katrin Ziegler, Anna Krause, Heiko Paeth, ...

Climate models are an important tool for the assessment of prospective climate change effects but they suffer from systematic and representation errors, especially for precipitation. Model output statistics (MOS) reduce these errors by fitting the model output to observational data with machine learning. In this work, we explore the feasibility and potential of deep learning with convolutional neural networks (CNNs) for MOS. We propose the CNN architecture ConvMOS specifically designed for reducing errors in climate model outputs and apply it to the climate model REMO. Our results show a considerable reduction of errors and mostly improved performance compared to three commonly used MOS approaches.
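As an illustration of CNN-based MOS, the following PyTorch sketch maps a stack of gridded model predictors to a corrected precipitation field on the same grid. It is a minimal stand-in, not the actual ConvMOS architecture; the number of predictors, layer sizes, and the residual (additive-correction) design are assumptions.

```python
# A minimal sketch of a CNN for gridded model output statistics (MOS):
# several model fields in, one corrected field out, same grid.
import torch
import torch.nn as nn

class SimpleMOSNet(nn.Module):
    def __init__(self, n_predictors: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_predictors, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):
        # Predict an additive correction to the first channel (precipitation),
        # so the network only has to learn the model's systematic errors.
        return x[:, :1] + self.net(x)

model = SimpleMOSNet()
fields = torch.randn(8, 4, 64, 64)   # batch of 8 samples, 4 predictors, 64x64 grid
corrected = model(fields)            # corrected field, shape (8, 1, 64, 64)
```

Predicting a correction rather than the raw field is a common design choice for MOS-style networks, since the errors are typically a smaller, easier target than the full signal.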


2014 · Vol. 9 (11) · pp. 115008
Author(s): G Lenderink, B J J M van den Hurk, A M G Klein Tank, G J van Oldenborgh, E van Meijgaard, ...

2013 · Vol. 7 (2) · pp. 181-198
Author(s): Maximilian Auffhammer, Solomon M. Hsiang, Wolfram Schlenker, Adam Sobel

2021 · Vol. 14 (1) · pp. 107-124
Author(s): Karthik Kashinath, Mayur Mudigonda, Sol Kim, Lukas Kapp-Schwoerer, ...

Abstract. Identifying, detecting, and localizing extreme weather events is a crucial first step in understanding how they may vary under different climate change scenarios. Pattern recognition tasks such as classification, object detection, and segmentation (i.e., pixel-level classification) have remained challenging problems in the weather and climate sciences. While there exist many empirical heuristics for detecting extreme events, the disparities between the output of these different methods even for a single event are large and often difficult to reconcile. Given the success of deep learning (DL) in tackling similar problems in computer vision, we advocate a DL-based approach. DL, however, works best in the context of supervised learning – when labeled datasets are readily available. Reliable labeled training data for extreme weather and climate events is scarce. We create “ClimateNet” – an open, community-sourced human-expert-labeled curated dataset that captures tropical cyclones (TCs) and atmospheric rivers (ARs) in high-resolution climate model output from a simulation of a recent historical period. We use the curated ClimateNet dataset to train a state-of-the-art DL model for pixel-level identification – i.e., segmentation – of TCs and ARs. We then apply the trained DL model to historical and climate change scenarios simulated by the Community Atmospheric Model (CAM5.1) and show that the DL model accurately segments the data into TCs, ARs, or “the background” at a pixel level. Further, we show how the segmentation results can be used to conduct spatially and temporally precise analytics by quantifying distributions of extreme precipitation conditioned on event types (TC or AR) at regional scales. The key contribution of this work is that it paves the way for DL-based automated, high-fidelity, and highly precise analytics of climate data using a curated expert-labeled dataset – ClimateNet. ClimateNet and the DL-based segmentation method provide several unique capabilities: (i) they can be used to calculate a variety of TC and AR statistics at a fine-grained level; (ii) they can be applied to different climate scenarios and different datasets without tuning as they do not rely on threshold conditions; and (iii) the proposed DL method is suitable for rapidly analyzing large amounts of climate model output. While our study has been conducted for two important extreme weather patterns (TCs and ARs) in simulation datasets, we believe that this methodology can be applied to a much broader class of patterns and applied to observational and reanalysis data products via transfer learning.
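As a schematic of what pixel-level segmentation of climate fields looks like in code, the sketch below assigns each grid point to background, TC, or AR. It is a toy stand-in for illustration, not the paper's trained model; the channel count and layer sizes are assumptions, and a real application would train a far deeper network on the ClimateNet labels.

```python
# A minimal sketch of per-pixel (segmentation) classification of climate
# fields into three classes: 0 = background, 1 = TC, 2 = AR.
import torch
import torch.nn as nn

class TinySegmenter(nn.Module):
    def __init__(self, n_channels: int = 4, n_classes: int = 3):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(n_channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        )
        self.classify = nn.Conv2d(16, n_classes, 1)  # per-pixel class logits

    def forward(self, x):
        return self.classify(self.encode(x))

net = TinySegmenter()
fields = torch.randn(2, 4, 128, 256)   # e.g. moisture, wind, pressure channels
labels = net(fields).argmax(dim=1)     # (2, 128, 256) map of class indices
```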


2016 · Vol. 20 (2) · pp. 685-696
Author(s): E. P. Maurer, D. L. Ficklin, W. Wang

Abstract. Statistical downscaling is a commonly used technique for translating large-scale climate model output to a scale appropriate for assessing impacts. To ensure downscaled meteorology can be used in climate impact studies, downscaling must correct biases in the large-scale signal. A simple and generally effective method for accommodating systematic biases in large-scale model output is quantile mapping, which has been applied to many variables and shown to reduce biases on average, even in the presence of non-stationarity. Quantile-mapping bias correction has been applied at spatial scales ranging from hundreds of kilometers to individual points, such as weather station locations. Since water resources and other models used to simulate climate impacts are sensitive to biases in input meteorology, there is a motivation to apply bias correction at a scale fine enough that the downscaled data closely resemble historically observed data, though past work has identified undesirable consequences of applying quantile mapping at too fine a scale. This study explores the role of the spatial scale at which quantile-mapping bias correction is applied, in the context of estimating high and low daily streamflows across the western United States. We vary the spatial scale at which quantile-mapping bias correction is performed from 2° (∼200 km) to 1/8° (∼12 km) within a statistical downscaling procedure, and use the downscaled daily precipitation and temperature to drive a hydrology model. We find that little additional benefit is obtained, and some skill is degraded, when using quantile mapping at scales finer than approximately 0.5° (∼50 km). This can provide guidance to those applying the quantile-mapping bias correction method for hydrologic impacts analysis.
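For readers unfamiliar with the technique, a minimal empirical quantile-mapping sketch is shown below. It assumes 1-D time series for a single grid cell at the chosen correction scale, and the gamma-distributed synthetic data are purely illustrative.

```python
# A minimal sketch of empirical quantile-mapping bias correction.
import numpy as np

def quantile_map(model_hist, obs_hist, model_future, n_quantiles=100):
    """Map model values onto the observed distribution via empirical quantiles."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    model_q = np.quantile(model_hist, q)
    obs_q = np.quantile(obs_hist, q)
    # For each value to correct, find where it sits in the historical model
    # distribution and replace it with the observed value at that quantile.
    return np.interp(model_future, model_q, obs_q)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 3.0, size=10_000)     # "observed" daily precipitation
model = rng.gamma(2.0, 4.0, size=10_000)   # biased model climate
corrected = quantile_map(model, obs, model)
print(round(model.mean(), 1), round(obs.mean(), 1), round(corrected.mean(), 1))
```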


2013 · Vol. 726-731 · pp. 3451-3456
Author(s): Yu Zhi Shi, Hai Jiao Liu, Ming Yuan Fan, Ji Wen Huang

A single climate model carries considerable uncertainty when quantifying climate change. This paper proposes a new method, reliability ensemble averaging based on Bayesian weighted average (REA-BMA), to calculate a comprehensive climate change signal, and then combines it with the large-scale distributed watershed hydrologic cycling model SWAT to quantify the effect of future climate change on basin water resources. Data sets from 1961 to 2040 from four GCMs (HadCM3, CGCM3, BCCR, CSIRO) and three emission scenarios (A2, A1B and B2) are taken for the uncertainty analysis, and the Huntai River basin in China is selected as the study case. The results show that the proposed method can efficiently quantify the effect of climate change on the watershed hydrologic cycle.
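The general idea of reliability-weighted ensemble averaging can be sketched as follows. This is an illustrative simplification, not the paper's REA-BMA formulation: each model's projected change is weighted by how well the model reproduces an observed baseline, and all numbers are hypothetical.

```python
# A minimal sketch of reliability-weighted averaging of GCM projections:
# models with smaller baseline bias contribute more to the ensemble change.
import numpy as np

def weighted_ensemble_change(baselines, futures, observed_baseline):
    """Weight each model inversely by its baseline bias, then average changes."""
    changes = futures - baselines
    bias = np.abs(baselines - observed_baseline)
    weights = 1.0 / (bias + 1e-6)          # more reliable models get more weight
    weights /= weights.sum()
    return np.sum(weights * changes)

baselines = np.array([610.0, 580.0, 655.0, 590.0])   # mm/yr, 4 hypothetical GCMs
futures = np.array([640.0, 600.0, 700.0, 615.0])
print(weighted_ensemble_change(baselines, futures, observed_baseline=600.0))
```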


2009 · Vol. 6 (6) · pp. 7143-7178
Author(s): T. L. A. Driessen, R. T. W. L. Hurkmans, W. Terink, P. Hazenberg, P. J. J. F. Torfs, ...

Abstract. The Meuse is an important river in western Europe and almost exclusively rain-fed. Projected changes in precipitation characteristics due to climate change are therefore expected to have a considerable effect on the hydrological regime of the river Meuse. We focus on an important tributary of the Meuse, the Ourthe, measuring about 1600 km². The well-known hydrological model HBV is forced with three high-resolution (0.088°) regional climate scenarios, each based on one of three different IPCC CO2 emission scenarios: A1B, A2 and B1. To represent the current climate, a reference model run at the same resolution is used. Prior to running the hydrological model, the biases in the climate model output are investigated and corrected for. Different approaches to correcting the distributed climate model output using single-site observations are compared. Correcting the spatially averaged temperature and precipitation is found to give the best results, but large differences between observations and simulations still exist. The bias-corrected data are then used to force HBV. Results indicate a small increase in overall discharge, especially for the B1 scenario, during the beginning of the 21st century. Towards the end of the century, all scenarios show a decrease in summer discharge, partially because of the diminished buffering effect of the snow pack, and an increase in winter discharge. It should be stressed, however, that we used results from only one GCM (the only one available at such a high resolution). It would be interesting to repeat the analysis with multiple models.
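The correction strategy the authors found to work best, adjusting the spatially averaged series rather than each cell independently, can be sketched as below. The multiplicative single-factor form, the field shapes, and the synthetic data are all assumptions for illustration, not the paper's exact procedure.

```python
# A minimal sketch of bias-correcting distributed model precipitation so that
# its spatial mean matches a single-site observed series (illustrative only).
import numpy as np

def correct_spatial_mean(model_fields, obs_series):
    """model_fields: (time, lat, lon) precipitation; obs_series: (time,)."""
    spatial_mean = model_fields.mean(axis=(1, 2))
    factor = obs_series.mean() / spatial_mean.mean()   # one bulk correction factor
    return model_fields * factor                       # applied to every grid cell

rng = np.random.default_rng(2)
fields = rng.gamma(2.0, 2.0, size=(365, 10, 10))       # daily fields, 10x10 grid
obs = rng.gamma(2.0, 1.6, size=365)                    # gauge observations
corrected = correct_spatial_mean(fields, obs)
print(round(corrected.mean(axis=(1, 2)).mean(), 2), round(obs.mean(), 2))
```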


Author(s): Syed Rouhullah Ali, Junaid N. Khan, Mehraj U. Din Dar, Shakeel Ahmad Bhat, Syed Midhat Fazil, ...

Aims: The study aimed at modeling climate change projections for the Ferozpur subcatchment of the Jhelum sub-basin of the Kashmir Valley using the SDSM model. Study Design: The study was carried out over three time slices, viz. Baseline (1985-2015), Mid-century (2030-2059) and End-century (2070-2099). Place and Duration of Study: Division of Agricultural Engineering, SKUAST-K, Shalimar, between August 2015 and July 2016. Methodology: The statistical downscaling model (SDSM) was applied to downscale weather files (Tmax, Tmin and precipitation). The study includes calibration of the SDSM model using observed daily climate data (Tmax, Tmin and precipitation) for thirty-one years together with large-scale atmospheric variables from the National Centers for Environmental Prediction (NCEP) reanalysis, validation of the model, and downscaled outputs for scenario A2 of the Global Climate Model (GCM) data of the Hadley Centre Coupled Model, Version 3 (HadCM3) for the future. Daily climate (Tmax, Tmin and precipitation) scenarios were generated from 1961 to 2099 under the A2 scenario defined by the Intergovernmental Panel on Climate Change (IPCC). Results: The results showed that temperature and precipitation would increase by 0.29°C and 255.38 mm (30.97%) during Mid-century (2030-2059), and by 0.67°C and 233.28 mm (28.29%) during End-century (2070-2099), respectively. Conclusion: The climate projections for the 21st century under the A2 scenario indicate that both mean annual temperature and precipitation show an increasing trend.
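The regression step at the heart of statistical downscaling can be illustrated as follows. This is a hedged sketch, not SDSM itself: the predictors, coefficients, and synthetic station series are invented for the example, and SDSM additionally includes stochastic weather-generator components not shown here.

```python
# A minimal sketch of regression-based statistical downscaling: fit local Tmax
# to large-scale (e.g. NCEP reanalysis) predictors over the baseline period,
# then apply the fitted relationship to GCM predictors for a future period.
import numpy as np

rng = np.random.default_rng(3)
ncep_predictors = rng.standard_normal((11_315, 5))     # ~31 yr of daily predictors
local_tmax = (ncep_predictors @ np.array([2.0, -0.5, 1.0, 0.3, 0.1])
              + 15.0 + rng.standard_normal(11_315))    # synthetic station series

# Least-squares fit of the downscaling relationship (with intercept).
X = np.column_stack([np.ones(len(ncep_predictors)), ncep_predictors])
coef, *_ = np.linalg.lstsq(X, local_tmax, rcond=None)

gcm_predictors = rng.standard_normal((365, 5))         # one hypothetical future year
Xf = np.column_stack([np.ones(len(gcm_predictors)), gcm_predictors])
downscaled_tmax = Xf @ coef                            # daily downscaled scenario
print(downscaled_tmax[:3].round(1))
```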

