A High-Resolution Climate Model for the U.S. Pacific Northwest: Mesoscale Feedbacks and Local Responses to Climate Change*

2008 ◽  
Vol 21 (21) ◽  
pp. 5708-5726 ◽  
Author(s):  
Eric P. Salathé ◽  
Richard Steed ◽  
Clifford F. Mass ◽  
Patrick H. Zahn

Abstract Simulations of future climate scenarios produced with a high-resolution climate model show markedly different trends in temperature and precipitation over the Pacific Northwest than in the global model in which it is nested, apparently because mesoscale processes are not resolved at coarse resolution. Present-day (1990–99) and future (2020–29, 2045–54, and 2090–99) conditions are simulated at high resolution (15-km grid spacing) using the fifth-generation Pennsylvania State University–NCAR Mesoscale Model (MM5), forced by ECHAM5 global simulations. Simulations use the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) A2 emissions scenario, which assumes a rapid increase in greenhouse gas concentrations. The mesoscale simulations produce regional alterations in snow cover, cloudiness, and circulation patterns associated with interactions between the large-scale climate change and the regional topography and land–water contrasts. These changes substantially alter the temperature and precipitation trends over the region relative to the global model result or statistical downscaling. Warming is significantly amplified through the snow–albedo feedback in regions where snow cover is lost. Increased onshore flow in the spring reduces the daytime warming along the coast. Precipitation increases in autumn are amplified over topography because of changes in the large-scale circulation and its interaction with the terrain. The robustness of the modeling results is established through comparisons with the observed and simulated seasonal variability and with statistical downscaling results.

2014 ◽  
Vol 15 (4) ◽  
pp. 1517-1531 ◽  
Author(s):  
Gerhard Smiatek ◽  
Harald Kunstmann ◽  
Andreas Heckl

Abstract The impact of climate change on the future water availability of the upper Jordan River (UJR) and its tributaries Dan, Snir, and Hermon, located in the eastern Mediterranean, is evaluated with a highly resolved distributed approach: the fifth-generation Pennsylvania State University–NCAR Mesoscale Model (MM5), run at 18.6- and 6.2-km resolution, coupled offline with the Water Flow and Balance Simulation Model (WaSiM). The MM5 was driven with NCEP reanalysis data for 1971–2000 and with Hadley Centre Coupled Model, version 3 (HadCM3), GCM forcings for 1971–2099. Because only one regional–global climate model combination was applied, the results may not span the full range of possible future projections. To describe the behavior of the Dan spring, the hydrological model was extended by a bypass approach that allows the fast discharge components of the Snir to enter the Dan catchment. Simulation results for the period 1976–2000 reveal that the coupled system was able to reproduce the observed discharge rates in the partially karstic, complex terrain to a reasonable extent, but only with the high-resolution 6.2-km meteorological input. The future climate simulations show steadily rising temperatures, 2.2 K above the 1976–2000 mean for the period 2031–60 and 3.5 K for the period 2070–99. Precipitation trends are insignificant until the middle of the century, although a decrease of approximately 12% is simulated. For the end of the century, a reduction in rainfall of between 10% and 35% can be expected. Discharge in the UJR is simulated to decrease by 12% until 2060 and by 26% until 2099, both relative to the 1976–2000 mean. The discharge decrease is associated with a lower number of high river flow years.


2018 ◽  
Vol 99 (4) ◽  
pp. 791-803 ◽  
Author(s):  
John R. Lanzante ◽  
Keith W. Dixon ◽  
Mary Jo Nath ◽  
Carolyn E. Whitlock ◽  
Dennis Adams-Smith

Abstract Statistical downscaling (SD) is commonly used to provide information for the assessment of climate change impacts. Using as input the output from large-scale dynamical climate models and observation-based data products, SD aims to provide a finer grain of detail and to mitigate systematic biases. It is generally recognized as providing added value. However, one of the key assumptions of SD is that the relationships used to train the method during a historical period remain unchanged in the future, in the face of climate change. The validity of this assumption is typically quite difficult to assess in the normal course of analysis, as observations of future climate are lacking. We approach this problem using a "perfect model" experimental design in which high-resolution dynamical climate model output is used as a surrogate for both past and future observations. We find that while SD in general adds considerable value, in certain well-defined circumstances it can produce highly erroneous results. Furthermore, the breakdown of SD in these contexts could not be foreshadowed during the typical course of evaluation based only on available historical data. We diagnose and explain the reasons for these failures in terms of physical, statistical, and methodological causes. These findings highlight the need for caution in the use of statistically downscaled products and the need for further research to consider other hitherto unknown pitfalls, perhaps utilizing more advanced perfect model designs than the one we have employed.
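The perfect-model idea can be pictured with a minimal sketch: high-resolution model output stands in for observations, so a downscaling relation trained on the historical period can be scored against a known future "truth". The array names, the single grid point, and the linear transfer function below are illustrative assumptions, not the method or data of the study.

```python
# Minimal "perfect model" check: high-resolution model output serves as a
# surrogate for observations, so the downscaling transfer function can be
# verified under a changed climate. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)

# Surrogate "truth": high-resolution daily temperature at one grid point.
hires_hist = 10 + 5 * rng.standard_normal(3650)        # historical decade
hires_future = hires_hist + 3.0                         # warmed future decade

# Coarse-model predictor co-located with the point (truth plus a smooth bias).
coarse_hist = 0.8 * hires_hist + 2.0 + rng.standard_normal(3650)
coarse_future = 0.8 * hires_future + 2.0 + rng.standard_normal(3650)

# Train the statistical downscaling relation on the historical period only.
slope, intercept = np.polyfit(coarse_hist, hires_hist, deg=1)

# Apply the frozen relation to the future and score it against the surrogate truth.
sd_future = slope * coarse_future + intercept
rmse = np.sqrt(np.mean((sd_future - hires_future) ** 2))
print(f"future RMSE of downscaled estimate: {rmse:.2f} K")
```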


2012 ◽  
Vol 13 (6) ◽  
pp. 1925-1938 ◽  
Author(s):  
Rogier van der Velde ◽  
Mhd. Suhyb Salama ◽  
Marcel D. van Helvoirt ◽  
Zhongbo Su ◽  
Yaoming Ma

Abstract Understanding the sources of uncertainty that cause deviations between simulated and satellite-observed states can facilitate optimal usage of these products via data assimilation or calibration techniques. A method is presented for separating uncertainties arising from (i) scale differences between the model grid and the satellite footprint, (ii) residuals inherent to imperfect model and retrieval applications, and (iii) biases in the climatologies of simulations and retrievals. The method is applied to coarse (10 km) soil moisture simulations by the fifth-generation Pennsylvania State University–National Center for Atmospheric Research Mesoscale Model (MM5)–Noah regional climate model and 2.5 years of high-resolution (100 m) retrievals from Advanced Synthetic Aperture Radar (ASAR) data collected over central Tibet. Suppression of the bias is performed via cumulative distribution function (CDF) matching. The other deviations are separated by taking the variance of the ASAR soil moisture within the coarse MM5 model grid as a measure of the deviations caused by scale differences. Via decomposition of the uncertainty sources it is shown that the bias and the spatial-scale difference explain the majority (>70%) of the deviations between the two products, whereas the contribution of model–observation residuals is less than 30% on a monthly basis. Consequently, this study demonstrates that accounting for uncertainties caused by bias as well as spatial-scale difference is imperative for meaningful assimilation of high-resolution soil moisture products. On the other hand, the large uncertainties arising from spatial-scale differences suggest that high-resolution soil moisture products have the potential to provide observation-based input for subgrid spatial variability parameterizations within large-scale models.
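A minimal numerical sketch of this separation, using synthetic data for a single coarse grid cell: the bias is removed by empirical CDF matching, the scale-difference term is the spatial variance of the high-resolution retrievals inside the cell, and the residual term is the remaining mismatch with the cell mean. The arrays and the simple empirical CDF matching below are assumptions for illustration, not the study's data or exact implementation.

```python
# Separation of deviation sources for one coarse grid cell (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n_t, n_pix = 200, 100

# High-resolution retrievals inside the cell (time x pixel) and a biased
# coarse simulation of the cell-mean soil moisture.
fine_obs = np.clip(0.25 + 0.05 * rng.standard_normal((n_t, n_pix)), 0.0, 0.5)
obs_mean = fine_obs.mean(axis=1)
coarse_sim = 0.9 * obs_mean + 0.05 + 0.02 * rng.standard_normal(n_t)

# Bias removal via empirical CDF matching: map each simulated value onto the
# observed cell-mean distribution at the same non-exceedance probability.
ranks = np.searchsorted(np.sort(coarse_sim), coarse_sim) / (n_t - 1)
sim_matched = np.quantile(obs_mean, ranks)

# Deviations between the bias-corrected simulation and every retrieval pixel
# split exactly into a sub-grid (scale-difference) part and a residual part.
dev = sim_matched[:, None] - fine_obs
msd_total = np.mean(dev ** 2)
var_scale = fine_obs.var(axis=1).mean()            # spatial variance inside the cell
msd_residual = np.mean((sim_matched - obs_mean) ** 2)
print(f"total {msd_total:.5f} = scale {var_scale:.5f} + residual {msd_residual:.5f}")
```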


2018 ◽  
Vol 11 (1) ◽  
pp. 453-466
Author(s):  
Aurélien Quiquet ◽  
Didier M. Roche ◽  
Christophe Dumas ◽  
Didier Paillard

Abstract. This paper presents the inclusion of an online dynamical downscaling of temperature and precipitation within the model of intermediate complexity iLOVECLIM v1.1. We describe the methodology used to generate temperature and precipitation fields on a 40 km × 40 km Cartesian grid of the Northern Hemisphere from the native T21 atmospheric model grid. Our scheme is not grid specific and conserves energy and moisture in the same way as the original climate model. We show that we are able to generate high-resolution fields whose spatial variability is in better agreement with observations than that of the standard model. Although the large-scale model biases are not corrected, for selected model parameters the downscaling can yield better overall performance than the standard version on both the high-resolution grid and the native grid. Foreseen applications of this new model feature include improved coupling to ice sheet models and to high-resolution land surface models.
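One way to picture the conservation constraint mentioned above is a refinement that redistributes each coarse-cell value over its fine sub-cells and then rescales so the coarse-cell mean is preserved. The sketch below is a generic illustration under that assumption, with a hypothetical orographic weighting; it is not the iLOVECLIM v1.1 scheme itself.

```python
# Conservative refinement sketch: redistribute each coarse-cell value over an
# n x n block of fine cells with positive weights, rescaled so the block mean
# equals the original coarse value (i.e., the coarse-grid total is conserved).
import numpy as np

def refine_conservative(coarse, weights, n=4):
    """coarse: (ny, nx) field; weights: (ny*n, nx*n) positive weights."""
    ny, nx = coarse.shape
    w = weights.reshape(ny, n, nx, n)
    w = w / w.mean(axis=(1, 3), keepdims=True)   # unit mean per coarse cell
    fine = coarse[:, None, :, None] * w          # redistribute within each block
    return fine.reshape(ny * n, nx * n)

rng = np.random.default_rng(2)
precip_coarse = rng.gamma(2.0, 1.5, size=(8, 16))          # mm/day on a coarse grid
elev_weight = 1.0 + 0.5 * rng.random((8 * 4, 16 * 4))      # hypothetical orographic weight
precip_fine = refine_conservative(precip_coarse, elev_weight)

# Conservation check: block means of the fine field reproduce the coarse field.
assert np.allclose(precip_fine.reshape(8, 4, 16, 4).mean(axis=(1, 3)), precip_coarse)
```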


2018 ◽  
Vol 31 (6) ◽  
pp. 2093-2113 ◽  
Author(s):  
Justin R. Minder ◽  
Theodore W. Letcher ◽  
Changhai Liu

The character and causes of elevation-dependent warming (EDW) of surface temperatures are examined in a suite of high-resolution (4 km) regional climate model (RCM) simulations of climate change over the Rocky Mountains using the Weather Research and Forecasting Model. A clear EDW signal is found over the region, with warming enhanced in certain elevation bands by as much as 2°C. During some months warming maximizes at middle elevations, whereas during others it increases monotonically with elevation or is nearly independent of elevation. Simulated EDW is primarily caused by the snow albedo feedback (SAF). Warming maximizes in regions of maximum snow loss and albedo reduction. The role of the SAF is confirmed by sensitivity experiments wherein the SAF is artificially suppressed. The elevation dependence of free-tropospheric warming appears to play a secondary role in shaping EDW. No evidence is found for a contribution from elevation-dependent water vapor feedbacks. Sensitivity experiments show that EDW depends strongly on certain aspects of RCM configuration. Simulations using 4- and 12-km horizontal grid spacings show similar EDW signals, but substantial differences are found when using a grid spacing of 36 km due to the influence of terrain resolution on snow cover and the SAF. Simulations using the Noah and Noah-MP land surface models (LSMs) exhibit large differences in EDW. These are caused by differences between LSMs in their representations of midelevation snow extent and in their parameterization of subpixel fractional snow cover. These lead to albedo differences that act to modulate the simulated SAF and its effect on EDW.
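A simple diagnostic in the spirit of this analysis is to bin the simulated warming and the change in surface albedo by elevation band and look for a collocated maximum. The data below are synthetic and the variable names (elev, dT, dalbedo) are placeholders for RCM output, not results from the study.

```python
# Bin warming and albedo change by elevation band (synthetic illustration).
import numpy as np

rng = np.random.default_rng(3)
elev = rng.uniform(1000, 4000, size=20000)                     # grid-point elevation (m)
snow_loss = np.exp(-((elev - 2500) ** 2) / (2 * 400 ** 2))     # synthetic mid-elevation snow loss
dalbedo = -0.3 * snow_loss + 0.02 * rng.standard_normal(elev.size)
dT = 3.0 + 4.0 * snow_loss + 0.3 * rng.standard_normal(elev.size)

bands = np.arange(1000, 4001, 250)
idx = np.digitize(elev, bands) - 1
for b in range(len(bands) - 1):
    sel = idx == b
    print(f"{bands[b]:4d}-{bands[b + 1]:4d} m: dT = {dT[sel].mean():4.2f} K, "
          f"d(albedo) = {dalbedo[sel].mean():+.3f}")
```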


Author(s):  
Syed Rouhullah Ali ◽  
Junaid N. Khan ◽  
Mehraj U. Din Dar ◽  
Shakeel Ahmad Bhat ◽  
Syed Midhat Fazil ◽  
...  

Aims: The study aimed at modeling climate change projections for the Ferozpur subcatchment of the Jhelum sub-basin of the Kashmir Valley using the SDSM model. Study Design: The study was carried out for three different time slices, viz. baseline (1985-2015), mid-century (2030-2059), and end-century (2070-2099). Place and Duration of Study: Division of Agricultural Engineering, SKUAST-K, Shalimar, between August 2015 and July 2016. Methodology: The statistical downscaling model (SDSM) was applied to downscale the weather variables (Tmax, Tmin, and precipitation). The study includes the calibration of the SDSM model using observed daily climate data (Tmax, Tmin, and precipitation) for thirty-one years together with large-scale atmospheric variables from the National Centers for Environmental Prediction (NCEP) reanalysis, the validation of the model, and the downscaling of future A2 scenario outputs of the Hadley Centre Coupled Model, version 3 (HadCM3) global climate model (GCM). Daily climate (Tmax, Tmin, and precipitation) scenarios were generated from 1961 to 2099 under the A2 scenario defined by the Intergovernmental Panel on Climate Change (IPCC). Results: The results showed that temperature and precipitation would increase by 0.29°C and 255.38 mm (30.97%) during the mid-century period (2030-2059) and by 0.67°C and 233.28 mm (28.29%) during the end-century period (2070-2099), respectively. Conclusion: The climate projections for the 21st century under the A2 scenario indicated that both mean annual temperature and precipitation show an increasing trend.
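The calibration step described above amounts to fitting a transfer function between large-scale predictors and the local daily variable, which is then driven by GCM predictors to generate scenarios. The sketch below shows a simplified regression-style version under assumed predictor names; SDSM itself adds predictor screening, conditional (wet-day) processes, and a stochastic weather generator.

```python
# Simplified regression-based downscaling: calibrate local Tmax against
# large-scale reanalysis predictors, then apply the relation to GCM predictors.
# Predictor names, coefficients, and the single-regression setup are assumptions.
import numpy as np

rng = np.random.default_rng(4)
n_days = 31 * 365

# Hypothetical standardized NCEP predictors: 850-hPa temperature, mean sea
# level pressure, specific humidity.
ncep = rng.standard_normal((n_days, 3))
tmax_obs = 15 + ncep @ np.array([3.0, -1.0, 0.8]) + rng.standard_normal(n_days)

# Calibration: ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n_days), ncep])
coef, *_ = np.linalg.lstsq(X, tmax_obs, rcond=None)

# Scenario generation: apply the relation to (hypothetical) HadCM3 A2 predictors.
hadcm3_a2 = rng.standard_normal((n_days, 3)) + np.array([0.5, 0.0, 0.2])
tmax_a2 = np.column_stack([np.ones(n_days), hadcm3_a2]) @ coef
print(f"calibration-period mean Tmax: {tmax_obs.mean():.2f} °C")
print(f"A2 scenario mean Tmax:        {tmax_a2.mean():.2f} °C")
```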


2021 ◽  
Vol 26 (1) ◽  
pp. 16-27
Author(s):  
Dibas Shrestha ◽  
Shankar Sharma ◽  
Sandeep Bhandari ◽  
Rashila Deshar

Understanding the present and future spatial and temporal variations of precipitation and temperature is important for monitoring climate-induced disasters. Satellite and global reanalysis data can provide evenly distributed climate data; however, they are still too coarse to resolve fundamental processes over complex terrain. The study applies the global climate model CGCM4/CanESM2 to project future maximum temperature, minimum temperature, and precipitation across a cross-section of the Gandaki River basin, Nepal. Large-scale atmospheric variables from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis are downscaled using the Statistical Downscaling Model (SDSM) under different emission scenarios. For the variability and changes in maximum temperature (Tmax), minimum temperature (Tmin), and precipitation in future periods (2020s, 2050s, and 2080s), three scenarios of the CGCM4 model (RCP2.6, RCP4.5, and RCP8.5) were used. The study revealed that both temperature and precipitation would increase under all three RCPs (representative concentration pathways) in the future. The highest increase in precipitation by the end of 2100 was found in the arid region, compared with the humid and sub-humid regions. Similarly, the increase in mean monthly Tmin and Tmax was more pronounced at Jomsom station than at Baglung and Dumkauli stations. Overall, a decrease in summer temperature and an increase in winter temperature are expected in future periods across all regions. Further, spatial consistency was observed for Tmax and Tmin, whereas it was not found for precipitation.


2019 ◽  
Vol 58 (11) ◽  
pp. 2387-2403 ◽  
Author(s):  
Zhenyu Han ◽  
Ying Shi ◽  
Jia Wu ◽  
Ying Xu ◽  
Botao Zhou

Abstract High-resolution combined dynamical and statistical downscaling for multivariables (HDM) was performed for the Beijing–Tianjin–Hebei (BTH) region by using observations from the China Meteorological Administration Land Data Assimilation System (CLDAS), a regional climate model (RCM), and quantile mapping. This resulted in the production of a daily product with six variables (daily mean, maximum, and minimum temperature; precipitation; relative humidity; and wind speed), five ensemble members, a multidecadal time span (1980–2099), and a high resolution (6.25 km) for climate change projections under the RCP4.5 scenario. The evaluation showed that the HDM output could reproduce the mean states of all variables and most extreme indices well, except for the consecutive dry and wet days. The biases in the magnitude of interannual variability in the HDM were mostly inherited from the RCM. Using the HDM, future projections over BTH were conducted. The results indicated that the annual mean temperature and precipitation, as well as extreme heat and heavy precipitation events, will increase over most regions. The warming magnitudes over the mountainous and coastal areas in the northern BTH and the wetting magnitudes over the Daqinghe River basin (DRB) within BTH will be relatively stronger. The increases in extreme heat events will be much larger in the plain area. More than one-half of the regions with large increases in extreme precipitation will be located within the DRB. Both the number of models with the same sign of change and the ensemble standard deviation were used to estimate the projection uncertainty. The projected changes and uncertainties over the DRB, its subregions, and Xiong’an city within the basin are also discussed for each season.
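The statistical component named above, quantile mapping, corrects the simulated distribution toward the observed one quantile by quantile. Below is a minimal empirical sketch for daily precipitation with synthetic gamma-distributed data; the arrays and the simple empirical implementation are illustrative assumptions, not the HDM configuration.

```python
# Empirical quantile mapping of simulated daily precipitation onto an
# observed distribution (synthetic data).
import numpy as np

def quantile_map(sim_hist, obs_hist, sim_future, n_q=99):
    """Map simulated values onto the observed distribution via quantiles."""
    q = np.linspace(0.01, 0.99, n_q)
    sim_q = np.quantile(sim_hist, q)
    obs_q = np.quantile(obs_hist, q)
    # For each value to correct, find its position in the historical simulated
    # distribution and replace it with the observed value at the same quantile.
    return np.interp(sim_future, sim_q, obs_q)

rng = np.random.default_rng(5)
obs_hist = rng.gamma(0.8, 6.0, 10950)        # observed daily precip (mm), ~30 yr
sim_hist = rng.gamma(1.2, 3.0, 10950)        # RCM with too many light-rain days
sim_future = rng.gamma(1.2, 3.3, 10950)      # RCM future with a wetter scale
corrected = quantile_map(sim_hist, obs_hist, sim_future)
print(f"raw future mean {sim_future.mean():.2f} mm, corrected {corrected.mean():.2f} mm")
```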


2011 ◽  
Vol 24 (11) ◽  
pp. 2680-2692 ◽  
Author(s):  
David Masson ◽  
Reto Knutti

Abstract About 20 global climate models have been run for the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) to predict climate change due to anthropogenic activities. Evaluating these models is an important step to establish confidence in climate projections. Model evaluation, however, is often performed on a gridpoint basis despite the fact that models are known to often be unreliable at such small spatial scales. In this study, the annual mean values of surface air temperature and precipitation are analyzed. Using a spatial smoothing technique with a variable-scale parameter it is shown that the intermodel spread, as well as model errors from observations, is reduced as the characteristic smoothing scale increases. At the same time, the ability to reproduce small-scale features is reduced and the simulated patterns become fuzzy. Depending on the variable of interest, the location, and the way that data are aggregated, different optimal smoothing scales from the gridpoint size to about 2000 km are found to give good agreement with present-day observations yet retain most regional features of the climate signal. Higher model resolution surprisingly does not imply much better agreement with temperature observations, in particular with stronger smoothing, and resolving smaller scales therefore does not necessarily seem to improve the simulation of large-scale climate features. Similarities in mean temperature and precipitation fields for a pair of models in the ensemble persist locally for about a century into the future, providing some justification for subtracting control errors in the models. Large-scale to global errors, however, are not well preserved over time, consistent with a poor constraint of the present-day climate on the simulated global temperature and precipitation response.
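The scale-dependent evaluation described above can be sketched as follows: smooth both the model field and the observed field with increasing length scales and track how the error shrinks. The fields below are synthetic and the Gaussian filter is just one possible choice of spatial smoother; area weighting and real data handling are omitted.

```python
# Error between a model field and an observation field as a function of
# spatial smoothing scale (synthetic fields on a regular grid).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(6)
ny, nx = 90, 180                                    # roughly a 2-degree global grid
obs = gaussian_filter(rng.standard_normal((ny, nx)), 3.0)
model = obs + 0.5 * rng.standard_normal((ny, nx))   # model = obs + small-scale error

for sigma_cells in [0, 1, 2, 4, 8, 16]:             # smoothing scale in grid cells
    o = gaussian_filter(obs, sigma_cells) if sigma_cells else obs
    m = gaussian_filter(model, sigma_cells) if sigma_cells else model
    rmse = np.sqrt(np.mean((m - o) ** 2))
    print(f"smoothing scale {sigma_cells:2d} cells: RMSE = {rmse:.3f}")
```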


2017 ◽  
Vol 10 (3) ◽  
pp. 1383-1402 ◽  
Author(s):  
Paolo Davini ◽  
Jost von Hardenberg ◽  
Susanna Corti ◽  
Hannah M. Christensen ◽  
Stephan Juricke ◽  
...  

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 down to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours were used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data were produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored in the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including details of the forcings used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking with increasing resolution is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate, specifically the Madden–Julian Oscillation and tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).

