Developing Gridded Climate Data Sets of Precipitation for Greece Based on Homogenized Time Series

Climate ◽  
2019 ◽  
Vol 7 (5) ◽  
pp. 68 ◽  
Author(s):  
Flora Gofa ◽  
Anna Mamara ◽  
Manolis Anadranistakis ◽  
Helena Flocas

The creation of realistic gridded precipitation fields improves our understanding of the observed climate and is necessary for validating climate model output for a wide range of applications. The challenge in trying to represent the highly variable nature of precipitation is to overcome the lack of density of observations in both time and space. Data sets of mean monthly and annual precipitation were developed for Greece in gridded format at a resolution of 30 arcsec (∼800 m), based on data from 1971 to 2000. One hundred and fifty-seven surface stations from two different observation networks were used to cover a satisfactory range of elevations. Station data were homogenized and subjected to quality control so that they represent changes in meteorological conditions rather than changes in the conditions under which the observations were made. The Meteorological Interpolation based on Surface Homogenized Data Basis (MISH) interpolation method was used to develop data sets that reproduce, as closely as possible, the spatial climate patterns over the region of interest. The main geophysical factors considered for the interpolation of mean monthly precipitation fields were elevation, latitude, incoming solar irradiance, Euclidean distance from the coastline, and land-to-sea percentage. Low interpolation uncertainties for precipitation, estimated with the cross-validation method, provided confidence in the interpolation approach. The resulting high-resolution maps give an overall realistic representation of precipitation, especially in fall and winter, with a clear longitudinal dependence: precipitation decreases from western to eastern continental Greece.
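The abstract does not reproduce the MISH algorithm itself; the sketch below only illustrates the leave-one-out cross-validation idea used to estimate interpolation uncertainty, with a simple inverse-distance interpolator standing in for MISH and entirely synthetic station coordinates and precipitation values.

```python
import numpy as np

def idw_interpolate(lon, lat, values, target_lon, target_lat, power=2.0):
    """Inverse-distance-weighted estimate at one target point.
    A stand-in for the MISH interpolator, which is not reproduced here."""
    d = np.hypot(lon - target_lon, lat - target_lat)
    d = np.maximum(d, 1e-9)              # avoid division by zero at the station itself
    w = d ** -power
    return np.sum(w * values) / np.sum(w)

def loo_cross_validation(lon, lat, values):
    """Leave-one-out cross-validation: withhold each station in turn,
    interpolate to its location from the others, and collect the errors."""
    errors = []
    for i in range(len(values)):
        mask = np.arange(len(values)) != i
        est = idw_interpolate(lon[mask], lat[mask], values[mask], lon[i], lat[i])
        errors.append(est - values[i])
    errors = np.asarray(errors)
    return np.sqrt(np.mean(errors ** 2)), np.mean(errors)   # RMSE, mean bias

# Hypothetical station coordinates (deg) and mean monthly precipitation (mm)
rng = np.random.default_rng(0)
lon = rng.uniform(20.0, 26.0, 157)
lat = rng.uniform(35.0, 41.5, 157)
precip = rng.gamma(4.0, 20.0, 157)

rmse, bias = loo_cross_validation(lon, lat, precip)
print(f"LOO cross-validation RMSE: {rmse:.1f} mm, bias: {bias:.1f} mm")
```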

2021 ◽  
Author(s):  
Hilde Nesse Tyssøy ◽  
Miriam Sinnhuber ◽  
Timo Asikainen ◽  
Stefan Bender ◽  
Mark A. Clilverd ◽  
...  

Precipitating auroral and radiation belt electrons are considered an important part of the natural forcing of the climate system. Recent studies suggest that this forcing is underestimated in current chemistry-climate models. The HEPPA III intercomparison experiment is a collective effort to address this point. Here, eight different estimates of medium-energy electron (MEE) (>30 keV) ionization rates are assessed during a geomagnetically active period in April 2010. The objective is to understand the potential uncertainty related to the MEE energy input. The ionization rates are all based on the Medium Energy Proton and Electron Detector (MEPED) on board the NOAA/POES and EUMETSAT/MetOp spacecraft series. However, different data handling, ionization rate calculations, and background atmospheres result in a wide range of mesospheric electron ionization rates. Although the eight data sets agree well in terms of temporal variability, they differ by about an order of magnitude in ionization rate strength during both geomagnetically quiet and disturbed periods. The largest spread is found in the aftermath of the geomagnetic activity. Furthermore, governed by different energy limits, the atmospheric penetration depth varies, and some differences related to latitudinal coverage are also evident. The mesospheric NO densities simulated with the Whole Atmosphere Community Climate Model driven by the highest and lowest ionization rates differ by more than a factor of eight. In a follow-up study, the atmospheric responses are simulated in four chemistry-climate models and compared to satellite observations, considering both the model structure and the ionization forcing.
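As a rough illustration of the kind of intercomparison described above, the following sketch quantifies temporal agreement and strength spread across several ionization-rate time series. The eight series below are synthetic stand-ins, not the MEPED-derived data sets themselves, and the units are arbitrary.

```python
import numpy as np

# Hypothetical stand-ins for eight MEE ionization-rate time series
# (e.g., daily rates at one mesospheric altitude); units are arbitrary.
rng = np.random.default_rng(1)
n_days = 30
common_signal = np.exp(rng.normal(0.0, 1.0, n_days).cumsum() * 0.2)
datasets = [common_signal * scale * rng.lognormal(0.0, 0.1, n_days)
            for scale in (0.3, 0.5, 0.8, 1.0, 1.5, 2.0, 2.5, 3.0)]
stack = np.vstack(datasets)

# Temporal agreement: pairwise correlation of the log time series
corr = np.corrcoef(np.log(stack))
print("minimum pairwise correlation:", corr.min().round(2))

# Spread in strength: ratio of the largest to the smallest estimate each day
spread = stack.max(axis=0) / stack.min(axis=0)
print("median max/min ratio across datasets:", np.median(spread).round(1))
```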


2019 ◽  
Author(s):  
Cristian Lussana ◽  
Ole Einar Tveito ◽  
Andreas Dobler ◽  
Ketil Tunheim

Abstract. seNorge_2018 is a collection of observational gridded datasets over Norway for daily total precipitation and daily mean, maximum and minimum temperatures. The time period covers 1957 to 2017, and the data are presented over a high-resolution terrain-following grid with 1 km spacing in both meridional and zonal directions. The seNorge family of observational gridded datasets developed at the Norwegian Meteorological Institute (MET Norway) has a twenty-year-long history and seNorge_2018 is its newest member, the first providing daily minimum and maximum temperatures. seNorge datasets are used for a wide range of applications in climatology, hydrology and meteorology. The observational dataset is based on MET Norway's climate data, which have been integrated by the European Climate Assessment and Dataset database. Two distinct statistical interpolation methods have been developed, one for temperature and the other for precipitation. They are both based on a spatial scale-separation approach where, at first, the analysis (i.e., predictions) at larger spatial scales is estimated. Subsequently, it is used to infer the small-scale details down to a spatial scale comparable to the local observation density. Mean, maximum and minimum temperatures are interpolated separately; then physical consistency among them is enforced. For precipitation, in addition to observational data, the spatial interpolation makes use of information provided by a climate model. The analysis evaluation is based on cross-validation statistics and comparison with a previous seNorge version. The analysis quality is presented as a function of the local station density. We show that the occurrence of large errors in the analyses decays at an exponential rate with the increase in the station density. Temperature analyses over most of the domain are generally not affected by significant biases. However, during wintertime in data-sparse regions the analyzed minimum temperatures do have a bias between 2 °C and 3 °C. Minimum temperatures are more challenging to represent and large errors are more frequent than for maximum and mean temperatures. The precipitation analysis quality depends crucially on station density: the frequency of occurrence of large errors for intense precipitation is less than 5 % in data-dense regions, while it is approximately 30 % in data-sparse regions. The open-access datasets are available for public download at: daily total precipitation (DOI: https://doi.org/10.5281/zenodo.2082320, Lussana, 2018b); daily mean (DOI: https://doi.org/10.5281/zenodo.2023997, Lussana, 2018c), maximum (DOI: https://doi.org/10.5281/zenodo.2559372, Lussana, 2018e) and minimum (DOI: https://doi.org/10.5281/zenodo.2559354, Lussana, 2018d) temperatures.
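The seNorge interpolation schemes are documented in the paper itself; the block below is only a toy one-dimensional sketch of the scale-separation idea described above (a broad-scale background estimated first, then short-scale detail from the observation residuals). The stations, values, and length scales are all hypothetical.

```python
import numpy as np

def gaussian_weights(dist, length_scale):
    """Distance-based weights with a given spatial length scale (km)."""
    return np.exp(-0.5 * (dist / length_scale) ** 2)

def two_scale_analysis(x_obs, y_obs, x_grid, coarse=200.0, fine=30.0):
    """Toy 1-D illustration of a scale-separation analysis: a broad-scale
    field is estimated first, then the observation residuals are
    interpolated with a much shorter length scale and added back."""
    d = np.abs(x_grid[:, None] - x_obs[None, :])          # grid-to-obs distances (km)

    w_coarse = gaussian_weights(d, coarse)
    background = w_coarse @ y_obs / w_coarse.sum(axis=1)

    # residuals at the observation locations relative to the broad-scale field
    d_obs = np.abs(x_obs[:, None] - x_obs[None, :])
    w_bg = gaussian_weights(d_obs, coarse)
    background_at_obs = w_bg @ y_obs / w_bg.sum(axis=1)
    residuals = y_obs - background_at_obs

    w_fine = gaussian_weights(d, fine)
    detail = w_fine @ residuals / np.maximum(w_fine.sum(axis=1), 1e-9)
    return background + detail

# Hypothetical stations along a 1000 km transect with temperature observations
rng = np.random.default_rng(2)
x_obs = np.sort(rng.uniform(0.0, 1000.0, 40))
y_obs = 10.0 - 0.01 * x_obs + rng.normal(0.0, 1.0, 40)

x_grid = np.linspace(0.0, 1000.0, 101)
analysis = two_scale_analysis(x_obs, y_obs, x_grid)
print("analysis range:", analysis.min().round(1), "to", analysis.max().round(1), "deg C")
```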


2015 ◽  
Vol 15 (7) ◽  
pp. 10085-10122 ◽  
Author(s):  
C. McLandress ◽  
T. G. Shepherd ◽  
A. I. Jonsson ◽  
T. von Clarmann ◽  
B. Funke

Abstract. A method is proposed for merging different nadir-sounding climate data records, using measurements from high-resolution limb sounders to provide a transfer function between the different nadir measurements. The nadir-sounding records need not overlap so long as the limb-sounding record bridges between them. The method is applied to global mean stratospheric temperatures from the NOAA Climate Data Records based on the Stratospheric Sounding Unit (SSU) and the Advanced Microwave Sounding Unit-A (AMSU), extending the SSU record forward in time to yield a continuous data set from 1979 to present. SSU and AMSU are bridged using temperature measurements from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS), which has high enough vertical resolution to accurately represent the weighting functions of both SSU and AMSU. For this application, a purely statistical approach is not viable since the different nadir channels are not sufficiently linearly independent, statistically speaking. The extended SSU global-mean data set is in good agreement with temperatures from the Microwave Limb Sounder (MLS) on the Aura satellite, with both exhibiting a cooling trend of ~0.6 ± 0.3 K decade⁻¹ in the upper stratosphere over 2004–2012. The extended SSU data set also compares well with chemistry-climate model simulations over its entire record, including the contrast between the weak cooling seen over 1995–2004 and the large cooling seen during the period of strong ozone depletion, 1986–1995.
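A minimal sketch of the bridging idea: limb-sounder temperature profiles are convolved with each nadir channel's weighting function to obtain synthetic channel temperatures, whose difference serves as a transfer function between the two records. The pressure grid, weighting-function shapes, and profiles below are hypothetical, and a simple mean offset stands in for whatever statistical fit the authors actually apply.

```python
import numpy as np

def synthetic_channel(temperature_profile, weighting_function):
    """Weighted vertical average of a limb-sounder temperature profile with a
    nadir channel's weighting function (both defined on the same pressure grid)."""
    w = weighting_function / weighting_function.sum()
    return np.sum(w * temperature_profile)

# Hypothetical pressure grid (hPa) and Gaussian-shaped weighting functions
# for one SSU channel and one AMSU channel; the real functions differ.
p = np.geomspace(100.0, 0.1, 60)
logp = np.log(p)
w_ssu  = np.exp(-0.5 * ((logp - np.log(5.0)) / 0.6) ** 2)
w_amsu = np.exp(-0.5 * ((logp - np.log(2.0)) / 0.5) ** 2)

# Hypothetical monthly-mean limb-sounder temperature profiles (K) for 24 months
rng = np.random.default_rng(3)
profiles = 230.0 + 15.0 * np.sin(logp)[None, :] + rng.normal(0.0, 0.5, (24, 60))

t_ssu  = np.array([synthetic_channel(tp, w_ssu)  for tp in profiles])
t_amsu = np.array([synthetic_channel(tp, w_amsu) for tp in profiles])

# The mean difference acts as a transfer function between the two channels,
# allowing the SSU-based record to be continued with AMSU measurements.
offset = np.mean(t_ssu - t_amsu)
print(f"SSU-equivalent = AMSU + {offset:.2f} K")
```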


Data ◽  
2019 ◽  
Vol 4 (2) ◽  
pp. 72 ◽  
Author(s):  
Abhishek Gaur ◽  
Michael Lacasse ◽  
Marianne Armstrong

Buildings and homes in Canada will be exposed to unprecedented climatic conditions in the future as a consequence of global climate change. To improve the climate resiliency of existing and new buildings, it is important to evaluate their performance over current and projected future climates. Hygrothermal and whole-building simulation models, which are important tools for assessing performance, require continuous climate records of a wide range of climate variables at high temporal frequency as inputs, including solar radiation, cloud cover, wind, humidity, rainfall, temperature, and snow cover. In this study, climate data that can be used to assess the performance of building envelopes under current and projected future climates, concurrent with 2 °C and 3.5 °C increases in global temperatures, are generated for 11 major Canadian cities. The datasets capture the internal variability of the climate, as they comprise 15 realizations of the future climate generated by dynamically downscaling future projections from the CanESM2 global climate model and thereafter bias-correcting them with reference to observations. An assessment of the bias-corrected projections suggests, as a consequence of global warming, future increases in temperature and precipitation and decreases in snow cover and wind speed for all cities.
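The abstract states that the downscaled projections were bias-corrected with reference to observations but does not name the method; the sketch below shows one common choice, empirical quantile mapping, applied to hypothetical daily temperature series for a single city.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future, n_quantiles=100):
    """Empirical quantile mapping: correct model output so that its historical
    distribution matches the observed one, then apply the same mapping to the
    future projection."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    model_q = np.quantile(model_hist, q)
    obs_q = np.quantile(obs_hist, q)
    # map each future value through the model CDF, then the observed quantiles
    ranks = np.interp(model_future, model_q, q)
    return np.interp(ranks, q, obs_q)

# Hypothetical daily temperatures (deg C): observed, modelled historical,
# and a warmer modelled future period for one city.
rng = np.random.default_rng(4)
obs_hist = rng.normal(5.0, 10.0, 3650)
model_hist = rng.normal(3.0, 12.0, 3650)        # cold and over-dispersive bias
model_future = rng.normal(6.5, 12.0, 3650)      # same biases plus warming

corrected = quantile_map(model_hist, obs_hist, model_future)
print("raw future mean:", model_future.mean().round(2),
      "corrected future mean:", corrected.mean().round(2))
```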


2011 ◽  
Vol 12 (1) ◽  
pp. 84-100 ◽  
Author(s):  
Csaba Torma ◽  
Erika Coppola ◽  
Filippo Giorgi ◽  
Judit Bartholy ◽  
Rita Pongrácz

Abstract This paper presents a validation study for a high-resolution version of the Regional Climate Model version 3 (RegCM3) over the Carpathian basin and its surroundings. The horizontal grid spacing of the model is 10 km, the highest reached by RegCM3. The ability of the model to capture temporal and spatial variability of temperature and precipitation over the region of interest is evaluated using metrics spanning a wide range of temporal (daily to climatology) and spatial (inner-domain average to local) scales against different observational datasets. The simulated period is 1961–90. RegCM3 shows small temperature biases but a general overestimation of precipitation, especially in winter, although this overestimate may be artificially enhanced by uncertainties in observations. The precipitation bias over the Hungarian territory, the authors' main area of interest, is mostly less than 20%. The model captures well the observed late twentieth-century decadal-to-interannual and interseasonal variability. On short time scales, simulated daily temperature and precipitation show a high correlation with observations, with a correlation coefficient of 0.9 for temperature and 0.6 for precipitation. Comparison with two Hungarian station time series shows that the model performance does not degrade when going to the 10-km gridpoint scale. Finally, the model reproduces the spatial distribution of dry and wet spells over the region. Overall, it is assessed that this high-resolution version of RegCM3 is of sufficiently good quality to perform climate change experiments over the Carpathian region, and in particular the Hungarian territory, for application to impact and adaptation studies.
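A brief sketch of the two headline validation scores quoted above, temporal correlation and relative bias, computed here for synthetic daily precipitation rather than the RegCM3 output or the observational datasets used in the paper.

```python
import numpy as np

def validation_metrics(simulated, observed):
    """Temporal correlation and relative bias (%), the kinds of scores
    quoted in the abstract."""
    corr = np.corrcoef(simulated, observed)[0, 1]
    rel_bias = 100.0 * (simulated.mean() - observed.mean()) / observed.mean()
    return corr, rel_bias

# Hypothetical daily precipitation (mm) over one season at one grid cell
rng = np.random.default_rng(5)
obs = rng.gamma(0.6, 5.0, 90)
sim = 1.15 * obs + rng.gamma(0.3, 3.0, 90)      # wet-biased but well-correlated model

corr, bias = validation_metrics(sim, obs)
print(f"correlation: {corr:.2f}, precipitation bias: {bias:+.0f}%")
```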


2005 ◽  
Vol 5 (6) ◽  
pp. 1557-1576 ◽  
Author(s):  
T. Egorova ◽  
E. Rozanov ◽  
V. Zubov ◽  
E. Manzini ◽  
W. Schmutz ◽  
...  

Abstract. In this paper we document "SOCOL", a new chemistry-climate model, which has been ported to run on regular PCs and shows good wall-clock performance. An extensive validation of the model results against present-day climate data obtained from observations and assimilation data sets shows that the model describes the climatological state of the atmosphere for the late 1990s with reasonable accuracy. The model has a significant temperature bias only in the upper stratosphere and near the tropopause at high latitudes. The latter is the result of the rather low vertical resolution of the model near the tropopause. The former can be attributed to a crude representation of radiative heating in the middle atmosphere. A comparison of the simulated and observed link between the tropical stratospheric structure and the strength of the polar vortex shows that, in general, both observations and simulations reveal a higher temperature and ozone mixing ratio in the lower tropical stratosphere for the case with a stronger polar night jet (PNJ) and slower Brewer-Dobson circulation, as predicted by theoretical studies.


2019 ◽  
Vol 11 (4) ◽  
pp. 1531-1551 ◽  
Author(s):  
Cristian Lussana ◽  
Ole Einar Tveito ◽  
Andreas Dobler ◽  
Ketil Tunheim

Abstract. seNorge_2018 is a collection of observational gridded datasets over Norway for daily total precipitation and daily mean, maximum, and minimum temperatures. The time period covers 1957 to 2017, and the data are presented over a high-resolution terrain-following grid with 1 km spacing in both meridional and zonal directions. The seNorge family of observational gridded datasets developed at the Norwegian Meteorological Institute (MET Norway) has a 20-year-long history and seNorge_2018 is its newest member, the first providing daily minimum and maximum temperatures. seNorge datasets are used for a wide range of applications in climatology, hydrology, and meteorology. The observational dataset is based on MET Norway's climate data, which have been integrated by the “European Climate Assessment and Dataset” database. Two distinct statistical interpolation methods have been developed, one for temperature and the other for precipitation. They are both based on a spatial scale-separation approach where, at first, the analysis (i.e., predictions) at larger spatial scales is estimated. Subsequently, it is used to infer the small-scale details down to a spatial scale comparable to the local observation density. Mean, maximum, and minimum temperatures are interpolated separately; then physical consistency among them is enforced. For precipitation, in addition to observational data, the spatial interpolation makes use of information provided by a climate model. The analysis evaluation is based on cross-validation statistics and comparison with a previous seNorge version. The analysis quality is presented as a function of the local station density. We show that the occurrence of large errors in the analyses decays at an exponential rate with the increase in the station density. Temperature analyses over most of the domain are generally not affected by significant biases. However, during wintertime in data-sparse regions the analyzed minimum temperatures do have a bias between 2 °C and 3 °C. Minimum temperatures are more challenging to represent and large errors are more frequent than for maximum and mean temperatures. The precipitation analysis quality depends crucially on station density: the frequency of occurrence of large errors for intense precipitation is less than 5 % in data-dense regions, while it is approximately 30 % in data-sparse regions. The open-access datasets are available for public download: daily total precipitation (https://doi.org/10.5281/zenodo.2082320, Lussana, 2018b); and daily mean (https://doi.org/10.5281/zenodo.2023997, Lussana, 2018c), maximum (https://doi.org/10.5281/zenodo.2559372, Lussana, 2018e), and minimum (https://doi.org/10.5281/zenodo.2559354, Lussana, 2018d) temperatures.
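The reported exponential decay of large-error frequency with station density can be summarized with a simple log-linear fit; the density and frequency values below are hypothetical placeholders, not the paper's cross-validation statistics.

```python
import numpy as np

# Hypothetical cross-validation summary: local station density (stations per
# 1000 km^2) versus the frequency (%) of large analysis errors.
density = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
large_error_freq = np.array([30.0, 22.0, 14.0, 8.0, 4.0, 1.5])

# Fit freq = a * exp(b * density) by linear regression in log space.
slope, intercept = np.polyfit(density, np.log(large_error_freq), 1)
a = np.exp(intercept)
print(f"fitted decay: freq ~ {a:.1f} * exp({slope:.2f} * density)")
```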


2018 ◽  
Author(s):  
Poerbandono ◽  
Philip J. Ward ◽  
Miga Magenika Julian

This paper discusses the application of global spatio-temporal climate data sets to a hydrological model operated in the Spatial Tools for River Basin Environmental Analysis and Management (STREAM) framework. The study investigates the reconstruction of monthly hydrographs at several selected points in the western part of Java, Indonesia, for the period 1983-2002. Prior to the reconstruction, model setup and calibration are carried out. The setup includes the preparation of monthly precipitation and temperature data sets and a digital elevation model of the study domain, and their compilation with a land cover map. Discharge observations from six stations, located mostly in the upper parts of major watersheds in the domain, are used to calibrate the model. It is found that the model produces results with acceptable agreement. Computed and observed monthly average discharges correlate quite well, with coefficients ranging from 0.72 to 0.93. The computed total annual average discharge at five out of six observation stations is accurate to within 7%. An optimum setting of the calibration parameters is identified. This study offers a scheme for reconstructing historical discharge from a paleo-climate perspective and for predicting the local effects of global climate change under future scenarios, given the predicted climate data sets and the geographic setting (i.e., topography and land cover).


2016 ◽  
Vol 55 (11) ◽  
pp. 2411-2430 ◽  
Author(s):  
Megan C. Kirchmeier-Young ◽  
David J. Lorenz ◽  
Daniel J. Vimont

Abstract Extreme events are important to many studying regional climate impacts but provide a challenge for many “deterministic” downscaling methodologies. The University of Wisconsin Probabilistic Downscaling (UWPD) dataset applies a “probabilistic” approach to downscaling that may be advantageous in a number of situations, including realistic representation of extreme events. The probabilistic approach to downscaling, however, presents some unique challenges for verification, especially when comparing a full probability density function with a single observed value for each day. Furthermore, because of the wide range of specific climatic information needed in climate impacts assessment, any single verification metric will be useful to only a limited set of practitioners. The intent of this study, then, is (i) to identify verification metrics appropriate for probabilistic downscaling of climate data; (ii) to apply, within the UWPD, those metrics to a suite of extreme event statistics that may be of use in climate impacts assessments; and (iii) in applying these metrics, to demonstrate the utility of a probabilistic approach to downscaling climate data, especially for representing extreme events.
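One metric commonly used to verify a full predictive distribution against a single observed value is the continuous ranked probability score (CRPS); the paper evaluates a suite of metrics, so the ensemble-based CRPS below is only an illustrative example with synthetic draws from a hypothetical downscaled distribution.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Continuous ranked probability score of a sampled predictive
    distribution against a single observed value; lower is better.
    CRPS = E|X - y| - 0.5 * E|X - X'| for ensemble members X and observation y."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

# Hypothetical downscaled daily-maximum-temperature PDF, sampled as 100 draws,
# verified against one station observation for that day.
rng = np.random.default_rng(6)
samples = rng.normal(31.0, 2.5, 100)
observed = 34.2

print(f"CRPS: {crps_ensemble(samples, observed):.2f} deg C")
```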


2014 ◽  
Vol 27 (7) ◽  
pp. 2667-2681 ◽  
Author(s):  
Andrew T. Wittenberg ◽  
Anthony Rosati ◽  
Thomas L. Delworth ◽  
Gabriel A. Vecchi ◽  
Fanrong Zeng

Abstract Observations and climate simulations exhibit epochs of extreme El Niño–Southern Oscillation (ENSO) behavior that can persist for decades. Previous studies have revealed a wide range of ENSO responses to forcings from greenhouse gases, aerosols, and orbital variations, but they have also shown that interdecadal modulation of ENSO can arise even without such forcings. The present study examines the predictability of this intrinsically generated component of ENSO modulation, using a 4000-yr unforced control run from a global coupled GCM [GFDL Climate Model, version 2.1 (CM2.1)] with a fairly realistic representation of ENSO. Extreme ENSO epochs from the unforced simulation are reforecast using the same (“perfect”) model but slightly perturbed initial conditions. These 40-member reforecast ensembles display potential predictability of the ENSO trajectory, extending up to several years ahead. However, no decadal-scale predictability of ENSO behavior is found. This indicates that multidecadal epochs of extreme ENSO behavior can arise not only intrinsically but also delicately and entirely at random. Previous work had shown that CM2.1 generates strong, reasonably realistic, decadally predictable high-latitude climate signals, as well as tropical and extratropical decadal signals that interact with ENSO. However, those slow variations appear not to lend significant decadal predictability to this model’s ENSO behavior, at least in the absence of external forcings. While the potential implications of these results are sobering for decadal predictability, they also offer an expedited approach to model evaluation and development, in which large ensembles of short runs are executed in parallel, to quickly and robustly evaluate simulations of ENSO. Further implications are discussed for decadal prediction, attribution of past and future ENSO variations, and societal vulnerability.
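A hedged sketch of one standard perfect-model diagnostic, prognostic potential predictability (one minus the ratio of ensemble variance to climatological variance); the 40-member ensemble below is synthetic and is not the CM2.1 reforecast data.

```python
import numpy as np

def potential_predictability(ensemble, clim_variance):
    """Prognostic potential predictability at each lead time:
    1 - (ensemble variance) / (climatological variance).
    Values near 1 mean the members stay close together (predictable);
    values near 0 mean the spread has saturated at climatology."""
    spread = ensemble.var(axis=0, ddof=1)          # variance across members
    return 1.0 - spread / clim_variance

# Hypothetical 40-member reforecast of a Nino-3.4-like index over 10 years,
# with spread growing toward a climatological variance of 1.0.
rng = np.random.default_rng(7)
lead_years = np.arange(1, 11)
growing_spread = 1.0 - np.exp(-lead_years / 2.5)   # fraction of saturation
ensemble = rng.normal(0.0, np.sqrt(growing_spread), (40, 10))

ppp = potential_predictability(ensemble, clim_variance=1.0)
print(np.round(ppp, 2))
```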

