An update on global atmospheric ice estimates from satellite observations and reanalyses

2018 ◽  
Vol 18 (15) ◽  
pp. 11205-11219 ◽  
Author(s):  
David Ian Duncan ◽  
Patrick Eriksson

Abstract. This study assesses the global distribution of mean atmospheric ice mass from current state-of-the-art estimates and its variability on daily and seasonal timescales. Ice water path (IWP) retrievals from active and passive satellite platforms are analysed and compared with estimates from two reanalysis data sets, ERA5 (European Centre for Medium-Range Weather Forecasts (ECMWF) Reanalysis 5) and MERRA-2 (Modern-Era Retrospective Analysis for Research and Applications 2). Large discrepancies in IWP exist between the satellite data sets themselves, making validation of the model results problematic and indicating that progress towards a consensus on the distribution of atmospheric ice has been limited. Comparing the data sets, zonal means of IWP exhibit similar shapes but differing magnitudes, with large IWP values causing much of the difference in means. Diurnal analysis centred on A-Train overpasses shows similar structures in some regions, but the degree and sign of the variability vary widely; the reanalyses exhibit noisier and higher-amplitude diurnal variability than is borne out by the satellite estimates. Spatial structures governed by the atmospheric general circulation are fairly consistent across the data sets, as principal component analysis shows that the patterns of seasonal variability line up well between the data sets but disagree in severity. These results underscore the limitations of the current Earth observing system with respect to atmospheric ice, as the level of consensus between observations is mixed. The large-scale variability of IWP is relatively consistent, whereas disagreements on diurnal variability and global means point to varying microphysical assumptions in retrievals and models alike that seem to underlie the biggest differences.
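As an illustration of the kind of comparison described above, the following is a minimal sketch of computing zonal-mean IWP profiles and the leading pattern of seasonal variability from two gridded products. The array shapes, variable names, and synthetic inputs are illustrative assumptions, not the study's actual data or code.

```python
# Sketch: compare zonal-mean IWP between two gridded products and extract the
# leading mode of the climatological seasonal cycle with a simple PCA/EOF.
import numpy as np

def zonal_mean(iwp):
    """Time-mean, zonal-mean IWP profile from a (time, lat, lon) array."""
    return np.nanmean(iwp, axis=(0, 2))

def seasonal_pca(iwp_monthly, n_modes=1):
    """Leading EOF(s) of the climatological seasonal cycle of IWP."""
    ntime, nlat, nlon = iwp_monthly.shape
    # Climatological seasonal cycle: average each calendar month over all years
    clim = np.array([iwp_monthly[m::12].mean(axis=0) for m in range(12)])
    anom = (clim - clim.mean(axis=0)).reshape(12, nlat * nlon)
    # SVD yields seasonal principal components and spatial EOF patterns
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    return u[:, :n_modes] * s[:n_modes], vt[:n_modes].reshape(n_modes, nlat, nlon)

# Synthetic stand-ins for, e.g., a reanalysis and a satellite IWP product
iwp_a = np.random.rand(120, 90, 180)   # 10 years of monthly fields
iwp_b = np.random.rand(120, 90, 180)
print(np.corrcoef(zonal_mean(iwp_a), zonal_mean(iwp_b))[0, 1])
```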


2016 ◽  
Vol 46 (12) ◽  
pp. 3751-3775 ◽  
Author(s):  
Olivier Arzel ◽  
Alain Colin de Verdière

Abstract. The turbulent diapycnal mixing in the ocean is currently obtained from microstructure and finestructure measurements, dye experiments, and inverse models. This study presents a new method that infers the diapycnal mixing from low-resolution numerical calculations of the World Ocean whose temperatures and salinities are restored to the climatology. Unlike in robust general circulation ocean models, diapycnal diffusion is not prescribed but inferred. At steady state the buoyancy equation shows an equilibrium between the large-scale diapycnal advection and the restoring terms that take the place of the divergence of eddy buoyancy fluxes. The geography of the diapycnal flow reveals a strong regional variability of water mass transformations. Positive values of the diapycnal flow indicate the erosion of a deep-water mass and negative values indicate its creation. When the diapycnal flow is upward, a diffusion law can be fitted in the vertical and the diapycnal eddy diffusivity is obtained throughout the water column. The basin averages of diapycnal diffusivities are small in the first 1500 m [O(10−5) m2 s−1] and increase downward, with bottom values of about 2.5 × 10−4 m2 s−1 in all ocean basins, with the exception of the Southern Ocean (50°–30°S), where they reach 12 × 10−4 m2 s−1. This study confirms the small diffusivity in the thermocline and the robustness of the higher canonical Munk value in the abyssal ocean. It indicates that the upward dianeutral transport in the Atlantic mostly takes place in the abyss and the upper ocean, supporting the quasi-adiabatic character of the middepth overturning.
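For readers wanting a concrete picture of the vertical fit mentioned above, here is a minimal sketch under the classic assumption of a one-dimensional advection-diffusion balance with roughly constant upwelling and diffusivity over a layer; it is a simplification of the paper's diagnostic, and the profile and velocity are synthetic placeholders.

```python
# Sketch of the balance w * db/dz = d/dz (K * db/dz): with w and K roughly
# constant, buoyancy decays exponentially with e-folding scale K/w, so fitting
# that scale and multiplying by an assumed diapycnal velocity yields K.
import numpy as np

def fit_diffusivity(z, b, w_dia):
    """Fit K (m^2 s^-1) from a buoyancy profile b(z) and upwelling w_dia.

    z     : depths (m, negative downward), 1-D array
    b     : buoyancy profile (m s^-2), same shape as z
    w_dia : diapycnal (upward) velocity (m s^-1)
    """
    # Log-linear fit of (b - b_bottom) against z gives the inverse scale w/K
    db = b - b.min()
    mask = db > 0
    slope, _ = np.polyfit(z[mask], np.log(db[mask]), 1)
    scale = 1.0 / slope            # e-folding scale K / w  (m)
    return w_dia * scale

# Synthetic abyssal profile with a 1000 m scale and w = 1e-7 m/s
z = np.linspace(-5000.0, -1500.0, 50)
b = 1e-3 * np.exp(z / 1000.0)
print(fit_diffusivity(z, b, 1e-7))   # on the order of 1e-4 m^2 s^-1
```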


2007 ◽  
Vol 4 (5) ◽  
pp. 3413-3440 ◽  
Author(s):  
E. P. Maurer ◽  
H. G. Hidalgo

Abstract. Downscaling of climate model data is essential to most impact analyses. We compare two methods of statistical downscaling to produce continuous, gridded time series of precipitation and surface air temperature at a 1/8-degree (approximately 140 km² per grid cell) resolution over the western U.S. We use NCEP/NCAR Reanalysis data from 1950–1999 as a surrogate general circulation model (GCM). The two methods are constructed analogues (CA) and bias correction and spatial downscaling (BCSD), both of which have been shown to be skillful in different settings; BCSD has been used extensively in hydrologic impact analysis. Both methods use the coarse-scale Reanalysis fields of precipitation and temperature as predictors of the corresponding fine-scale fields. CA downscales daily large-scale data directly, while BCSD downscales monthly data and uses a random resampling technique to generate daily values. The methods show comparable skill in producing downscaled, gridded fields of precipitation and temperature at monthly and seasonal levels. For daily precipitation, both methods exhibit some skill in reproducing observed wet and dry extremes, and the difference between the methods is not significant, reflecting the generally low skill in daily precipitation variability in the reanalysis data. For low temperature extremes, the CA method produces greater downscaling skill than BCSD for the fall and winter seasons. For high temperature extremes, CA demonstrates higher skill than BCSD in summer. We find that the choice of the most appropriate downscaling technique depends on the variables, seasons, and regions of interest, on the availability of daily data, and on whether the day-to-day correspondence of weather from the GCM needs to be reproduced for some applications. The ability to produce skillful downscaled daily data depends primarily on the ability of the climate model to show daily skill.
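The bias-correction step at the heart of BCSD is empirical quantile mapping; the sketch below illustrates that step only (not the full spatial-disaggregation or daily-resampling machinery), using synthetic data and hypothetical names.

```python
# Sketch: empirical quantile mapping of coarse model values onto the observed
# distribution, as used in the bias-correction stage of BCSD-type downscaling.
import numpy as np

def quantile_map(model_hist, obs_hist, model_new):
    """Map model values to observed quantiles (empirical CDF matching)."""
    model_sorted = np.sort(model_hist)
    obs_sorted = np.sort(obs_hist)
    # Position of each new value within the model's historical CDF ...
    ranks = np.searchsorted(model_sorted, model_new) / len(model_sorted)
    ranks = np.clip(ranks, 0.0, 1.0)
    # ... mapped onto the observed distribution at the same quantile
    return np.quantile(obs_sorted, ranks)

# Usage with synthetic monthly precipitation (mm)
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 30.0, 600)            # "observed" fine-scale climatology
gcm = rng.gamma(2.0, 45.0, 600)            # biased coarse-scale model
print(quantile_map(gcm, obs, gcm[:5]))     # bias-corrected values
```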


2017 ◽  
Vol 17 (2) ◽  
pp. 855-866 ◽  
Author(s):  
Leon S. Friedrich ◽  
Adrian J. McDonald ◽  
Gregory E. Bodeker ◽  
Kathy E. Cooper ◽  
Jared Lewis ◽  
...  

Abstract. Location information from long-duration super-pressure balloons flying in the Southern Hemisphere lower stratosphere during 2014 as part of Project Loon is used to assess the quality of a number of different reanalyses, including the National Centers for Environmental Prediction Climate Forecast System version 2 (NCEP-CFSv2), the European Centre for Medium-Range Weather Forecasts interim reanalysis (ERA-Interim), the NASA Modern Era Retrospective-Analysis for Research and Applications (MERRA), and the recently released MERRA version 2. Balloon GPS location information is used to derive wind speeds, which are then compared with values from the reanalyses interpolated to the balloon times and locations. All reanalysis data sets describe the winds accurately, with biases in zonal winds of less than 0.37 m s−1 and meridional biases of less than 0.08 m s−1. The standard deviation of the differences between Loon and reanalysis zonal winds is latitude-dependent, ranging between 2.5 and 3.5 m s−1 and increasing equatorward. Comparisons between Loon trajectories and those calculated by applying a trajectory model to reanalysis wind fields show that MERRA-2 wind fields result in the most accurate simulated trajectories, with a mean 5-day balloon–reanalysis trajectory separation of 621 km and a median separation of 324 km, showing significant improvements over MERRA version 1 and slightly outperforming ERA-Interim. The latitudinal structure of the trajectory statistics for all reanalyses displays marginally lower mean separations between 15 and 35° S than between 35 and 55° S, despite standard deviations in the wind differences increasing toward the equator. This is shown to be related to the distance travelled by the balloons, which plays a role in the separation statistics.
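A minimal sketch of the basic step of deriving winds from successive GPS fixes and summarising balloon-minus-reanalysis differences; the finite-difference formulation and all names are illustrative assumptions rather than the authors' processing code.

```python
# Sketch: finite-difference winds from balloon GPS positions, plus bias and
# spread of balloon-minus-reanalysis wind differences.
import numpy as np

R_EARTH = 6.371e6  # mean Earth radius, m

def gps_winds(lat, lon, t):
    """Finite-difference winds (m s-1) from GPS fixes (deg, deg, seconds)."""
    lat_r, lon_r = np.radians(lat), np.radians(lon)
    dt = np.diff(t)
    lat_mid = 0.5 * (lat_r[:-1] + lat_r[1:])
    u = R_EARTH * np.cos(lat_mid) * np.diff(lon_r) / dt   # zonal component
    v = R_EARTH * np.diff(lat_r) / dt                      # meridional component
    return u, v

def bias_and_spread(wind_balloon, wind_reanalysis):
    """Mean and standard deviation of balloon-minus-reanalysis differences."""
    diff = wind_balloon - wind_reanalysis
    return diff.mean(), diff.std()

# Synthetic eastward drift at ~55 deg S: ~0.01 deg of longitude per minute
t = np.arange(0, 3600, 60.0)
lat = np.full_like(t, -55.0)
lon = 10.0 + 0.01 * (t / 60.0)
u, v = gps_winds(lat, lon, t)
print(u.mean(), v.mean())   # roughly 11 m/s zonal, 0 m/s meridional
```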


Geophysics ◽  
2002 ◽  
Vol 67 (1) ◽  
pp. 204-211 ◽  
Author(s):  
Pascal Audigane ◽  
Jean‐Jacques Royer ◽  
Hideshi Kaieda

Hydraulic fracturing is a common procedure to increase the permeability of a reservoir. It consists of injecting high-pressure fluid into pilot boreholes. These hydraulic tests locally induce seismic emission (microseismicity) from which large-scale permeability estimates can be derived, assuming a diffusion-like process of the pore pressure into the surrounding stimulated rocks. Such a procedure is applied to six data sets collected in the vicinity of two geothermal sites at Soultz (France) and Ogachi (Japan). The results show that the method is adequate to estimate large-scale permeability tensors at different depths in the reservoir. Such an approach provides pre-fracturing permeability estimates of the medium that are compatible with in situ measurements. Using a line-source formulation of the diffusion equation rather than a classical point-source approach, improvements are proposed to account for situations where the injection is performed over a well section. This technique, applied to successive fluid-injection tests, indicates an increase in permeability by an order of magnitude. The underestimates observed in some cases are attributed to the difference in scale at which the permeability is estimated (some 1 km³ corresponding to the seismically active volume of rock, compared to a few meters around the well for the pumping or pressure-oscillation tests). One advantage of the proposed method is that it provides permeability tensor estimates at the reservoir scale.
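For orientation, the following is a minimal sketch of the widely used point-source recipe that relates the microseismicity triggering front to hydraulic diffusivity and then to permeability; it is not the line-source, tensor formulation developed in the paper, and the fluid and rock properties are placeholder assumptions.

```python
# Sketch: scalar hydraulic diffusivity from the parabolic triggering front
# r(t) = sqrt(4*pi*D*t) of induced microseismicity, then a rough conversion
# to permeability. All property values are placeholders.
import numpy as np

def fit_diffusivity(r, t):
    """Least-squares fit of D (m^2 s-1) from event distances r (m), times t (s)."""
    # r^2 = 4*pi*D*t  ->  slope of r^2 vs t (through the origin) gives 4*pi*D
    slope = np.sum(r**2 * t) / np.sum(t**2)
    return slope / (4.0 * np.pi)

def permeability(D, mu=2e-4, phi=0.01, c_t=1e-9):
    """k = D * mu * phi * c_t, with viscosity mu (Pa s), porosity phi, and
    total compressibility c_t (1/Pa); placeholder values for illustration."""
    return D * mu * phi * c_t

# Synthetic event cloud consistent with D = 0.1 m^2/s
t = np.linspace(60.0, 3600.0, 200)
r = np.sqrt(4.0 * np.pi * 0.1 * t)
D = fit_diffusivity(r, t)
print(D, permeability(D))    # ~0.1 m^2/s and ~2e-17 m^2
```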


2015 ◽  
Vol 15 (7) ◽  
pp. 3873-3892 ◽  
Author(s):  
Z. D. Lawrence ◽  
G. L. Manney ◽  
K. Minschwaner ◽  
M. L. Santee ◽  
A. Lambert

Abstract. We present a comprehensive comparison of polar processing diagnostics derived from the National Aeronautics and Space Administration (NASA) Modern Era Retrospective-analysis for Research and Applications (MERRA) and the European Centre for Medium-Range Weather Forecasts (ECMWF) Interim Reanalysis (ERA-Interim). We use diagnostics that focus on meteorological conditions related to stratospheric chemical ozone loss based on temperatures, polar vortex dynamics, and air parcel trajectories to evaluate the effects these reanalyses might have on polar processing studies. Our results show that the agreement between MERRA and ERA-Interim changes significantly over the 34 years from 1979 to 2013 in both hemispheres and in many cases improves. By comparing our diagnostics during five time periods when an increasing number of higher-quality observations were brought into these reanalyses, we show how changes in the data assimilation systems (DAS) of MERRA and ERA-Interim affected their meteorological data. Many of our stratospheric temperature diagnostics show a convergence toward significantly better agreement, in both hemispheres, after 2001 when Aqua and GOES (Geostationary Operational Environmental Satellite) radiances were introduced into the DAS. Other diagnostics, such as the winter mean volume of air with temperatures below polar stratospheric cloud formation thresholds (VPSC) and some diagnostics of polar vortex size and strength, do not show improved agreement between the two reanalyses in recent years when data inputs into the DAS were more comprehensive. The polar processing diagnostics calculated from MERRA and ERA-Interim agree much better than those calculated from earlier reanalysis data sets. We still, however, see fairly large differences in many of the diagnostics in years prior to 2002, raising the possibility that the choice of one reanalysis over another could significantly influence the results of polar processing studies. After 2002, we see overall good agreement among the diagnostics, which demonstrates that the ERA-Interim and MERRA reanalyses are equally appropriate choices for polar processing studies of recent Arctic and Antarctic winters.
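One of the temperature-based diagnostics mentioned above, the volume of air below a polar stratospheric cloud formation threshold (VPSC), can be approximated on a regular grid as in the sketch below; the threshold, grid, and layer thicknesses are illustrative assumptions rather than the study's exact procedure.

```python
# Sketch: approximate V_PSC by summing grid-cell volumes where the temperature
# falls below a PSC (NAT) formation threshold.
import numpy as np

def vpsc(temp, lat, lon, dz, t_nat=195.0):
    """Approximate V_PSC (m^3) from a (lev, lat, lon) temperature field.

    temp  : temperatures (K)
    lat   : latitudes (deg), 1-D, regular spacing
    lon   : longitudes (deg), 1-D, regular spacing
    dz    : layer thicknesses (m), one per level
    t_nat : formation threshold (K); 195 K is a typical round value
    """
    R = 6.371e6
    dlat = np.radians(abs(lat[1] - lat[0]))
    dlon = np.radians(abs(lon[1] - lon[0]))
    # Grid-cell areas (lat, lon), varying with the cosine of latitude
    area = (R**2 * dlat * dlon * np.cos(np.radians(lat)))[:, None] * np.ones(len(lon))
    below = temp < t_nat                       # (lev, lat, lon) mask
    return np.sum(below * area[None, :, :] * np.asarray(dz)[:, None, None])

# Usage on a synthetic Southern Hemisphere polar-cap grid
T = 190.0 + 10.0 * np.random.rand(5, 20, 144)
print(vpsc(T, np.linspace(-90, -50, 20), np.arange(0, 360, 2.5), [1000.0] * 5))
```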


2014 ◽  
Vol 27 (1) ◽  
pp. 312-324 ◽  
Author(s):  
Jonathan M. Eden ◽  
Martin Widmann

Abstract Producing reliable estimates of changes in precipitation at local and regional scales remains an important challenge in climate science. Statistical downscaling methods are often utilized to bridge the gap between the coarse resolution of general circulation models (GCMs) and the higher resolutions at which information is required by end users. As the skill of GCM precipitation, particularly in simulating temporal variability, is not fully understood, statistical downscaling typically adopts a perfect prognosis (PP) approach in which high-resolution precipitation projections are based on real-world statistical relationships between large-scale atmospheric predictors and local-scale precipitation. Using a nudged simulation of the ECHAM5 GCM, in which the large-scale weather states are forced toward observations of large-scale circulation and temperature for the period 1958–2001, previous work has shown ECHAM5 skill in simulating temporal variability of precipitation to be high in many parts of the world. Here, the same nudged simulation is used in an alternative downscaling approach, based on model output statistics (MOS), in which statistical corrections are derived for simulated precipitation. Cross-validated MOS corrections based on maximum covariance analysis (MCA) and principal component regression (PCR), in addition to a simple local scaling, are shown to perform strongly throughout much of the extratropics. Correlation between downscaled and observed monthly-mean precipitation is as high as 0.8–0.9 in many parts of Europe, North America, and Australia. For these regions, MOS clearly outperforms PP methods that use temperature and circulation as predictors. The strong performance of MOS makes such an approach to downscaling attractive and potentially applicable to climate change simulations.
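As a concrete illustration of a principal-component-regression MOS, the sketch below compresses simulated precipitation with an SVD and regresses the leading components onto observed precipitation at a target location; the ensemble sizes, train/test split, and data are synthetic assumptions rather than the study's setup.

```python
# Sketch: MOS correction via principal component regression (PCR) of simulated
# precipitation fields against observed local precipitation.
import numpy as np

def pcr_fit(sim, obs, n_modes=5):
    """Return (mean, EOFs, regression coefficients) for a PCR-based MOS."""
    mean = sim.mean(axis=0)
    anomalies = sim - mean                          # (time, grid)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    pcs = anomalies @ vt[:n_modes].T                # (time, n_modes)
    A = np.column_stack([pcs, np.ones(len(pcs))])   # add intercept
    coeffs, *_ = np.linalg.lstsq(A, obs, rcond=None)
    return mean, vt[:n_modes], coeffs

def pcr_predict(sim_new, mean, eofs, coeffs):
    """Apply a fitted PCR correction to new simulated fields."""
    pcs = (sim_new - mean) @ eofs.T
    A = np.column_stack([pcs, np.ones(len(pcs))])
    return A @ coeffs

# Synthetic example: 40 years of monthly fields, one target location
rng = np.random.default_rng(1)
sim = rng.gamma(2.0, 20.0, (480, 100))
obs = 0.5 * sim[:, :10].mean(axis=1) + rng.normal(0, 5, 480)
params = pcr_fit(sim[:360], obs[:360])
print(np.corrcoef(pcr_predict(sim[360:], *params), obs[360:])[0, 1])
```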


2020 ◽  
Author(s):  
André Brosowski ◽  
Ralf Bill ◽  
Daniela Thrän

Abstract. Background: Half of the UN climate target for 2030 has been achieved, and further progress requires swiftly implementable solutions. In this context, the fermentation of cereal straw is a promising option. Returning the digestate to the farmland can close agricultural cycles while simultaneously producing biomethane for the transport sector. The world's first large-scale mono-digestion plant for straw has been operational since 2014. The temporal and spatial biomass availability is a key issue when replicating this concept. No detailed calculations on this subject are available, and the strategic relevance of biomethane from straw in the transport sector cannot be sufficiently evaluated.

Methods: To assess the balance of straw supply and use, a total of 30 data sets are combined, taking into account the cultivation of the five most important cereal types and the straw required for ten animal species, two special crops and twelve industrial uses. The data are managed at district level and presented for the years 2010 to 2018. In combination with high-resolution geodata, the results are linked to actual arable fields, and the availability of straw throughout the country is evaluated using a GIS. A minimal sketch of the underlying balance appears after this abstract.

Results: During the analysis period, the mobilisable potential for future biomethane production is between 13.9 and 21.5 Tg fm a−1; this is up to 62 % higher than the previously known level. The annual potential fluctuates considerably due to weather anomalies. The all-time maximum in 2014 and the minimum for the last 26 years in 2018 are separated by just four years and a difference of 7.6 Tg fm. However, large parts of the potential are concentrated in only a few regions. Liquefied biomethane could fully cover the fuel required for vessels and up to a quarter of that for heavy goods vehicles. Up to 11.3 Tg CO2-eq. could be saved, reducing the gap to the UN climate target by 3.7 %.

Conclusion: Despite the strong fluctuations, the potential is sufficient to supply numerous plants and to produce relevant quantities of liquefied biomethane even in weak years. To unlock the potential, the outcomes should be discussed further with stakeholders in the identified priority regions.
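To make the balance concrete, here is a minimal sketch of a per-district straw balance of the kind described in the Methods; the straw-to-grain ratio, humus retention share, and competing-use figures are placeholder assumptions, not values from the study.

```python
# Sketch: mobilisable straw for one district and year as harvested straw minus
# the share retained for humus reproduction and the competing uses.
def mobilisable_straw(grain_yield_t, straw_to_grain=0.8, humus_share=0.33,
                      animal_use_t=0.0, industrial_use_t=0.0):
    """Return mobilisable straw (t fresh matter); all factors are placeholders."""
    straw_total = grain_yield_t * straw_to_grain
    retained = straw_total * humus_share            # left on the field
    available = straw_total - retained - animal_use_t - industrial_use_t
    return max(available, 0.0)

# Example district: 500 kt grain harvest, 120 kt straw bound in animal husbandry
print(mobilisable_straw(500_000, animal_use_t=120_000))
```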


2015 ◽  
Vol 12 (12) ◽  
pp. 12649-12701 ◽  
Author(s):  
J.-P. Vidal ◽  
B. Hingray ◽  
C. Magand ◽  
E. Sauquet ◽  
A. Ducharne

Abstract. This paper proposes a methodology for estimating the transient probability distribution of yearly hydrological variables conditional on an ensemble of projections built from multiple general circulation models (GCMs), multiple statistical downscaling methods (SDMs) and multiple hydrological models (HMs). The methodology is based on the quasi-ergodic analysis of variance (QE-ANOVA) framework, which allows the contributions of the different sources of total uncertainty to be quantified, crucially taking into account large-scale internal variability stemming from the transient evolution of multiple GCM runs and small-scale internal variability derived from multiple realizations of stochastic SDMs. The QE-ANOVA framework was initially developed for long-term climate averages and is here extended jointly to (1) yearly anomalies and (2) low-flow variables. It is applied to better understand possible transient futures of both winter and summer low flows for two snow-influenced catchments in the southern French Alps. The analysis takes advantage of a very large data set of transient hydrological projections that combines, in a comprehensive way, 11 runs from 4 different GCMs, 3 SDMs with 10 stochastic realizations each, and 6 diverse HMs. The change signal is a decrease in yearly low flows of around 20 % in 2065, except for the most elevated catchment in winter, where low flows barely decrease. This signal is largely masked by both large- and small-scale internal variability, even in 2065. The time of emergence of the change signal on 30-year low-flow averages is, however, around 2035, i.e. for time slices starting in 2020. The most striking result is that a large part of the total uncertainty, and a larger part than that due to the GCMs, stems from the difference in HM responses. An analysis of the origin of this substantial divergence in HM responses for both catchments and in both seasons suggests that both the evapotranspiration and snowpack components of HMs should be carefully checked for their robustness in a changed climate in order to provide reliable outputs for informing water resource adaptation strategies.
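A minimal sketch of the variance-partitioning idea behind such an analysis, reduced to a plain ANOVA over a GCM × SDM × HM ensemble for a single lead time; the full QE-ANOVA treatment of large- and small-scale internal variability is omitted, and the ensemble is synthetic.

```python
# Sketch: fraction of ensemble spread attributable to each modelling chain
# component (GCM, SDM, HM) via main-effect variances on a balanced ensemble.
import numpy as np

def variance_fractions(y):
    """y : array (n_gcm, n_sdm, n_hm) of a projected change for one horizon."""
    grand = y.mean()
    main = {
        "GCM": ((y.mean(axis=(1, 2)) - grand) ** 2).mean(),
        "SDM": ((y.mean(axis=(0, 2)) - grand) ** 2).mean(),
        "HM":  ((y.mean(axis=(0, 1)) - grand) ** 2).mean(),
    }
    total = ((y - grand) ** 2).mean()
    fractions = {k: v / total for k, v in main.items()}
    fractions["residual/interactions"] = 1.0 - sum(fractions.values())
    return fractions

# Synthetic ensemble: 4 GCMs x 3 SDMs x 6 HMs, with the HM effect made largest
rng = np.random.default_rng(2)
y = (rng.normal(0, 2, (4, 1, 1)) + rng.normal(0, 1, (1, 3, 1))
     + rng.normal(0, 4, (1, 1, 6)) + rng.normal(0, 1, (4, 3, 6)))
print(variance_fractions(y))
```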

