Satellite-derived Indian methane emission sources with TROPOMI retrievals and a high-resolution modelling framework: Initial comparison with WRF-GHG model results

Author(s):  
Dhanyalekshmi Pillai ◽  
Monish Deshpande ◽  
Julia Marshall ◽  
Christoph Gerbig ◽  
Oliver Schneising ◽  
...  

<p>In accordance with the Global Stocktake under Article 14 of the Paris Agreement, each country estimates its own greenhouse gas (GHG) emissions based on standardised bottom-up methods. However, the accuracy of these methods, along with the standards applied, differs from country to country, resulting in large uncertainties that make it difficult to implement effective climate change mitigation strategies. India plays an important role in the global methane budget, necessitating accurate quantification of its sources at the regional and local levels. However, the country lacks sufficiently long-term, continuous and accurate observations of atmospheric methane, which are required to quantify its sources and to understand changes in the carbon cycle and the climate system. Recent technological advances in satellite remote sensing dedicated to greenhouse gases bring internationally uniform observation standards, enabling these high-resolution, high-density observations to be used for this quantification purpose. This study explores the use of such dedicated observations of the column-averaged dry-air mixing ratio of methane (XCH<sub>4</sub>), retrieved from TROPOMI on board the Sentinel-5 Precursor, to quantify the major anthropogenic and natural CH<sub>4</sub> emission fluxes over India.</p><p>Our mesoscale inverse modelling approach comprises a high-resolution atmospheric modelling framework consisting of the Weather Research and Forecasting model with its greenhouse gas module (WRF-GHG) and a set of prior emission inventories. We use TROPOMI retrievals derived with the Weighting Function Modified Differential Optical Absorption Spectroscopy (WFM-DOAS) retrieval algorithm. WRF-GHG simulations are performed at hourly intervals at a horizontal resolution of 10 km × 10 km for one month.
To compare our CH<sub>4</sub> simulations with the satellite column data, we account for the different vertical sensitivities of the instrument by applying the averaging kernel to the model simulations. To assess model performance, our simulations are also compared with the CAMS reanalysis product, which is based on ECMWF (European Centre for Medium-Range Weather Forecasts) numerical weather prediction reanalysis data available at a horizontal resolution of 0.25° × 0.25°. The comparison of these modelling results against this unique satellite dataset indicates the high potential of TROPOMI retrievals for distinguishing the major anthropogenic and natural CH<sub>4</sub> sources over India via inverse modelling. The results will help to objectively investigate claims of emission reductions and the efficiency of reduction countermeasures, as well as to support the establishment of standards and the advancement of technology. Details of our approach and preliminary results based on our analysis of the above satellite measurements and WRF-GHG simulations over India will be presented.</p>
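The averaging-kernel step described above, mapping a model CH4 profile into the column space actually seen by the satellite, can be sketched as follows. This is a minimal illustration: the function and variable names are hypothetical, and the layer weighting is a simplification of what the WFM-DOAS product files actually provide.

```python
import numpy as np

def apply_averaging_kernel(x_model, x_prior, avg_kernel, p_weights):
    """Combine a model profile with the retrieval's prior and averaging kernel.

    x_model, x_prior : per-layer CH4 mixing ratios (ppb), model and retrieval prior
    avg_kernel       : per-layer column averaging kernel (dimensionless)
    p_weights        : pressure-thickness weights for each layer, summing to 1
    """
    # Column average of the prior profile
    xch4_prior = np.sum(p_weights * x_prior)
    # Add the kernel-weighted departure of the model from the prior,
    # so layers where the instrument is insensitive contribute less
    return xch4_prior + np.sum(p_weights * avg_kernel * (x_model - x_prior))
```

With a kernel of one everywhere the result reduces to the plain pressure-weighted column average of the model profile, which is a convenient sanity check.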

2014 ◽  
Vol 7 (6) ◽  
pp. 1723-1744 ◽  
Author(s):  
B. Dils ◽  
M. Buchwitz ◽  
M. Reuter ◽  
O. Schneising ◽  
H. Boesch ◽  
...  

Abstract. Column-averaged dry-air mole fractions of carbon dioxide and methane have been retrieved from spectra acquired by the TANSO-FTS (Thermal And Near-infrared Sensor for carbon Observations-Fourier Transform Spectrometer) and SCIAMACHY (Scanning Imaging Absorption Spectrometer for Atmospheric Cartography) instruments on board GOSAT (Greenhouse gases Observing SATellite) and ENVISAT (ENVIronmental SATellite), respectively, using a range of European retrieval algorithms. These retrievals have been compared with data from ground-based high-resolution Fourier transform spectrometers (FTSs) from the Total Carbon Column Observing Network (TCCON). The participating algorithms are the weighting function modified differential optical absorption spectroscopy (DOAS) algorithm (WFMD, University of Bremen), the Bremen optimal estimation DOAS algorithm (BESD, University of Bremen), the iterative maximum a posteriori DOAS algorithm (IMAP, Jet Propulsion Laboratory (JPL) and Netherlands Institute for Space Research (SRON)), the proxy and full-physics versions of SRON's RemoTeC algorithm (SRPR and SRFP, respectively) and the proxy and full-physics versions of the University of Leicester's adaptation of the OCO (Orbiting Carbon Observatory) algorithm (OCPR and OCFP, respectively). The goal of this algorithm inter-comparison was to identify strengths and weaknesses of the so-called round-robin data sets generated with the various algorithms, so as to determine which of the competing algorithms would proceed to the next round of the European Space Agency's (ESA) Greenhouse Gas Climate Change Initiative (GHG-CCI) project: the generation of the so-called Climate Research Data Package (CRDP), the first version of the Essential Climate Variable (ECV) "greenhouse gases" (GHGs).
For XCO2, all algorithms reach the precision requirements for inverse modelling (< 8 ppm), with only WFMD having a lower precision (4.7 ppm) than the other algorithm products (2.4–2.5 ppm). When looking at the seasonal relative accuracy (SRA, variability of the bias in space and time), none of the algorithms has reached the demanding < 0.5 ppm threshold. For XCH4, the precision for both SCIAMACHY products (50.2 ppb for IMAP and 76.4 ppb for WFMD) fails to meet the < 34 ppb threshold for inverse modelling, but note that this work focusses on the period after the 2005 SCIAMACHY detector degradation. The GOSAT XCH4 precision ranges between 14.0 and 18.1 ppb. Looking at the SRA, all GOSAT algorithm products reach the < 10 ppb threshold (values ranging between 5.4 and 6.2 ppb). For SCIAMACHY, IMAP and WFMD have SRAs of 17.2 and 10.5 ppb, respectively.
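The seasonal relative accuracy quoted above, the variability of the bias in space and time, can be illustrated with a short sketch. This is a simplified reading of the metric (standard deviation of satellite-minus-TCCON biases over station and season bins), not the GHG-CCI project's exact definition:

```python
import numpy as np

def seasonal_relative_accuracy(bias_by_station_season):
    """Variability of the bias across station x season bins.

    bias_by_station_season: flat list/array of mean (satellite - TCCON)
    biases, one value per station-season bin; NaN marks empty bins.
    """
    b = np.asarray(bias_by_station_season, dtype=float)
    b = b[~np.isnan(b)]            # drop bins with no coincident data
    return np.std(b, ddof=1)       # sample standard deviation of the biases
```

A data set whose bias is identical everywhere would thus have an SRA of zero even if that bias is large, which is why precision, bias and SRA are reported separately above.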


2015 ◽  
Vol 16 (4) ◽  
pp. 1843-1856 ◽  
Author(s):  
Silvio Davolio ◽  
Francesco Silvestro ◽  
Piero Malguzzi

Abstract Coupling meteorological and hydrological models is a common and standard practice in the field of flood forecasting. In this study, a numerical weather prediction (NWP) chain based on the BOLogna Limited Area Model (BOLAM) and the MOdello LOCale in Hybrid coordinates (MOLOCH) was coupled with the operational hydrological forecasting chain of the Ligurian Hydro-Meteorological Functional Centre to simulate two major floods that occurred during autumn 2011 in northern Italy. Different atmospheric simulations were performed by varying the grid spacing (between 1.0 and 3.0 km) of the high-resolution meteorological model and the set of initial/boundary conditions driving the NWP chain. The aim was to investigate the impact of these parameters not only from a meteorological perspective, but also in terms of discharge predictions for the two flood events. The operational flood forecasting system was thus used as a tool to validate in a more pragmatic sense the quantitative precipitation forecast obtained from different configurations of the NWP system. The results showed an improvement in flood prediction when a high-resolution grid was employed for atmospheric simulations. In turn, a better description of the evolution of the precipitating convective systems was beneficial for the hydrological prediction. Although the simulations underestimated the severity of both floods, the higher-resolution model chain would have provided useful information to the decision-makers in charge of protecting citizens.


2020 ◽  
Author(s):  
Xavier Lapillonne ◽  
William Sawyer ◽  
Philippe Marti ◽  
Valentin Clement ◽  
Remo Dietlicher ◽  
...  

<p>The ICON modelling framework is a unified numerical weather and climate model used for applications ranging from operational numerical weather prediction to low- and high-resolution climate projections. To further push the frontier of possible applications and make use of the latest evolution in hardware technologies, parts of the model were recently adapted to run on heterogeneous GPU systems. This initial GPU port focuses on components required for high-resolution climate applications and allows multi-year simulations at 2.8 km on the Piz Daint heterogeneous supercomputer. These simulations are planned as part of the QUIBICC project "The Quasi-Biennial Oscillation (QBO) in a changing climate", which proposes to investigate the effects of climate change on the dynamics of the QBO.</p><p>Because of the low compute intensity of atmospheric models, the cost of data transfer between CPU and GPU at every step of the time integration would be prohibitive if only some components were ported to the accelerator. We therefore present a full port strategy in which all components required for the simulations run on the GPU. For the dynamics, most of the physical parameterizations and the infrastructure code, OpenACC compiler directives are used. For the soil parameterization, a Fortran-based domain-specific language (DSL), the CLAW-DSL, has been used. We discuss the challenges associated with porting a large community code of about one million lines, as well as with running simulations on a large-scale system at 2.8 km horizontal resolution in terms of run time and I/O constraints. We show performance comparisons of the full model on CPU and GPU, achieving a speed-up factor of approximately 5x, as well as scaling results on up to 2000 GPU nodes. Finally, we discuss challenges and planned developments regarding performance portability and the high-level DSL that will be used with the ICON model in the near future.</p>


2020 ◽  
Author(s):  
Ioannis Katharopoulos ◽  
Dominique Rust ◽  
Martin Vollmer ◽  
Dominik Brunner ◽  
Stefan Reimann ◽  
...  

<p>Climate change is one of the biggest challenges of the modern era. Halocarbons already contribute about 14% of current anthropogenic radiative forcing, and their future impact may become significantly larger due to their long atmospheric lifetimes and continued and increasing usage. In addition to their influence on climate change, chlorine- and bromine-containing halocarbons are the main drivers of the destruction of the stratospheric ozone layer. Therefore, observing their atmospheric abundance and quantifying their sources is critical for predicting their future impact on climate change and on the recovery of the stratospheric ozone layer.</p><p>Regional-scale atmospheric inverse modelling can provide observation-based estimates of greenhouse gas emissions at a country scale and hence makes valuable information available to policy makers when reviewing emission mitigation strategies and confirming countries' pledges for emission reduction. Since inverse modelling relies on accurate atmospheric transport modelling, any advances in the latter are of key importance. The main objective of this work is to characterize and improve the Lagrangian particle dispersion model (LPDM) FLEXPART-COSMO at kilometre-scale resolution and to provide estimates of Swiss halocarbon emissions by integrating newly available halocarbon observations from the Swiss Plateau at the Beromünster tall tower. The transport model is offline-coupled with the regional numerical weather prediction (NWP) model COSMO. Previous inverse modelling results for Swiss greenhouse gases are based on a model resolution of 7 km x 7 km.
Here, we utilize higher-resolution (1 km x 1 km) operational COSMO analysis fields to drive FLEXPART and compare these to the previous results.</p><p>The higher-resolution simulations exhibit increased three-dimensional dispersion, leading to a general underestimation of the observed tracer concentrations at the receptor location compared with the coarse model results. The concentration discrepancies due to dispersion between the two model versions cannot be explained by the parameters used in FLEXPART's turbulence parameterization (Obukhov length, surface momentum and heat fluxes, atmospheric boundary layer heights, and horizontal and vertical wind speeds), since a direct comparison of these parameters between the model versions showed no significant differences. This suggests that the dispersion differences may originate from a duplication of turbulent transport, on the one hand resolved by the high-resolution grid of the Eulerian model and, on the other hand, diagnosed by FLEXPART's turbulence scheme. In an attempt to reconcile FLEXPART-COSMO's turbulence scheme at high resolution, we introduced additional scaling parameters based on an analysis of simulated mole fraction deviations as a function of stability regime. In addition, we used FLEXPART-COSMO source sensitivities in a Bayesian inversion to obtain optimized emission estimates. Inversions for both the high- and low-resolution models were carried out to quantify the impact of model resolution on posterior emissions and to estimate the uncertainties of these emissions.</p>
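The Bayesian inversion step mentioned above has, in the standard linear-Gaussian textbook form, a closed-form posterior. The sketch below assumes that generic form (observations y, source sensitivities H, prior and observation error covariances B and R) and is not the study's actual configuration:

```python
import numpy as np

def bayesian_inversion(H, y, x_prior, B, R):
    """Analytical posterior of a linear Gaussian inversion.

    H : (n_obs, n_emis) source-sensitivity (Jacobian) matrix
    y : (n_obs,) observed mole-fraction enhancements
    x_prior : (n_emis,) prior emission estimate
    B, R : prior and observation error covariance matrices
    """
    # Kalman-gain form of the posterior update
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    x_post = x_prior + K @ (y - H @ x_prior)        # optimized emissions
    B_post = (np.eye(len(x_prior)) - K @ H) @ B     # posterior covariance
    return x_post, B_post
```

The diagonal of `B_post` gives the posterior emission uncertainties whose resolution dependence the abstract sets out to quantify; with very precise observations the posterior collapses toward the observation-implied emissions, with very uncertain ones it stays at the prior.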


2015 ◽  
Vol 143 (10) ◽  
pp. 4012-4037 ◽  
Author(s):  
Colin M. Zarzycki ◽  
Christiane Jablonowski

Abstract Tropical cyclone (TC) forecasts at 14-km horizontal resolution (0.125°) are completed using variable-resolution (V-R) grids within the Community Atmosphere Model (CAM). Forecasts are integrated twice daily from 1 August to 31 October for both 2012 and 2013, with a high-resolution nest centered over the North Atlantic and eastern Pacific Ocean basins. Using the CAM version 5 (CAM5) physical parameterization package, regional refinement is shown to significantly increase TC track forecast skill relative to unrefined grids (55 km, 0.5°). For typical TC forecast integration periods (approximately 1 week), V-R forecasts are able to nearly identically reproduce the flow field of a globally uniform high-resolution forecast. Simulated intensity is generally too strong for forecasts beyond 72 h. This intensity bias is robust regardless of whether the forecast is forced with observed or climatological sea surface temperatures and is not significantly mitigated in a suite of sensitivity simulations aimed at investigating the impact of model time step and CAM’s deep convection parameterization. Replacing components of the default physics with Cloud Layers Unified by Binormals (CLUBB) produces a statistically significant improvement in forecast intensity at longer lead times, although significant structural differences in forecasted TCs exist. CAM forecasts the recurvature of Hurricane Sandy into the northeastern United States 60 h earlier than the Global Forecast System (GFS) model using identical initial conditions, demonstrating the sensitivity of TC forecasts to model configuration. Computational costs associated with V-R simulations are dramatically decreased relative to globally uniform high-resolution simulations, demonstrating that variable-resolution techniques are a promising tool for future numerical weather prediction applications.


2014 ◽  
Vol 142 (5) ◽  
pp. 1962-1981 ◽  
Author(s):  
Linus Magnusson ◽  
Jean-Raymond Bidlot ◽  
Simon T. K. Lang ◽  
Alan Thorpe ◽  
Nils Wedi ◽  
...  

Abstract On 30 October 2012 Hurricane Sandy made landfall on the U.S. East Coast with a devastating impact. Here the performance of the ECMWF forecasts (both high resolution and ensemble) is evaluated together with ensemble forecasts from other numerical weather prediction centers, available from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) archive. The sensitivity of the ECMWF forecasts to sea surface temperature (SST) and model resolution is explored. The results show that the ECMWF forecasts provided a clear indication of the landfall from 7 days in advance. Comparing ensemble forecasts from different centers, the authors find the ensemble forecasts from ECMWF to be the most consistent in forecasting the landfall of Sandy on the New Jersey coastline. The impact of the warm SST anomaly off the U.S. East Coast is investigated by running sensitivity experiments with climatological SST instead of persisting the SST anomaly from the analysis. The results show that the SST anomaly had a small effect on Sandy's track in the forecast, but the forecasts initialized with the warm SST anomaly feature a more intense system in terms of the depth of the cyclone, wind speeds, and precipitation. Furthermore, the role of spatial resolution is investigated by comparing four global simulations, spanning from TL159 (150 km) to TL3999 (5 km) horizontal resolution. Forecasts from 3 and 5 days before the landfall are evaluated. While all resolutions predict Sandy's landfall, at very high resolution the tropical cyclone intensity and the oceanic wave forecasts are greatly improved.


Author(s):  
Nils P. Wedi

The steady path of doubling the global horizontal resolution approximately every 8 years in numerical weather prediction (NWP) at the European Centre for Medium-Range Weather Forecasts may be substantially altered with emerging novel computing architectures. It coincides with the need to appropriately address and determine forecast uncertainty with increasing resolution, in particular when convective-scale motions start to be resolved. Blunt increases in model resolution will quickly become unaffordable and may not lead to improved NWP forecasts. Consequently, there is a need to adjust proven numerical techniques accordingly. An informed decision on the modelling strategy for harnessing exascale, massively parallel computing power thus also requires a deeper understanding of the sensitivity to uncertainty, for each part of the model, and ultimately a deeper understanding of multi-scale interactions in the atmosphere and their numerical realization in ultra-high-resolution NWP and climate simulations. This paper explores opportunities for substantial increases in forecast efficiency by judicious adjustment of the formal accuracy or relative resolution in spectral and physical space. One path is to reduce the formal accuracy with which the spectral transforms are computed. The other explores the importance of the ratio of the horizontal resolution in gridpoint space to the number of wavenumbers in spectral space. This is relevant both for high-resolution simulations and for ensemble-based uncertainty estimation.


2007 ◽  
Vol 10 ◽  
pp. 125-131 ◽  
Author(s):  
M. Steinheimer ◽  
T. Haiden

Abstract. The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.


2020 ◽  
Vol 20 (3) ◽  
pp. 1795-1816 ◽  
Author(s):  
Ingrid Super ◽  
Stijn N. C. Dellaert ◽  
Antoon J. H. Visschedijk ◽  
Hugo A. C. Denier van der Gon

Abstract. Quantification of greenhouse gas emissions is receiving a lot of attention because of its relevance for climate mitigation. Complementary to official reported bottom-up emission inventories, quantification can be done with an inverse modelling framework, combining atmospheric transport models, prior gridded emission inventories and a network of atmospheric observations to optimize the emission inventories. An important aspect of such a method is a correct quantification of the uncertainties in all aspects of the modelling framework. The uncertainties in gridded emission inventories are, however, not systematically analysed. In this work, a statistically coherent method is used to quantify the uncertainties in a high-resolution gridded emission inventory of CO2 and CO for Europe. We perform a range of Monte Carlo simulations to determine the effect of uncertainties in different inventory components, including the spatial and temporal distribution, on the uncertainty in total emissions and the resulting atmospheric mixing ratios. We find that the uncertainties in the total emissions for the selected domain are 1 % for CO2 and 6 % for CO. Introducing spatial disaggregation causes a significant increase in the uncertainty of up to 40 % for CO2 and 70 % for CO for specific grid cells. Using gridded uncertainties, specific regions can be defined that have the largest uncertainty in emissions and are thus an interesting target for inverse modellers. However, the largest sectors are usually the best-constrained ones (low relative uncertainty), so the absolute uncertainty is the best indicator for this. With this knowledge, areas can be identified that are most sensitive to the largest emission uncertainties, which supports network design.
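The Monte Carlo propagation of per-component uncertainties to total emissions described above can be illustrated with a minimal sketch. The sector values and the Gaussian error model here are illustrative assumptions, not the paper's actual inventory or error structure:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for a reproducible sketch

def total_emission_uncertainty(sector_means, sector_rel_sd, n=100_000):
    """Propagate independent per-sector uncertainties to the total emission.

    sector_means  : best-estimate emission per sector (e.g. Tg yr-1)
    sector_rel_sd : relative 1-sigma uncertainty per sector (e.g. 0.10 = 10%)
    Returns the ensemble mean total and its relative standard deviation.
    """
    mu = np.asarray(sector_means, dtype=float)
    sd = np.asarray(sector_rel_sd, dtype=float) * mu
    samples = rng.normal(mu, sd, size=(n, len(mu)))  # one draw per sector
    totals = samples.sum(axis=1)
    return totals.mean(), totals.std() / totals.mean()
```

Because independent sector errors partially cancel in the sum, the relative uncertainty of the total is smaller than that of the individual sectors, which is consistent with the small national totals (1 % for CO2, 6 % for CO) alongside the much larger per-grid-cell uncertainties reported above.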


2019 ◽  
Vol 19 (11) ◽  
pp. 7347-7376 ◽  
Author(s):  
Anna Agustí-Panareda ◽  
Michail Diamantakis ◽  
Sébastien Massart ◽  
Frédéric Chevallier ◽  
Joaquín Muñoz-Sabater ◽  
...  

Abstract. Climate change mitigation efforts require information on the current greenhouse gas atmospheric concentrations and their sources and sinks. Carbon dioxide (CO2) is the most abundant anthropogenic greenhouse gas. Its variability in the atmosphere is modulated by the synergy between weather and CO2 surface fluxes, often referred to as CO2 weather. It is interpreted with the help of global or regional numerical transport models, with horizontal resolutions ranging from a few hundred kilometres to a few kilometres. Changes in the model horizontal resolution affect not only atmospheric transport but also the representation of topography and surface CO2 fluxes. This paper assesses the impact of horizontal resolution on the simulated atmospheric CO2 variability with a numerical weather prediction model. The simulations are performed using the Copernicus Atmosphere Monitoring Service (CAMS) CO2 forecasting system at different resolutions from 9 to 80 km and are evaluated using in situ atmospheric surface measurements and atmospheric column-mean observations of CO2, as well as radiosonde and SYNOP observations of the winds. The results indicate that both diurnal and day-to-day variability of atmospheric CO2 are generally better represented at high resolution, as shown by a reduction in the errors in simulated wind and CO2. Mountain stations display the largest improvements at high resolution as they directly benefit from the more realistic orography. In addition, the CO2 spatial gradients are generally improved with increasing resolution for both stations near the surface and those observing the total column, as the overall inter-station error is also reduced in magnitude. However, close to emission hotspots, the high resolution can also lead to a deterioration of the simulation skill, highlighting uncertainties in the high-resolution fluxes that are more diffuse at lower resolutions.
We conclude that increasing horizontal resolution matters for modelling CO2 weather because it has the potential to bring together improvements in the surface representation of both winds and CO2 fluxes, as well as an expected reduction in numerical errors of transport. Modelling applications like atmospheric inversion systems to estimate surface fluxes will only be able to benefit fully from upgrades in horizontal resolution if the topography, winds and prior flux distribution are also upgraded accordingly. It is clear from the results that an additional increase in resolution might reduce errors even further. However, the horizontal resolution sensitivity tests indicate that the change in the CO2 and wind modelling error with resolution is not linear, making it difficult to quantify the improvement beyond the tested resolutions. Finally, we show that the high-resolution simulations are useful for the assessment of the small-scale variability of CO2 which cannot be represented in coarser-resolution models. These representativeness errors need to be considered when assimilating in situ data and high-resolution satellite data such as Greenhouse gases Observing Satellite (GOSAT), Orbiting Carbon Observatory-2 (OCO-2), the Chinese Carbon Dioxide Observation Satellite Mission (TanSat) and future missions such as the Geostationary Carbon Observatory (GeoCarb) and the Sentinel satellite constellation for CO2. For these reasons, the high-resolution CO2 simulations provided by the CAMS in real time can be useful to estimate such small-scale variability in real time, as well as providing boundary conditions for regional modelling studies and supporting field experiments.

