Radiative properties of mid-latitude cirrus clouds derived by automatic evaluation of lidar measurements

Author(s):  
Erika Kienast-Sjögren ◽  
Christian Rolf ◽  
Patric Seifert ◽  
Ulrich K. Krieger ◽  
Bei P. Luo ◽  
...  

Abstract. Cirrus, i.e., high, thin clouds that are fully glaciated, play an important role in the Earth's radiation budget as they interact with both long- and shortwave radiation and determine the water vapor budget of the upper troposphere and stratosphere. Here, we present a climatology of mid-latitude cirrus clouds measured with the same type of ground-based lidar at three mid-latitude research stations: at the Swiss high alpine Jungfraujoch station (3580 m a.s.l.), in Zürich (Switzerland, 510 m a.s.l.), and in Jülich (Germany, 100 m a.s.l.). The analysis is based on 13 000 h of measurements from 2010 to 2014. To automatically evaluate this extensive data set, we have developed the "Fast LIdar Cirrus Algorithm" (FLICA), which combines a pixel-based cloud-detection scheme with classic lidar evaluation techniques. We find mean cirrus optical depths of 0.12 on Jungfraujoch and of 0.14 and 0.17 in Zürich and Jülich, respectively. Above Jungfraujoch, subvisible cirrus clouds (τ < 0.03) have been observed during 7 % of the observation time, whereas significantly less often above Zürich and Jülich. From Jungfraujoch, clouds with τ < 10−3 can be observed three times more often than over Zürich and Jülich, and clouds with τ < 2 × 10−4 even ten times more often. Above Jungfraujoch, cirrus have been observed up to altitudes of 14.4 km a.s.l., whereas only to about 1 km lower at the other stations. These features highlight the advantage of the high-altitude station Jungfraujoch, which is often in the free troposphere above the polluted boundary layer, thus allowing lidar measurements of thinner and higher clouds. In addition, the measurements suggest a change in cloud morphology at Jungfraujoch above ∼ 13 km, possibly because high particle number densities form in the observed cirrus clouds when many ice crystals nucleate in the high supersaturations following rapid uplifts in lee waves above mountainous terrain. 
The retrieved optical properties are used as input for a radiative transfer model to estimate the net cloud radiative forcing, CRFNET, for the analysed cirrus clouds. All cirrus detected here have a positive CRFNET. This confirms that these thin, high cirrus have a warming effect on the Earth's climate, whereas cooling clouds typically have cloud edges too low in altitude to satisfy the FLICA criterion of temperatures below −38 °C. We find CRFNET = 0.9 W m−2 for Jungfraujoch and 1.0 W m−2 (1.7 W m−2) for Zürich (Jülich). Further, we calculate that subvisible cirrus (τ < 0.03) contribute about 5 %, thin cirrus (0.03 < τ < 0.3) about 45 %, and opaque cirrus (0.3 < τ) about 50 % of the total cirrus radiative forcing.

2016 ◽  
Vol 16 (12) ◽  
pp. 7605-7621 ◽  
Author(s):  
Erika Kienast-Sjögren ◽  
Christian Rolf ◽  
Patric Seifert ◽  
Ulrich K. Krieger ◽  
Bei P. Luo ◽  
...  

Abstract. Cirrus, i.e., high, thin clouds that are fully glaciated, play an important role in the Earth's radiation budget as they interact with both long- and shortwave radiation and affect the water vapor budget of the upper troposphere and stratosphere. Here, we present a climatology of midlatitude cirrus clouds measured with the same type of ground-based lidar at three midlatitude research stations: at the Swiss high alpine Jungfraujoch station (3580 m a.s.l.), in Zürich (Switzerland, 510 m a.s.l.), and in Jülich (Germany, 100 m a.s.l.). The analysis is based on 13 000 h of measurements from 2010 to 2014. To automatically evaluate this extensive data set, we have developed the Fast LIdar Cirrus Algorithm (FLICA), which combines a pixel-based cloud-detection scheme with the classic lidar evaluation techniques. We find mean cirrus optical depths of 0.12 on Jungfraujoch and of 0.14 and 0.17 in Zürich and Jülich, respectively. Above Jungfraujoch, subvisible cirrus clouds (τ < 0.03) have been observed during 6 % of the observation time, whereas above Zürich and Jülich fewer clouds of that type were observed. Cirrus have been observed up to altitudes of 14.4 km a.s.l. above Jungfraujoch, whereas they have only been observed to about 1 km lower at the other stations. These features highlight the advantage of the high-altitude station Jungfraujoch, which is often in the free troposphere above the polluted boundary layer, thus enabling lidar measurements of thinner and higher clouds. In addition, the measurements suggest a change in cloud morphology at Jungfraujoch above ∼ 13 km, possibly because high particle number densities form in the observed cirrus clouds, when many ice crystals nucleate in the high supersaturations following rapid uplifts in lee waves above mountainous terrain. The retrieved optical properties are used as input for a radiative transfer model to estimate the net cloud radiative forcing, CRFNET, for the analyzed cirrus clouds. 
All cirrus detected here have a positive CRFNET. This confirms that these thin, high cirrus have a warming effect on the Earth's climate, whereas cooling clouds typically have cloud edges too low in altitude to satisfy the FLICA criterion of temperatures below −38 °C. We find CRFNET = 0.9 W m−2 for Jungfraujoch and 1.0 W m−2 (1.7 W m−2) for Zürich (Jülich). Further, we calculate that subvisible cirrus (τ < 0.03) contribute about 5 %, thin cirrus (0.03 < τ < 0.3) about 45 %, and opaque cirrus (0.3 < τ) about 50 % of the total cirrus radiative forcing.
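The optical-depth classes used above (subvisible, thin, opaque) follow fixed τ thresholds. A minimal sketch of that bookkeeping, with a function name of our own (`classify_cirrus`, not part of FLICA):

```python
# Minimal sketch (function name is ours, not FLICA's) of the fixed
# optical-depth thresholds quoted in the abstract.
def classify_cirrus(tau):
    """Return the cirrus class for optical depth tau."""
    if tau < 0.03:
        return "subvisible"  # tau < 0.03
    if tau < 0.3:
        return "thin"        # 0.03 <= tau < 0.3
    return "opaque"          # tau >= 0.3

# The mean optical depths reported for the three stations all fall in the
# "thin" class:
means = {"Jungfraujoch": 0.12, "Zurich": 0.14, "Julich": 0.17}
classes = {station: classify_cirrus(tau) for station, tau in means.items()}
```

Note that the class boundaries are conventions of the paper, so the thresholds above simply restate the abstract's definitions.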


2005 ◽  
Vol 5 (10) ◽  
pp. 2847-2867 ◽  
Author(s):  
N. Hatzianastassiou ◽  
C. Matsoukas ◽  
A. Fotiadi ◽  
K. G. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. The monthly mean shortwave (SW) radiation budget at the Earth's surface (SRB) was computed at 2.5-degree longitude-latitude resolution for the 17-year period from 1984 to 2000, using a radiative transfer model accounting for the key physical parameters that determine the surface SRB, and long-term climatological data from the International Satellite Cloud Climatology Project (ISCCP-D2). The model input data were supplemented by data from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP-NCAR) and European Centre for Medium-Range Weather Forecasts (ECMWF) Global Reanalysis projects, and other global databases such as the TIROS Operational Vertical Sounder (TOVS) and the Global Aerosol Data Set (GADS). The model surface radiative fluxes were validated against surface measurements from 22 stations of the Baseline Surface Radiation Network (BSRN) covering the years 1992–2000, and from 700 stations of the Global Energy Balance Archive (GEBA) covering the period 1984–2000. The model is in good agreement with BSRN and GEBA, with negative biases of 14 and 6.5 W m−2, respectively. The model is able to reproduce interesting features of the seasonal and geographical variation of the surface SW fluxes at the global scale. Based on the 17-year average model results, the global mean SW downward surface radiation (DSR) is 171.6 W m−2, whereas the net downward (or absorbed) surface SW radiation is 149.4 W m−2, values that correspond to 50.2 and 43.7% of the incoming SW radiation at the top of the Earth's atmosphere. These values imply a long-term surface albedo of 12.9%. Significant increasing trends in DSR and net DSR fluxes were found, equal to 4.1 and 3.7 W m−2, respectively, over the 1984–2000 period (equivalent to 2.4 and 2.2 W m−2 per decade), indicating an increasing surface solar radiative heating. 
This surface SW radiative heating is primarily attributed to clouds, especially low-level clouds, and secondarily to other parameters such as total precipitable water. The surface solar heating occurs mainly in the period starting from the early 1990s, in contrast to the decreasing trend in DSR through the late 1980s. The computed global mean DSR and net DSR flux anomalies were found to range within ±8 and ±6 W m−2, respectively, with signals from El Niño and La Niña events and the Pinatubo eruption, whereas significant positive anomalies occurred in the period 1992–2000.
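The per-decade rates quoted above are linear-trend conversions (e.g. 4.1 W m−2 over 17 years ≈ 2.4 W m−2 per decade). A hedged illustration of the arithmetic, using synthetic monthly anomalies rather than the study's data:

```python
import numpy as np

# Hedged illustration (synthetic data, NOT the study's): fit a least-squares
# linear trend to a monthly anomaly series, then rescale the slope to the
# full-period and per-decade rates quoted in the abstract.
rng = np.random.default_rng(0)
months = np.arange(17 * 12, dtype=float)     # monthly time axis, 1984-2000
slope_true = 4.1 / (17 * 12)                 # imposed trend: 4.1 W m-2 / 17 yr
anomaly = slope_true * months + rng.normal(0.0, 1.0, months.size)

slope = np.polyfit(months, anomaly, 1)[0]    # estimated trend per month
trend_total = slope * 17 * 12                # change over the full period
trend_decade = slope * 120                   # change per decade (120 months)
```

With the imposed 4.1 W m−2 per 17 yr trend, the recovered per-decade rate comes out near the abstract's 2.4 W m−2 per decade.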


2004 ◽  
Vol 4 (5) ◽  
pp. 1217-1235 ◽  
Author(s):  
N. Hatzianastassiou ◽  
A. Fotiadi ◽  
Ch. Matsoukas ◽  
K. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. The mean monthly shortwave (SW) radiation budget at the top of the atmosphere (TOA) was computed at 2.5° longitude-latitude resolution for the 14-year period from 1984 to 1997, using a radiative transfer model with long-term climatological data from the International Satellite Cloud Climatology Project (ISCCP-D2), supplemented by data from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP-NCAR) Global Reanalysis project and other global databases such as the TIROS Operational Vertical Sounder (TOVS) and the Global Aerosol Data Set (GADS). The model radiative fluxes at TOA were validated against Earth Radiation Budget Experiment (ERBE) S4 scanner satellite data (1985–1989). The model is able to predict the seasonal and geographical variation of SW TOA fluxes. On a mean annual and global basis, the model is in very good agreement with ERBE, overestimating the outgoing SW radiation at TOA (OSR) by 0.93 W m−2 (or by 0.92%), within the ERBE uncertainties. At the pixel level, the OSR differences between model and ERBE are mostly within ±10 W m−2, with ±5 W m−2 over extended regions, while some geographic areas show differences of up to 40 W m−2, associated with uncertainties in cloud properties and surface albedo. The 14-year average model results give a planetary albedo of 29.6% and a TOA OSR flux of 101.2 W m−2. A significant linearly decreasing trend in OSR and planetary albedo was found, equal to 2.3 W m−2 and 0.6% (in absolute values), respectively, over the 14-year period (from January 1984 to December 1997), indicating an increasing solar planetary warming. This planetary SW radiative heating occurs in the tropical and subtropical areas (20° S–20° N), with clouds being the most likely cause. The computed global mean OSR anomaly ranges within ±4 W m−2, with signals from El Niño and La Niña events and the Pinatubo eruption, whereas significant negative OSR anomalies, starting from 1992, are also detected.


2004 ◽  
Vol 4 (3) ◽  
pp. 2671-2726 ◽  
Author(s):  
N. Hatzianastassiou ◽  
A. Fotiadi ◽  
Ch. Matsoukas ◽  
K. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. The mean monthly shortwave (SW) radiation budget at the top of the atmosphere (TOA) was computed at 2.5° longitude-latitude resolution for the 14-year period from 1984 to 1997, using a radiative transfer model with long-term climatological data from the International Satellite Cloud Climatology Project (ISCCP-D2), supplemented by data from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP-NCAR) Global Reanalysis project and other global databases such as the TIROS Operational Vertical Sounder (TOVS) and the Global Aerosol Data Set (GADS). The model radiative fluxes at TOA were validated against Earth Radiation Budget Experiment (ERBE) S4 scanner satellite data (1985–1989). The model is able to predict the seasonal and geographical variation of SW TOA fluxes. On a mean annual and global basis, the model is in very good agreement with ERBE, overestimating the outgoing SW radiation at TOA (OSR) by 0.93 W m−2 (or by 0.92%), within the ERBE uncertainties. At the pixel level, the OSR differences between model and ERBE are mostly within ±10 W m−2, with ±5 W m−2 over extended regions, while some geographic areas show differences of up to 40 W m−2, associated with uncertainties in cloud properties and surface albedo. The 14-year average model results give a planetary albedo of 29.6% and a TOA OSR flux of 101.2 W m−2. A significant linearly decreasing trend in OSR and planetary albedo was found, equal to 2.3 W m−2 and 0.6% (in absolute values), respectively, over the 14-year period (from January 1984 to December 1997), indicating an increasing solar planetary warming. This planetary SW radiative heating occurs in the tropical and subtropical areas (20° S–20° N), with clouds being the most likely cause. The computed global mean OSR anomaly ranges within ±4 W m−2, with signals from El Niño and La Niña events and the Pinatubo eruption, whereas significant negative OSR anomalies, starting from 1992, are also detected.


2014 ◽  
Vol 53 (4) ◽  
pp. 1046-1058 ◽  
Author(s):  
Yong-Keun Lee ◽  
Jason A. Otkin ◽  
Thomas J. Greenwald

Abstract. Synthetic infrared brightness temperatures (BTs) derived from a high-resolution Weather Research and Forecasting (WRF) model simulation over the contiguous United States are compared with Moderate Resolution Imaging Spectroradiometer (MODIS) observations to assess the accuracy of the model-simulated cloud field. A sophisticated forward radiative transfer model (RTM) is used to compute the synthetic MODIS observations. A detailed comparison of synthetic and real MODIS 11-μm BTs revealed that the model simulation realistically depicts the spatial characteristics of the observed cloud features. Brightness temperature differences (BTDs) computed for 8.5–11 and 11–12 μm indicate that the combined numerical model–RTM system realistically treats the radiative properties associated with optically thin cirrus clouds. For instance, much larger 11–12-μm BTDs occurred within thin clouds surrounding optically thicker, mesoscale cloud features. Although the simulated and observed BTD probability distributions for optically thin cirrus clouds had a similar range of positive values, the synthetic 11-μm BTs were much colder than observed. Previous studies have shown that MODIS cloud optical thickness values tend to be too large for thin cirrus clouds, which contributed to the apparent cold BT bias in the simulated thin cirrus clouds. Errors are substantially reduced after accounting for the observed optical thickness bias, which indicates that the thin cirrus clouds are realistically depicted during the model simulation.
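The split-window brightness temperature differences (BTDs) compared above are simple channel differences. A hedged sketch with hypothetical pixel values (the variable names and the thin-cirrus screen are ours, not MODIS products or the study's method):

```python
import numpy as np

# Illustrative sketch only: split-window BTDs of the kind compared in the
# study. Pixel values and the thin-cirrus screen are hypothetical.
bt_085 = np.array([228.0, 265.0, 292.0])  # 8.5-um brightness temperature (K)
bt_110 = np.array([225.0, 262.0, 291.5])  # 11-um brightness temperature (K)
bt_120 = np.array([222.5, 259.5, 291.2])  # 12-um brightness temperature (K)

btd_085_110 = bt_085 - bt_110  # 8.5-11 um BTD
btd_110_120 = bt_110 - bt_120  # 11-12 um BTD; positive for thin ice cloud

# A simple (hypothetical) screen: a cold 11-um BT plus a clearly positive
# 11-12 um split-window difference suggests optically thin cirrus.
thin_cirrus = (bt_110 < 273.0) & (btd_110_120 > 1.0)
```

The third pixel, warm with a near-zero BTD, is left unflagged, mirroring the abstract's point that large positive 11–12-μm BTDs concentrate in thin cloud.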


2009 ◽  
Vol 66 (12) ◽  
pp. 3721-3731 ◽  
Author(s):  
Joonsuk Lee ◽  
Ping Yang ◽  
Andrew E. Dessler ◽  
Bo-Cai Gao ◽  
Steven Platnick

Abstract. To understand the radiative impact of tropical thin cirrus clouds, the frequency of occurrence and optical depths of these clouds have been derived. “Thin” cirrus clouds are defined here as being those that are not detected by the operational Moderate Resolution Imaging Spectroradiometer (MODIS) cloud mask, corresponding to an optical depth value of approximately 0.3 or smaller, but that are detectable in terms of the cirrus reflectance product based on the MODIS 1.375-μm channel. With such a definition, thin cirrus clouds were present in more than 40% of the pixels flagged as “clear sky” by the operational MODIS cloud mask algorithm. It is shown that these thin cirrus clouds are frequently observed in deep convective regions in the western Pacific. Thin cirrus optical depths were derived from the cirrus reflectance product. Regions of significant cloud fraction and large optical depths were observed in the Northern Hemisphere during the boreal spring and summer and moved southward during the boreal autumn and winter. The radiative effects of tropical thin cirrus clouds were studied on the basis of the retrieved cirrus optical depths, the atmospheric profiles derived from the Atmospheric Infrared Sounder (AIRS) observations, and a radiative transfer model in conjunction with a parameterization of ice cloud spectral optical properties. To understand how these clouds regulate the radiation field in the atmosphere, the instantaneous net fluxes at the top of the atmosphere (TOA) and at the surface were calculated. The present study shows positive and negative net forcings at the TOA and at the surface, respectively. The positive (negative) net forcing at the TOA (surface) is due to the dominance of longwave (shortwave) forcing. Both the TOA and surface forcings are in a range of 0–20 W m−2, depending on the optical depths of thin cirrus clouds.
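The sign argument above (longwave dominance at the TOA, shortwave dominance at the surface) follows from defining cloud radiative forcing as the cloudy-sky minus clear-sky net flux, evaluated separately per band. A minimal sketch with hypothetical flux values chosen only to reproduce the described signs:

```python
# Hedged sketch of cloud-radiative-forcing bookkeeping: forcing is the
# cloudy-sky net flux minus the clear-sky net flux, shortwave and longwave
# separately. All flux values below are hypothetical.
def net_forcing(sw_cloudy, sw_clear, lw_cloudy, lw_clear):
    """Net cloud radiative forcing (W m-2), downward positive."""
    sw_forcing = sw_cloudy - sw_clear  # clouds reflect sunlight -> negative
    lw_forcing = lw_cloudy - lw_clear  # clouds trap thermal IR -> positive
    return sw_forcing + lw_forcing

# TOA: longwave trapping outweighs shortwave reflection -> net warming.
toa = net_forcing(sw_cloudy=337.0, sw_clear=340.0,
                  lw_cloudy=-232.0, lw_clear=-240.0)
# Surface: shortwave dimming outweighs extra downwelling IR -> net cooling.
sfc = net_forcing(sw_cloudy=178.0, sw_clear=185.0,
                  lw_cloudy=-52.0, lw_clear=-54.0)
```

Whichever band's forcing dominates in magnitude sets the sign, which is exactly the contrast the abstract reports between TOA and surface.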


2005 ◽  
Vol 5 (4) ◽  
pp. 4545-4597 ◽  
Author(s):  
N. Hatzianastassiou ◽  
C. Matsoukas ◽  
A. Fotiadi ◽  
K. G. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. The monthly mean shortwave (SW) radiation budget at the Earth's surface (SRB) was computed at 2.5-degree longitude-latitude resolution for the 17-year period from 1984 to 2000, using a radiative transfer model accounting for the key physical parameters that determine the surface SRB, and long-term climatological data from the International Satellite Cloud Climatology Project (ISCCP-D2). The model input data were supplemented by data from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP-NCAR) and European Centre for Medium-Range Weather Forecasts (ECMWF) Global Reanalysis projects, and other global databases such as the TIROS Operational Vertical Sounder (TOVS) and the Global Aerosol Data Set (GADS). The model surface radiative fluxes were validated against surface measurements from 22 stations of the Baseline Surface Radiation Network (BSRN) covering the years 1992–2000, and from 700 stations of the Global Energy Balance Archive (GEBA) covering the period 1984–2000. The model is in very good agreement with BSRN and GEBA, with negative biases of 14 and 6.5 W m−2, respectively. The model is able to reproduce interesting features of the seasonal and geographical variation of the surface SW fluxes at the global scale, which is not possible with surface measurements. Based on the 17-year average model results, the global mean SW downward surface radiation (DSR) is 171.6 W m−2, whereas the net downward (or absorbed) surface SW radiation is 149.4 W m−2, values that correspond to 50.2 and 43.7% of the incoming SW radiation at the top of the Earth's atmosphere. These values imply a long-term surface albedo of 12.9%. Significant increasing trends in DSR and net DSR fluxes were found, equal to 4.1 and 3.7 W m−2, respectively, over the 1984–2000 period (equivalent to 2.4 and 2.2 W m−2 per decade), indicating an increasing surface solar radiative heating. 
This surface SW radiative heating is primarily attributed to clouds, especially low-level clouds, and secondarily to other parameters such as total precipitable water. The surface solar heating occurs mainly in the period starting from the early 1990s, in contrast to the commonly reported decreasing trend in DSR through the late 1980s, which is also found in our study. The computed global mean DSR and net DSR flux anomalies were found to range within ±8 and ±6 W m−2, respectively, with signals from El Niño and La Niña events and the Pinatubo eruption, whereas significant positive anomalies occurred in the period 1992–2000.


2019 ◽  
Author(s):  
Maria José Granados-Muñoz ◽  
Michaël Sicard ◽  
Nikolaos Papagiannopoulos ◽  
Rubén Barragán ◽  
Juan Antonio Bravo-Aranda ◽  
...  

Abstract. A demonstration study examining the feasibility of retrieving dust radiative effects from combined satellite data from MODIS (Moderate Resolution Imaging Spectroradiometer), CERES (Clouds and the Earth’s Radiant Energy System) and CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization) lidar vertical profiles along their orbit is presented. The radiative transfer model GAME (Global Atmospheric Model) is used to estimate the shortwave and longwave dust radiative effects below the CALIPSO (Cloud-Aerosol Lidar and Infrared Pathfinder Satellite) orbit, assuming an aerosol parameterization based on the CALIOP vertical distribution at a horizontal resolution of 5 km and additional AERONET (Aerosol Robotic Network) data. Two study cases are analysed: a strong long-range transport mineral dust event (AOD = 0.52) that originated in the Sahara Desert and reached the United Kingdom, and a weaker event (AOD = 0.16) affecting Eastern Europe. The obtained radiative fluxes are first validated in terms of radiative forcing efficiency at a single point with space-time co-located lidar ground-based measurements from EARLINET (European Aerosol Research Lidar Network) stations below the orbit. The methodology is then applied to the full orbit. The obtained results indicate that the radiative effects depend strongly on the aerosol load, highlighting the need for accurate AOD measurements in forcing studies, and on the surface albedo. The calculated dust radiative effects and heating rates below the orbits are in good agreement with previous studies of mineral dust, with the forcing efficiency obtained at the surface ranging between −80.3 and −63.0 W m−2 for the weaker event and between −119.1 and −79.3 W m−2 for the strong one. These results demonstrate the validity of the presented method for retrieving accurate 2-D radiative properties with large spatial and temporal coverage.
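The forcing efficiencies quoted above are forcings normalised by aerosol optical depth (AOD). A minimal sketch of that normalisation (the function name and the sample forcing value are ours, chosen for illustration only):

```python
# Minimal sketch of "radiative forcing efficiency": the aerosol radiative
# forcing divided by the aerosol optical depth. The sample forcing value
# below is hypothetical, not a result from the study.
def forcing_efficiency(forcing_wm2, aod):
    """Forcing per unit AOD (W m-2 per unit optical depth)."""
    if aod <= 0.0:
        raise ValueError("AOD must be positive")
    return forcing_wm2 / aod

# E.g. a hypothetical surface forcing of -50 W m-2 during the strong event
# (AOD = 0.52) corresponds to roughly -96 W m-2 per unit AOD.
eff = forcing_efficiency(-50.0, 0.52)
```

Normalising by AOD is what makes efficiencies comparable between the strong and weak events despite their very different aerosol loads.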


Atmosphere ◽  
2018 ◽  
Vol 9 (7) ◽  
pp. 271 ◽  
Author(s):  
Erica Alston ◽  
Irina Sokolik

Aerosols and their radiative properties play an integral part in understanding Earth’s climate. It is becoming increasingly common to examine aerosol radiative impacts on a regional scale. The primary goal of this research is to explore the impacts of regional aerosol forcing at the surface and top-of-atmosphere (TOA) in the south-eastern U.S. by using a 1-D radiative transfer model. By using test cases that are representative of conditions common to this region, an estimate of aerosol forcing can be compared to other results. Speciation data and aerosol layer analysis provide the basis for the modeling. Results indicate that the region experiences TOA cooling year-round: the winter has TOA forcings between −2.8 and −5 W/m2, and the summer has forcings between −5 and −15 W/m2 for typical atmospheric conditions. Surface-level forcing efficiencies are greater than those estimated for the TOA for all cases considered, i.e., urban and non-urban background conditions. One potential implication of this research is that regional aerosol mixtures have effects that are not well captured in global climate model estimates, which has implications for a warming climate where all radiative inputs are not well characterized, thus increasing the ambiguity in determining regional climate impacts.


2005 ◽  
Vol 5 (1) ◽  
pp. 455-480 ◽  
Author(s):  
A. Fotiadi ◽  
N. Hatzianastassiou ◽  
C. Matsoukas ◽  
K. G. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. A decadal-scale trend in the tropical radiative energy budget has recently been observed by satellites, which, however, is not reproduced by climate models. In the present study, we have computed the outgoing shortwave radiation (OSR) at the top of the atmosphere (TOA) at 2.5° longitude-latitude resolution and on a mean monthly basis for the 17-year period 1984–2000, by using a deterministic solar radiative transfer model and cloud climatological data from the International Satellite Cloud Climatology Project (ISCCP) D2 database. Atmospheric temperature and humidity vertical profiles, as well as other supplementary data, were taken from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP/NCAR) and the European Centre for Medium-Range Weather Forecasts (ECMWF) Global Reanalysis Projects, while other global databases, such as the Global Aerosol Data Set (GADS) for aerosol data, were also used. Anomaly time series for the mean monthly pixel-level OSR fluxes, as well as for the key physical parameters, were constructed. A significant decreasing trend in OSR anomalies, starting mainly from the late 1980s, was found in tropical and subtropical regions (30° S–30° N), indicating an increase in solar planetary heating equal to 3.2 ± 0.5 W m−2 over the 17-year period from 1984 to 2000, or 1.9 ± 0.3 W m−2 per decade, reproducing well the features recorded by satellite observations, in contrast to climate model results. The model-computed trend is in good agreement with the corresponding linear decrease of 3.7 ± 0.5 W m−2 (or 2.5 ± 0.4 W m−2 per decade) in tropical mean OSR anomalies derived from ERBE S-10N non-scanner data. An attempt was made to identify the physical processes responsible for the decreasing trend in tropical mean OSR. 
A detailed correlation analysis using pixel-level anomalies of OSR flux and ISCCP cloud cover over the entire tropical and subtropical region (30° S–30° N) gave a correlation coefficient of 0.79, indicating that decreasing cloud cover is the main reason for the tropical OSR trend. According to the ISCCP-D2 data derived from the combined visible/infrared (VIS/IR) analysis, the tropical cloud cover has decreased by 6.6 ± 0.2% per decade in relative terms. A detailed analysis of the interannual and long-term variability of the various parameters determining the OSR at TOA has shown that the most important contribution to the observed OSR trend comes from a decrease in low-level cloud cover over the period 1984–2000, followed by decreases in middle- and high-level cloud cover. Opposite but small trends are introduced by increases in the cloud scattering optical depth of low and middle clouds.
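The anomaly-correlation procedure described above (remove the mean annual cycle, then correlate the pixel-level anomalies) can be sketched as follows. The synthetic series are illustrative only and do not reproduce the study's r = 0.79:

```python
import numpy as np

# Hedged sketch of the anomaly/correlation procedure: subtract each calendar
# month's climatological mean from two monthly series, then correlate the
# resulting anomalies. Synthetic data, NOT the study's.
def monthly_anomalies(series):
    """Subtract each calendar month's climatological mean (whole years only)."""
    series = np.asarray(series, dtype=float)
    clim = series.reshape(-1, 12).mean(axis=0)  # 12 monthly means
    return series - np.tile(clim, series.size // 12)

rng = np.random.default_rng(1)
n_months = 17 * 12                                  # 1984-2000, monthly
cloud = rng.normal(0.0, 1.0, n_months)              # cloud-cover variations
osr = 0.8 * cloud + rng.normal(0.0, 0.5, n_months)  # OSR tracks cloud cover

r = np.corrcoef(monthly_anomalies(osr), monthly_anomalies(cloud))[0, 1]
```

Deseasonalising first matters because a shared annual cycle would otherwise inflate the correlation regardless of any physical link between the two fields.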

