Long-term global distribution of Earth's shortwave radiation budget at the top of atmosphere

2004 ◽  
Vol 4 (3) ◽  
pp. 2671-2726 ◽  
Author(s):  
N. Hatzianastassiou ◽  
A. Fotiadi ◽  
Ch. Matsoukas ◽  
K. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. The mean monthly shortwave (SW) radiation budget at the top of atmosphere (TOA) was computed at 2.5° longitude-latitude resolution for the 14-year period from 1984 to 1997, using a radiative transfer model with long-term climatological data from the International Satellite Cloud Climatology Project (ISCCP-D2) supplemented by data from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP-NCAR) Global Reanalysis project, and other global databases such as the TIROS Operational Vertical Sounder (TOVS) and the Global Aerosol Data Set (GADS). The model radiative fluxes at TOA were validated against Earth Radiation Budget Experiment (ERBE) S4 scanner satellite data (1985–1989). The model is able to predict the seasonal and geographical variation of SW TOA fluxes. On a mean annual and global basis, the model is in very good agreement with ERBE, overestimating the outgoing SW radiation at TOA (OSR) by 0.93 Wm−2 (or by 0.92%), within the ERBE uncertainties. At pixel level, the OSR differences between model and ERBE are mostly within ±10 Wm−2, with ±5 Wm−2 over extended regions, while some geographic areas show differences of up to 40 Wm−2, associated with uncertainties in cloud properties and surface albedo. The 14-year average model results give a planetary albedo of 29.6% and a TOA OSR flux of 101.2 Wm−2. A significant linearly decreasing trend in OSR and planetary albedo was found, equal to 2.3 Wm−2 and 0.6%, respectively, over the 14-year period (from January 1984 to December 1997), indicating an increasing solar planetary warming. This planetary SW radiative heating occurs in the tropical and sub-tropical areas (20° S–20° N), with clouds being the most likely cause. The computed global mean OSR anomaly ranges within ±4 Wm−2, with signals from El Niño and La Niña events or the Pinatubo eruption, whereas significant negative OSR anomalies, starting in 1992, are also detected.
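
As a quick plausibility check of the quoted global means (not part of the original study), the reported planetary albedo follows directly from the 14-year mean OSR and the globally averaged incoming solar flux at TOA; a minimal Python sketch, assuming a round solar constant of 1367 Wm−2 rather than the paper's exact value:

    # Consistency check of the reported global means; the solar constant below is
    # an assumed round value, not necessarily the one used in the paper.
    S0 = 1367.0                   # W m-2, assumed solar constant
    F_in = S0 / 4.0               # global mean incoming SW at TOA, ~341.8 W m-2
    OSR = 101.2                   # W m-2, 14-year mean outgoing SW (from the abstract)

    albedo = OSR / F_in           # planetary albedo
    print(f"planetary albedo = {albedo:.1%}")                    # ~29.6 %, as quoted

    trend_total = -2.3            # W m-2 over Jan 1984 - Dec 1997
    print(f"OSR trend = {trend_total / 14.0 * 10.0:+.2f} W m-2 per decade")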

2004 ◽  
Vol 4 (5) ◽  
pp. 1217-1235 ◽  
Author(s):  
N. Hatzianastassiou ◽  
A. Fotiadi ◽  
Ch. Matsoukas ◽  
K. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. The mean monthly shortwave (SW) radiation budget at the top of atmosphere (TOA) was computed at 2.5° longitude-latitude resolution for the 14-year period from 1984 to 1997, using a radiative transfer model with long-term climatological data from the International Satellite Cloud Climatology Project (ISCCP-D2) supplemented by data from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP-NCAR) Global Reanalysis project, and other global databases such as the TIROS Operational Vertical Sounder (TOVS) and the Global Aerosol Data Set (GADS). The model radiative fluxes at TOA were validated against Earth Radiation Budget Experiment (ERBE) S4 scanner satellite data (1985–1989). The model is able to predict the seasonal and geographical variation of SW TOA fluxes. On a mean annual and global basis, the model is in very good agreement with ERBE, overestimating the outgoing SW radiation at TOA (OSR) by 0.93 Wm−2 (or by 0.92%), within the ERBE uncertainties. At pixel level, the OSR differences between model and ERBE are mostly within ±10 Wm−2, with ±5 Wm−2 over extended regions, while some geographic areas show differences of up to 40 Wm−2, associated with uncertainties in cloud properties and surface albedo. The 14-year average model results give a planetary albedo of 29.6% and a TOA OSR flux of 101.2 Wm−2. A significant linearly decreasing trend in OSR and planetary albedo was found, equal to 2.3 Wm−2 and 0.6% (in absolute values), respectively, over the 14-year period (from January 1984 to December 1997), indicating an increasing solar planetary warming. This planetary SW radiative heating occurs in the tropical and sub-tropical areas (20° S–20° N), with clouds being the most likely cause. The computed global mean OSR anomaly ranges within ±4 Wm−2, with signals from El Niño and La Niña events or the Pinatubo eruption, whereas significant negative OSR anomalies, starting in 1992, are also detected.


2005 ◽  
Vol 5 (10) ◽  
pp. 2847-2867 ◽  
Author(s):  
N. Hatzianastassiou ◽  
C. Matsoukas ◽  
A. Fotiadi ◽  
K. G. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. The monthly mean shortwave (SW) radiation budget at the Earth's surface (SRB) was computed at 2.5-degree longitude-latitude resolution for the 17-year period from 1984 to 2000, using a radiative transfer model accounting for the key physical parameters that determine the surface SRB, and long-term climatological data from the International Satellite Cloud Climatology Project (ISCCP-D2). The model input data were supplemented by data from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP-NCAR) and European Center for Medium Range Weather Forecasts (ECMWF) Global Reanalysis projects, and other global databases such as the TIROS Operational Vertical Sounder (TOVS) and the Global Aerosol Data Set (GADS). The model surface radiative fluxes were validated against surface measurements from 22 stations of the Baseline Surface Radiation Network (BSRN) covering the years 1992–2000, and from 700 stations of the Global Energy Balance Archive (GEBA), covering the period 1984–2000. The model is in good agreement with BSRN and GEBA, with negative biases of 14 and 6.5 Wm−2, respectively. The model is able to reproduce interesting features of the seasonal and geographical variation of the surface SW fluxes at global scale. Based on the 17-year average model results, the global mean SW downward surface radiation (DSR) is equal to 171.6 Wm−2, whereas the net downward (or absorbed) surface SW radiation is equal to 149.4 Wm−2, values that correspond to 50.2% and 43.7% of the incoming SW radiation at the top of the Earth's atmosphere. These values imply a long-term surface albedo of 12.9%. Significant increasing trends in DSR and net DSR fluxes were found, equal to 4.1 and 3.7 Wm−2, respectively, over the 1984–2000 period (equivalent to 2.4 and 2.2 Wm−2 per decade), indicating an increasing surface solar radiative heating. This surface SW radiative heating is primarily attributed to clouds, especially low-level, and secondarily to other parameters such as total precipitable water. The surface solar heating occurs mainly in the period starting from the early 1990s, in contrast to a decreasing trend in DSR through the late 1980s. The computed global mean DSR and net DSR flux anomalies were found to range within ±8 and ±6 Wm−2, respectively, with signals from El Niño and La Niña events, and the Pinatubo eruption, whereas significant positive anomalies have occurred in the period 1992–2000.
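
The quoted fractions and the implied surface albedo can be reproduced from the abstract's global means alone; a minimal Python sketch, assuming a globally averaged incoming TOA flux of S0/4 with an assumed solar constant of 1367 Wm−2:

    # Consistency check of the reported surface SW budget; the incoming TOA flux
    # is an assumed value (S0/4), not taken from the paper itself.
    F_toa_in = 1367.0 / 4.0        # W m-2, global mean incoming SW at TOA
    DSR = 171.6                    # W m-2, downward SW at the surface (abstract)
    DSR_net = 149.4                # W m-2, absorbed SW at the surface (abstract)

    print(f"DSR / TOA incoming     = {DSR / F_toa_in:.1%}")       # ~50.2 %
    print(f"net DSR / TOA incoming = {DSR_net / F_toa_in:.1%}")   # ~43.7 %
    print(f"implied surface albedo = {1.0 - DSR_net / DSR:.1%}")  # ~12.9 %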


2005 ◽  
Vol 5 (4) ◽  
pp. 4545-4597 ◽  
Author(s):  
N. Hatzianastassiou ◽  
C. Matsoukas ◽  
A. Fotiadi ◽  
K. G. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. The monthly mean shortwave (SW) radiation budget at the Earth's surface (SRB) was computed at 2.5-degree longitude-latitude resolution for the 17-year period from 1984 to 2000, using a radiative transfer model accounting for the key physical parameters that determine the surface SRB, and long-term climatological data from the International Satellite Cloud Climatology Project (ISCCP-D2). The model input data were supplemented by data from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP-NCAR) and European Center for Medium Range Weather Forecasts (ECMWF) Global Reanalysis projects, and other global databases such as the TIROS Operational Vertical Sounder (TOVS) and the Global Aerosol Data Set (GADS). The model surface radiative fluxes were validated against surface measurements from 22 stations of the Baseline Surface Radiation Network (BSRN) covering the years 1992–2000, and from 700 stations of the Global Energy Balance Archive (GEBA), covering the period 1984–2000. The model is in very good agreement with BSRN and GEBA, with negative biases of 14 and 6.5 Wm−2, respectively. The model is able to reproduce interesting features of the seasonal and geographical variation of the surface SW fluxes at global scale, which is not possible with surface measurements. Based on the 17-year average model results, the global mean SW downward surface radiation (DSR) is equal to 171.6 Wm−2, whereas the net downward (or absorbed) surface SW radiation is equal to 149.4 Wm−2, values that correspond to 50.2% and 43.7% of the incoming SW radiation at the top of the Earth's atmosphere. These values imply a long-term surface albedo of 12.9%. Significant increasing trends in DSR and net DSR fluxes were found, equal to 4.1 and 3.7 Wm−2, respectively, over the 1984–2000 period (equivalent to 2.4 and 2.2 Wm−2 per decade), indicating an increasing surface solar radiative heating. This surface SW radiative heating is primarily attributed to clouds, especially low-level, and secondarily to other parameters such as total precipitable water. The surface solar heating occurs mainly in the period starting from the early 1990s, in contrast to the commonly reported decreasing trend in DSR through the late 1980s, which is also found in our study. The computed global mean DSR and net DSR flux anomalies were found to range within ±8 and ±6 Wm−2, respectively, with signals from El Niño and La Niña events, and the Pinatubo eruption, whereas significant positive anomalies have occurred in the period 1992–2000.


2016 ◽  
Vol 16 (12) ◽  
pp. 7605-7621 ◽  
Author(s):  
Erika Kienast-Sjögren ◽  
Christian Rolf ◽  
Patric Seifert ◽  
Ulrich K. Krieger ◽  
Bei P. Luo ◽  
...  

Abstract. Cirrus, i.e., high, thin clouds that are fully glaciated, play an important role in the Earth's radiation budget as they interact with both long- and shortwave radiation and affect the water vapor budget of the upper troposphere and stratosphere. Here, we present a climatology of midlatitude cirrus clouds measured with the same type of ground-based lidar at three midlatitude research stations: at the Swiss high alpine Jungfraujoch station (3580 m a.s.l.), in Zürich (Switzerland, 510 m a.s.l.), and in Jülich (Germany, 100 m a.s.l.). The analysis is based on 13 000 h of measurements from 2010 to 2014. To automatically evaluate this extensive data set, we have developed the Fast LIdar Cirrus Algorithm (FLICA), which combines a pixel-based cloud-detection scheme with the classic lidar evaluation techniques. We find mean cirrus optical depths of 0.12 on Jungfraujoch and of 0.14 and 0.17 in Zürich and Jülich, respectively. Above Jungfraujoch, subvisible cirrus clouds (τ < 0.03) have been observed during 6 % of the observation time, whereas above Zürich and Jülich fewer clouds of that type were observed. Cirrus have been observed up to altitudes of 14.4 km a.s.l. above Jungfraujoch, whereas at the other stations they have only been observed up to about 1 km lower. These features highlight the advantage of the high-altitude station Jungfraujoch, which is often in the free troposphere above the polluted boundary layer, thus enabling lidar measurements of thinner and higher clouds. In addition, the measurements suggest a change in cloud morphology at Jungfraujoch above ∼ 13 km, possibly because high particle number densities form in the observed cirrus clouds when many ice crystals nucleate in the high supersaturations following rapid uplifts in lee waves above mountainous terrain. The retrieved optical properties are used as input for a radiative transfer model to estimate the net cloud radiative forcing, CRFNET, for the analyzed cirrus clouds. All cirrus detected here have a positive CRFNET. This confirms that these thin, high cirrus have a warming effect on the Earth's climate, whereas cooling clouds typically have cloud edges too low in altitude to satisfy the FLICA criterion of temperatures below −38 °C. We find CRFNET = 0.9 W m−2 for Jungfraujoch and 1.0 W m−2 (1.7 W m−2) for Zürich (Jülich). Further, we calculate that subvisible cirrus (τ < 0.03) contribute about 5 %, thin cirrus (0.03 < τ < 0.3) about 45 %, and opaque cirrus (0.3 < τ) about 50 % of the total cirrus radiative forcing.
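
The optical-depth classes used above (subvisible, thin, opaque) and the apportioning of the total cirrus forcing among them can be illustrated in a few lines; a minimal Python sketch with hypothetical example clouds, not FLICA output:

    # Classify cirrus layers by optical depth and sum each class's share of the
    # net cloud radiative forcing. The (tau, CRF) pairs below are hypothetical.
    def classify_cirrus(tau):
        # optical-depth bins as used in the abstract
        if tau < 0.03:
            return "subvisible"
        if tau < 0.3:
            return "thin"
        return "opaque"

    clouds = [(0.01, 0.2), (0.08, 0.8), (0.20, 1.1), (0.45, 1.9)]   # (tau, CRF in W m-2)

    totals = {}
    for tau, crf in clouds:
        cls = classify_cirrus(tau)
        totals[cls] = totals.get(cls, 0.0) + crf

    grand_total = sum(totals.values())
    for name, value in totals.items():
        print(f"{name:10s}: {100.0 * value / grand_total:.0f} % of total CRF_NET")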


2016 ◽  
Author(s):  
Erika Kienast-Sjögren ◽  
Christian Rolf ◽  
Patric Seifert ◽  
Ulrich K. Krieger ◽  
Bei P. Luo ◽  
...  

Abstract. Cirrus, i.e., high, thin clouds that are fully glaciated, play an important role in the Earth's radiation budget as they interact with both long- and shortwave radiation and determine the water vapor budget of the upper troposphere and stratosphere. Here, we present a climatology of mid-latitude cirrus clouds measured with the same type of ground-based lidar at three mid-latitude research stations: at the Swiss high alpine Jungfraujoch station (3580 m a.s.l.), in Zürich (Switzerland, 510 m a.s.l.) and in Jülich (Germany, 100 m a.s.l.). The analysis is based on 13 000 hours of measurements from 2010 to 2014. To automatically evaluate this extensive data set, we have developed the "Fast LIdar Cirrus Algorithm" (FLICA), which combines a pixel-based cloud-detection scheme with the classic lidar evaluation techniques. We find mean cirrus optical depths of 0.12 on Jungfraujoch and of 0.14 and 0.17 in Zürich and Jülich, respectively. Above Jungfraujoch, subvisible cirrus clouds (τ < 0.03) have been observed during 7 % of the observation time, whereas significantly fewer were observed above Zürich and Jülich. From Jungfraujoch, clouds with τ < 10−3 can be observed three times more often than over Zürich and Jülich, and clouds with τ < 2 × 10−4 even ten times more often. Above Jungfraujoch, cirrus have been observed up to altitudes of 14.4 km a.s.l., whereas at the other stations only up to about 1 km lower. These features highlight the advantage of the high-altitude station Jungfraujoch, which is often in the free troposphere above the polluted boundary layer, thus allowing lidar measurements of thinner and higher clouds. In addition, the measurements suggest a change in cloud morphology at Jungfraujoch above ∼ 13 km, possibly because high particle number densities form in the observed cirrus clouds when many ice crystals nucleate in the high supersaturations following rapid uplifts in lee waves above mountainous terrain. The retrieved optical properties are used as input for a radiative transfer model to estimate the net cloud radiative forcing, CRFNET, for the analysed cirrus clouds. All cirrus detected here have a positive CRFNET. This confirms that these thin, high cirrus have a warming effect on the Earth's climate, whereas cooling clouds typically have cloud edges too low in altitude to satisfy the FLICA criterion of temperatures below −38 °C. We find CRFNET = 0.9 Wm−2 for Jungfraujoch and 1.0 Wm−2 (1.7 Wm−2) for Zürich (Jülich). Further, we calculate that subvisible cirrus (τ < 0.03) contribute about 5 %, thin cirrus (0.03 < τ < 0.3) about 45 % and opaque cirrus (0.3 < τ) about 50 % of the total cirrus radiative forcing.


2005 ◽  
Vol 5 (1) ◽  
pp. 455-480
Author(s):  
A. Fotiadi ◽  
N. Hatzianastassiou ◽  
C. Matsoukas ◽  
K. G. Pavlakis ◽  
E. Drakakis ◽  
...  

Abstract. A decadal-scale trend in the tropical radiative energy budget has been observed recently by satellites, which, however, is not reproduced by climate models. In the present study, we have computed the outgoing shortwave radiation (OSR) at the top of atmosphere (TOA) at 2.5° longitude-latitude resolution and on a mean monthly basis for the 17-year period 1984–2000, by using a deterministic solar radiative transfer model and cloud climatological data from the International Satellite Cloud Climatology Project (ISCCP) D2 database. Atmospheric temperature and humidity vertical profiles, as well as other supplementary data, were taken from the National Centers for Environmental Prediction – National Center for Atmospheric Research (NCEP/NCAR) and the European Center for Medium-Range Weather Forecasts (ECMWF) Global Reanalysis Projects, while other global databases, such as the Global Aerosol Data Set (GADS) for aerosol data, were also used. Anomaly time series for the mean monthly pixel-level OSR fluxes, as well as for the key physical parameters, were constructed. A significant decreasing trend in OSR anomalies, starting mainly from the late 1980s, was found in tropical and subtropical regions (30° S–30° N), indicating an increase in solar planetary heating equal to 3.2±0.5 Wm−2 over the 17-year time period from 1984 to 2000, or 1.9±0.3 Wm−2/decade, reproducing well the features recorded by satellite observations, in contrast to climate model results. The model-computed trend is in good agreement with the corresponding linear decrease of 3.7±0.5 Wm−2 (or 2.5±0.4 Wm−2/decade) in tropical mean OSR anomalies derived from ERBE S-10N non-scanner data. An attempt was made to identify the physical processes responsible for the decreasing trend in tropical mean OSR. A detailed correlation analysis using pixel-level anomalies of OSR flux and ISCCP cloud cover over the entire tropical and subtropical region (30° S–30° N) gave a correlation coefficient of 0.79, indicating that decreasing cloud cover is the main reason for the tropical OSR trend. According to the ISCCP-D2 data derived from the combined visible/infrared (VIS/IR) analysis, the tropical cloud cover has decreased by 6.6±0.2% per decade, in relative terms. A detailed analysis of the inter-annual and long-term variability of the various parameters determining the OSR at TOA has shown that the most important contribution to the observed OSR trend comes from a decrease in low-level cloud cover over the period 1984–2000, followed by decreases in middle and high-level cloud cover. Opposite but small trends are introduced by increases in the cloud scattering optical depth of low and middle clouds.
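
The core of the analysis above is the construction of deseasonalized monthly anomaly series, a linear trend fit, and a correlation between OSR and cloud-cover anomalies; a minimal Python sketch with hypothetical input series (the real analysis uses pixel-level model output and ISCCP-D2 cloud cover):

    # Deseasonalize monthly series, fit a linear trend, and correlate anomalies.
    # The input series below are synthetic placeholders, not the study's data.
    import numpy as np

    months = 17 * 12                                # Jan 1984 - Dec 2000
    rng = np.random.default_rng(0)
    osr = 101.0 + rng.normal(0.0, 2.0, months)      # hypothetical monthly OSR, W m-2
    cloud = 62.0 + rng.normal(0.0, 3.0, months)     # hypothetical cloud cover, %

    def anomalies(series):
        # remove the mean annual cycle (one mean per calendar month)
        s = series.reshape(-1, 12)
        return (s - s.mean(axis=0)).ravel()

    osr_anom, cloud_anom = anomalies(osr), anomalies(cloud)

    t_decades = np.arange(months) / 120.0           # time axis in decades
    slope = np.polyfit(t_decades, osr_anom, 1)[0]   # W m-2 per decade
    r = np.corrcoef(osr_anom, cloud_anom)[0, 1]
    print(f"OSR trend = {slope:+.2f} W m-2 per decade, r(OSR, cloud cover) = {r:+.2f}")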


2020 ◽  
Vol 77 (2) ◽  
pp. 551-581
Author(s):  
Seung-Hee Ham ◽  
Seiji Kato ◽  
Fred G. Rose

Abstract. Shortwave irradiance biases due to two- and four-stream approximations have been studied for the last couple of decades, but biases in estimating Earth's radiation budget have not been examined in earlier studies. To quantify biases in diurnally averaged irradiances, we integrate the two- and four-stream biases using realistic diurnal variations of cloud properties from the Clouds and the Earth's Radiant Energy System (CERES) synoptic (SYN) hourly product. Three approximations are examined in this study: delta-two-stream-Eddington (D2strEdd), delta-two-stream-quadrature (D2strQuad), and delta-four-stream-quadrature (D4strQuad). Irradiances computed by the Discrete Ordinate Radiative Transfer model (DISORT) and Monte Carlo (MC) methods are used as references. The MC noise is further examined by comparison with DISORT results. When the biases are integrated over one day of solar zenith angle variation, regional biases of D2strEdd and D2strQuad reach up to 8 W m−2, while biases of D4strQuad reach up to 2 W m−2. When the biases are further averaged monthly or annually, regional biases of D2strEdd and D2strQuad can reach −1.5 W m−2 in SW top-of-atmosphere (TOA) upward irradiances and +3 W m−2 in surface downward irradiances. In contrast, regional biases of D4strQuad are within +0.9 W m−2 for TOA irradiances and −1.2 W m−2 for surface irradiances. Except for polar regions, monthly and annual global mean biases are similar, suggesting that the biases are nearly independent of season. Biases in SW heating rate profiles are up to −0.008 K day−1 for D2strEdd and −0.016 K day−1 for D2strQuad, while the biases of the D4strQuad method are negligible.
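
Diurnal averaging of an instantaneous flux bias over one day of solar zenith angle variation can be sketched in a few lines; the bias function below is a hypothetical stand-in for a two-stream-minus-DISORT difference, and the paper instead drives the integration with CERES SYN hourly cloud fields:

    # Average a hypothetical instantaneous irradiance bias over one day of
    # solar zenith angle variation (simple declination formula, hourly steps).
    import numpy as np

    def cos_sza(lat_deg, doy, hour_local):
        # cosine of the solar zenith angle from an approximate solar declination
        decl = np.deg2rad(-23.44) * np.cos(2.0 * np.pi * (doy + 10) / 365.0)
        lat = np.deg2rad(lat_deg)
        hour_angle = np.deg2rad(15.0 * (hour_local - 12.0))
        return np.sin(lat) * np.sin(decl) + np.cos(lat) * np.cos(decl) * np.cos(hour_angle)

    def bias_instantaneous(mu0):
        # hypothetical bias shape (W m-2); zero when the sun is below the horizon
        return 6.0 * mu0 * (0.5 - mu0)

    hours = np.arange(0.0, 24.0, 1.0)
    mu0 = np.clip(cos_sza(30.0, 172, hours), 0.0, None)   # 30 deg N, near the solstice
    diurnal_bias = bias_instantaneous(mu0).mean()          # 24-hour average
    print(f"diurnally averaged bias ~ {diurnal_bias:+.2f} W m-2")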


2020 ◽  
Vol 12 (16) ◽  
pp. 2548
Author(s):  
Manting Zhang ◽  
Shiwen Teng ◽  
Di Di ◽  
Xiuqing Hu ◽  
Husi Letu ◽  
...  

Ice clouds play an important role in the Earth's radiation budget, while their microphysical and optical properties remain one of the major uncertainties in remote sensing and atmospheric studies. Many satellite-based multi-spectral, -angle and -polarization instruments have been launched in recent years, and it is unclear how these observations can be used to improve the understanding of ice cloud properties. This study discusses the impacts of multi-spectral, -angle and -polarization observations on ice cloud property retrievals by performing a theoretical information content (IC) analysis. Ice cloud properties, including the cloud optical thickness (COT), particle effective radius (Re) and particle habit (defined by the aspect ratio (AR) and the degree of surface roughness (σ)), are considered. An accurate polarized radiative transfer model is used to simulate the top-of-atmosphere intensity and polarized observations at the cloud-detecting wavelengths of interest. The ice cloud property retrieval accuracy should improve with the additional information from multi-spectral, -angle and -polarization observations, which is verified by the increased degrees of freedom for signal (DFS). Polarization observations at spectral wavelengths (i.e., 0.87 and 2.13 µm) are helpful in improving ice cloud property retrievals, especially for small-sized particles. An optimal scheme for retrieving ice cloud properties combines radiance intensity information at the 0.87, 1.24, 1.64 and 2.13 µm channels with polarization information (the degree of linear polarization, DOLP) at the 0.87 and 2.13 µm channels. As observations from multiple angles are added, the DFS clearly increases, but it becomes almost saturated when the number of angles reaches three. In addition, the retrieval of Re exhibits larger uncertainties, and the improvement in total DFS obtained by adding multi-spectral, -angle and -polarization observations is mainly attributed to the improvement of the Re retrieval. Our findings will benefit future instrument design and the improvement of cloud property retrieval algorithms based on multi-spectral, -angle, and -polarization imagers.
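
The degrees of freedom for signal used above can be computed from the averaging kernel of a linearized optimal-estimation retrieval, DFS = trace(A) with A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K; a minimal Python sketch with a hypothetical Jacobian and covariances, not the study's actual retrieval setup:

    # DFS from the averaging kernel of a linear optimal-estimation retrieval.
    # Jacobian and covariance matrices below are hypothetical placeholders.
    import numpy as np

    n_obs, n_state = 6, 4                            # e.g. 6 channels/angles, 4 cloud parameters
    rng = np.random.default_rng(1)
    K = rng.normal(size=(n_obs, n_state))            # Jacobian d(observation)/d(state)
    Se_inv = np.diag(np.full(n_obs, 1.0 / 0.01))     # inverse observation error covariance
    Sa_inv = np.diag(np.full(n_state, 1.0))          # inverse a priori covariance

    G = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv) @ K.T @ Se_inv   # gain matrix
    A = G @ K                                        # averaging kernel
    print(f"total DFS = {np.trace(A):.2f}, per-parameter DFS = {np.diag(A).round(2)}")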


2011 ◽  
Vol 11 (11) ◽  
pp. 30009-30051 ◽  
Author(s):  
C. D. Papadimas ◽  
N. Hatzianastassiou ◽  
C. Matsoukas ◽  
M. Kanakidou ◽  
N. Mihalopoulos ◽  
...  

Abstract. For the first time, the direct radiative effect (DRE) of aerosols on solar radiation is computed over the entire Mediterranean basin, one of the most climatically sensitive regions of the world, by using a deterministic spectral radiative transfer model (RTM). The DRE effects on the outgoing shortwave radiation at the top of atmosphere (TOA), DRETOA, on the absorption of solar radiation in the atmospheric column, DREatm, and on the downward and absorbed surface solar radiation (SSR), DREsurf and DREnetsurf, respectively, are computed separately. The model uses input data for the period 2000–2007 for various surface and atmospheric parameters, taken from satellite data (International Satellite Cloud Climatology Project, ISCCP-D2), Global Reanalysis projects (National Centers for Environmental Prediction – National Center for Atmospheric Research, NCEP/NCAR), and other global databases. The spectral aerosol optical properties (aerosol optical depth, AOD, asymmetry parameter, gaer, and single scattering albedo, ωaer) are taken from the MODerate resolution Imaging Spectroradiometer (MODIS) of NASA (National Aeronautics and Space Administration) and are supplemented by the Global Aerosol Data Set (GADS). The model SSR fluxes have been successfully validated against measurements from 80 surface stations of the Global Energy Balance Archive (GEBA) covering the period 2000–2007. A planetary cooling is found above the Mediterranean on an annual basis (regional mean DRETOA = −2.4 Wm−2). Though planetary cooling of up to −7 Wm−2 is found over most of the region, large positive DRETOA values are found over North Africa (up to +25 Wm−2), indicating a strong planetary warming, as well as over the Alps (+0.5 Wm−2). Aerosols are found to increase the absorption of solar radiation in the atmospheric column over the region (DREatm = +11.1 Wm−2) and to decrease SSR (DREsurf = −16.5 Wm−2 and DREnetsurf = −13.5 Wm−2), thus inducing significant atmospheric warming and surface radiative cooling. The calculated seasonal and monthly DREs are even larger, reaching −25.4 Wm−2 (for DREsurf). Sensitivity tests show that the regional DREs are most sensitive to ωaer and secondarily to AOD, showing a quasi-linear dependence on these aerosol parameters.
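
Direct radiative effects of this kind are formed as differences between paired radiative transfer runs with and without aerosols; a minimal Python sketch with hypothetical absolute fluxes (chosen only so that their differences match the regional means quoted above) and an assumed sign convention in which negative DRETOA means planetary cooling:

    # Form aerosol direct radiative effects from paired with/without-aerosol runs.
    # The absolute fluxes are hypothetical; only their differences are meaningful here.
    def direct_radiative_effects(with_aer, without_aer):
        return {
            # more reflected SW with aerosol -> negative DRE_TOA (planetary cooling)
            "DRE_TOA":     -(with_aer["osr_toa"] - without_aer["osr_toa"]),
            "DRE_atm":      with_aer["atm_abs"]  - without_aer["atm_abs"],
            "DRE_surf":     with_aer["dsr_surf"] - without_aer["dsr_surf"],
            "DRE_netsurf":  with_aer["net_surf"] - without_aer["net_surf"],
        }

    with_aer    = {"osr_toa": 103.0, "atm_abs": 78.0, "dsr_surf": 178.0, "net_surf": 158.0}
    without_aer = {"osr_toa": 100.6, "atm_abs": 66.9, "dsr_surf": 194.5, "net_surf": 171.5}

    for name, value in direct_radiative_effects(with_aer, without_aer).items():
        print(f"{name:12s}: {value:+.1f} W m-2")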


2018 ◽  
Vol 37 ◽  
pp. 131-145
Author(s):  
Md Mijanur Rahman ◽  
Md Abdus Samad ◽  
SM Quamrul Hassan

An attempt has been made to simulate the thermodynamic features of the thunderstorm (TS) event over Dhaka (23.81°N, 90.41°E) that occurred from 1300 UTC to 1320 UTC on 4 April 2015, using the Advanced Research dynamics solver of the Weather Research and Forecasting model (WRF-ARW). The model was run for 48 hours on a single domain of 5 km horizontal resolution, utilizing six-hourly Global Final Analysis (FNL) datasets from 0600 UTC of 3 April 2015 to 0600 UTC of 5 April 2015 as initial and lateral boundary conditions. The Kessler scheme for microphysics, the Yonsei University (YSU) scheme for planetary boundary layer (PBL) parametrization, the Revised MM5 scheme for surface layer physics, the Rapid Radiative Transfer Model (RRTM) for longwave radiation, the Dudhia scheme for shortwave radiation and the Kain–Fritsch (KF) scheme for cumulus parameterization were used. Hourly outputs produced by the model have been analyzed numerically and graphically using the Grid Analysis and Display System (GrADS). Detailed analyses were carried out by examining several thermodynamic parameters such as mean sea level pressure (MSLP), wind pattern, vertical wind shear, vorticity, temperature, convective available potential energy (CAPE), relative humidity (RH) and rainfall. To validate the model performance, simulated values of MSLP, maximum and minimum temperature and RH were compared with observational data obtained from the Bangladesh Meteorological Department (BMD). Rainfall values were compared with those of BMD and the Tropical Rainfall Measuring Mission (TRMM) of the National Aeronautics and Space Administration (NASA). Based on these comparisons and validations, the present study finds that the model captured the TS event reasonably well.
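
One of the thermodynamic parameters listed above, CAPE, is obtained by integrating parcel buoyancy over the positively buoyant part of a sounding; a minimal Python sketch with a hypothetical sounding, not BMD or model data:

    # CAPE as the vertical integral of positive parcel buoyancy (trapezoidal rule).
    # The height and virtual temperature profiles below are hypothetical.
    import numpy as np

    g = 9.81                                                      # m s-2
    z = np.array([1000., 2000., 4000., 6000., 8000., 10000.])     # height, m
    tv_env    = np.array([293., 287., 275., 262., 248., 233.])    # environment virtual T, K
    tv_parcel = np.array([294., 289., 278., 265., 250., 233.])    # lifted parcel virtual T, K

    buoyancy = g * (tv_parcel - tv_env) / tv_env                  # m s-2
    positive = np.clip(buoyancy, 0.0, None)                       # keep positively buoyant layers
    cape = np.trapz(positive, z)                                  # J kg-1
    print(f"CAPE ~ {cape:.0f} J kg-1")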

