Long-term ZTD and ZWD series and climate normals using NCEP1

Author(s):  
Marcelo C. Santos ◽  
Marlon Moura ◽  
Thalia Nikolaidou ◽  
Kyriakos Balidakis

The World Meteorological Organization (WMO) recommends the use of climate normals for analysing variations and trends in meteorological parameters and as input to predictive climate models. The suggested period is 30 years, but shorter periods can also be employed. We computed zenith total delay (ZTD) and zenith wet delay (ZWD) series for each grid node of the NCEP1 numerical weather model, starting in 1948. We computed climate normals of those two parameters using periods of 1, 5, 10, 15, 20 and 30 years, with and without the annual signature. To assess the impact of window size, we examined the variation and correlation of trends derived from the various solutions. Results show the expected stronger smoothing with larger windows and a decreasing impact of the annual signature. Regions with positive trends appear to be concentrated over continental masses and along the equator, while the most significant negative trends occur over the oceans. The increase in ZTD is caused primarily by an increase in ZWD; since ZWD reflects atmospheric water vapor, such an increase points to a probable increase in the amount of water vapor in the atmosphere. Comparisons with trends computed from GNSS-derived ZTD and ZWD series are included, with the caveat that the time period for such comparisons is necessarily shorter.
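
As an illustration of the windowing described above, the sketch below computes climate normals of a zenith delay series for several window lengths, optionally removing the annual signature first, and fits a linear trend to each set of normals. The series is synthetic and the function names are ours; this is a minimal sketch of the procedure, not the authors' processing chain.

```python
import numpy as np

def climate_normals(t, z, window_years, remove_annual=True):
    """Mean ('normal') of z over consecutive windows; t in decimal years."""
    if remove_annual:
        # Fit and subtract an annual harmonic before averaging.
        A = np.column_stack([np.ones_like(t),
                             np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
        coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
        z = z - A[:, 1:] @ coeffs[1:]
    edges = np.arange(t[0], t[-1] + window_years, window_years)
    return np.array([z[(t >= a) & (t < b)].mean()
                     for a, b in zip(edges[:-1], edges[1:])])

# Illustrative monthly ZWD series, 1948 onward (synthetic, not NCEP1 data).
rng = np.random.default_rng(0)
t = 1948 + np.arange(12 * 70) / 12.0
zwd = 150 + 0.05 * (t - 1948) + 30 * np.cos(2 * np.pi * t) \
      + rng.standard_normal(t.size)

for w in (1, 5, 10, 15, 20, 30):
    normals = climate_normals(t, zwd, w)
    trend = np.polyfit(np.arange(normals.size) * w, normals, 1)[0]
    print(f"{w:2d}-yr normals: trend = {trend:.3f} mm/yr")
```

Larger windows average out more of the interannual variability, which is why the trends derived from the 20- and 30-year normals are the smoothest.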

2017 ◽  
Vol 17 (13) ◽  
pp. 8031-8044 ◽  
Author(s):  
Kevin M. Smalley ◽  
Andrew E. Dessler ◽  
Slimane Bekki ◽  
Makoto Deushi ◽  
Marion Marchand ◽  
...  

Abstract. Variations in tropical lower-stratospheric humidity influence both the chemistry and climate of the atmosphere. We analyze tropical lower-stratospheric water vapor in 21st century simulations from 12 state-of-the-art chemistry–climate models (CCMs), using a linear regression model to determine the factors driving the trends and variability. Within CCMs, warming of the troposphere primarily drives the long-term trend in stratospheric humidity. This is partially offset in most CCMs by an increase in the strength of the Brewer–Dobson circulation, which tends to cool the tropical tropopause layer (TTL). We also apply the regression model to individual decades from the 21st century CCM runs and compare them to a regression of a decade of observations. Many of the CCMs, but not all, compare well with these observations, lending credibility to their predictions. One notable deficiency is that most CCMs underestimate the impact of the quasi-biennial oscillation on lower-stratospheric water vapor. Our analysis provides a new and potentially superior way to evaluate model trends in lower-stratospheric humidity.
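
The attribution step described above amounts to a multiple linear regression of water vapour anomalies on candidate drivers. The sketch below shows the idea with synthetic data; the regressor names (tropospheric temperature, Brewer–Dobson circulation index, QBO index) follow the abstract, but the coefficients and data are illustrative.

```python
import numpy as np

n = 1200  # monthly samples
rng = np.random.default_rng(1)
delta_T = rng.standard_normal(n)   # tropospheric temperature anomaly
bdc     = rng.standard_normal(n)   # Brewer-Dobson circulation index
qbo     = rng.standard_normal(n)   # QBO index
# Synthetic water vapour anomaly: warming moistens, a stronger BDC dries.
h2o = 0.8 * delta_T - 0.4 * bdc + 0.2 * qbo + 0.3 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), delta_T, bdc, qbo])
beta, *_ = np.linalg.lstsq(X, h2o, rcond=None)
for name, b in zip(["intercept", "delta_T", "BDC", "QBO"], beta):
    print(f"{name:9s}: {b:+.3f}")
```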


Author(s):  
Takrima Sayeda

The purpose of this paper is to examine whether any relationship exists between the free-floating exchange rate and the export performance of Bangladesh. It inspects monthly data on the exchange rate and export value for the period between 2000 and 2017, utilizing the Johansen [1] cointegration approach to identify the extent of the long-run and short-run relationships between them. The study could establish neither a long-term trend nor any short-term dynamics between the variables. Each variable is significantly related to its own immediate past values, while more distant past values have no implications. This study suggests that short-run macroeconomic policy would be the more effective means of influencing the foreign exchange market and, eventually, the export performance of Bangladesh.
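
For readers unfamiliar with the method, the sketch below runs a Johansen cointegration test on two monthly series using statsmodels. The variable names mirror the study (exchange rate, export value), but the data are synthetic random walks, not the Bangladesh dataset.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(2)
n = 216  # monthly observations, 2000-2017
exch = np.cumsum(rng.standard_normal(n))   # random walk (I(1) series)
expo = np.cumsum(rng.standard_normal(n))   # independent random walk
data = pd.DataFrame({"exchange_rate": exch, "export_value": expo})

# det_order=0: constant term; k_ar_diff=1: one lagged difference.
result = coint_johansen(data, det_order=0, k_ar_diff=1)
for r, (stat, cv) in enumerate(zip(result.lr1, result.cvt[:, 1])):
    # Trace statistic below the 5% critical value -> cannot reject rank <= r,
    # i.e. no evidence of cointegration at that rank.
    print(f"rank <= {r}: trace stat = {stat:.2f}, 5% critical = {cv:.2f}")
```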


2016 ◽  
Vol 9 (9) ◽  
pp. 4861-4877 ◽  
Author(s):  
Zofia Baldysz ◽  
Grzegorz Nykiel ◽  
Andrzej Araszkiewicz ◽  
Mariusz Figurski ◽  
Karolina Szafranek

Abstract. The main purpose of this research was to acquire information about the consistency of ZTD (zenith total delay) linear trends and seasonal components between two consecutive GPS reprocessing campaigns. The analysis concerned two sets of ZTD time series estimated during EUREF (Reference Frame Sub-Commission for Europe) EPN (Permanent Network) reprocessing campaigns according to the 2008 and 2015 MUT AC (Military University of Technology Analysis Centre) scenarios. Firstly, Lomb–Scargle periodograms were generated for 57 EPN stations to characterise the oscillations occurring in the ZTD time series. Then, the values of the seasonal components and linear trends were estimated using the LSE (least squares estimation) approach. The Mann–Kendall trend test was also carried out to verify the presence of linear long-term ZTD changes. Finally, differences in seasonal signals and linear trends between the two data sets were investigated. All these analyses were conducted for ZTD time series of two lengths: a shortened 16-year series and a full 18-year one. In the case of the spectral analysis, the amplitudes of the annual and semi-annual periods were almost exactly the same for both reprocessing campaigns; exceptions were found for only a few stations and did not exceed 1 mm. The estimated trends were also similar, although for the reprocessing performed in 2008 the trend values were usually higher. In general, shortening the analysed time period by 2 years resulted in a decrease in the linear trend values of about 0.07 mm yr−1. This was confirmed by analyses based on both data sets.
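
A minimal sketch of the estimation chain described above, assuming daily ZTD values: a least-squares fit of an offset, a linear trend, and annual and semi-annual harmonics, followed by a simple Mann–Kendall-style trend test (approximated here via Kendall's tau of annual means against time; a dedicated MK implementation would refine this). All data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
t = np.arange(0, 18, 1 / 365.25)          # 18 years of daily epochs, in years
ztd = 2400 + 0.1 * t + 8 * np.cos(2 * np.pi * t) \
      + 3 * np.cos(4 * np.pi * t) + 5 * rng.standard_normal(t.size)   # mm

# LSE model: offset, trend, annual and semi-annual harmonics.
A = np.column_stack([np.ones_like(t), t,
                     np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),
                     np.cos(4 * np.pi * t), np.sin(4 * np.pi * t)])
x, *_ = np.linalg.lstsq(A, ztd, rcond=None)
print(f"trend = {x[1]:.3f} mm/yr, annual amp = {np.hypot(x[2], x[3]):.2f} mm")

# Mann-Kendall-style test on annual means (reduces autocorrelation).
annual_means = ztd[:18 * 365].reshape(18, 365).mean(axis=1)
tau, p = stats.kendalltau(np.arange(18), annual_means)
print(f"trend test: tau = {tau:.2f}, p = {p:.3f}")
```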


2019 ◽  
Vol 10 (2) ◽  
pp. 333-345 ◽  
Author(s):  
Lennert B. Stap ◽  
Peter Köhler ◽  
Gerrit Lohmann

Abstract. The equilibrium climate sensitivity (ECS) of climate models is calculated as the equilibrium global mean surface air warming resulting from a simulated doubling of the atmospheric CO2 concentration. In these simulations, long-term processes in the climate system, such as land ice changes, are not incorporated. Hence, climate sensitivity derived from paleodata has to be corrected for these processes when comparing it to the ECS of climate models. Several recent studies found that the impact these long-term processes have on global temperature cannot be quantified directly through the global radiative forcing they induce. This renders the prevailing approach of deconvoluting paleotemperatures through a partitioning based on radiative forcings inaccurate. Here, we therefore implement an efficacy factor ε[LI] that relates the impact of land ice changes on global temperature to that of CO2 changes in our calculation of climate sensitivity from paleodata. We apply our refined approach to a proxy-inferred paleoclimate dataset, using ε[LI] = 0.45 (+0.34/−0.20) based on a multi-model assemblage of simulated relative influences of land ice changes on the Last Glacial Maximum temperature anomaly. The implemented ε[LI] is smaller than unity, meaning that per unit of radiative forcing, the impact on global temperature is less strong for land ice changes than for CO2 changes. Consequently, our obtained ECS estimate of 5.8 ± 1.3 K, where the uncertainty reflects the implemented range in ε[LI], is ∼50 % higher than when differences in efficacy are not considered.
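
A minimal sketch of an efficacy-weighted partitioning along these lines, under the assumption that the CO2-attributable share of the temperature anomaly scales as ΔR_CO2 / (ΔR_CO2 + ε[LI]·ΔR_LI); this formula and the forcing values below are illustrative reconstructions, not the paper's exact method or proxy-inferred dataset.

```python
def ecs_from_paleo(dT, dR_co2, dR_li, eps_li):
    """
    Attribute the global temperature anomaly dT (K) to CO2 via an
    efficacy-weighted partitioning of the forcings (W m^-2), then
    scale to a CO2 doubling (~3.7 W m^-2).
    """
    dT_co2 = dT * dR_co2 / (dR_co2 + eps_li * dR_li)
    return 3.7 * dT_co2 / dR_co2

# LGM-like example numbers (illustrative): 5 K cooling, CO2 and land-ice
# forcings of -2.8 and -3.2 W m^-2, efficacy 0.45 (+0.34 / -0.20).
for eps in (0.25, 0.45, 0.79):
    print(f"eps_LI = {eps:.2f}: ECS ~ {ecs_from_paleo(-5.0, -2.8, -3.2, eps):.1f} K")
```

Note the direction of the effect: an efficacy below unity shrinks the land-ice term in the denominator, attributes more of the cooling to CO2, and therefore raises the inferred ECS, consistent with the abstract.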


2019 ◽  
Vol 12 (9) ◽  
pp. 5087-5099 ◽  
Author(s):  
Jonathan K. P. Shonk ◽  
Jui-Yuan Christine Chiu ◽  
Alexander Marshak ◽  
David M. Giles ◽  
Chiung-Huei Huang ◽  
...  

Abstract. Clouds present many challenges to climate modelling. To develop and verify the parameterisations needed to allow climate models to represent cloud structure and processes, there is a need for high-quality observations of cloud optical depth from locations around the world. Retrievals of cloud optical depth are obtainable from radiances measured by Aerosol Robotic Network (AERONET) radiometers in "cloud mode" using a two-wavelength retrieval method. However, the method is unable to detect cloud phase and hence assumes that all of the cloud in a profile is liquid. This assumption has the potential to introduce errors into long-term statistics of retrieved optical depth for clouds that also contain ice. Using a set of idealised cloud profiles we find that, for optical depths above 20, the fractional error in retrieved optical depth is a linear function of the fraction of the optical depth that is due to the presence of ice cloud ("ice fraction"). Clouds that are entirely ice have positive errors of the order of 55 % to 70 %. We derive a simple linear equation that can be used as a correction at AERONET sites where ice fraction can be independently estimated. Using this linear equation, we estimate the magnitude of the error for a set of cloud profiles from five sites of the Atmospheric Radiation Measurement programme. The dataset contains separate retrievals of ice and liquid cloud; hence ice fraction can be estimated. The magnitude of the error at each location was related to the relative frequency of occurrence of thick frontal cloud at the mid-latitude sites and of deep convection at the tropical sites – that is, of deep cloud containing both ice and liquid particles. The long-term mean optical depth error at the five locations spans the range 2–4, which we show to be small enough to allow calculation of top-of-atmosphere flux to within 10 % and surface flux to about 15 %.
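
A minimal sketch of how such a linear correction could be applied at a site where ice fraction is independently estimated; the slope and intercept here are illustrative placeholders broadly consistent with the quoted all-ice error range, not the fitted coefficients from the paper.

```python
def corrected_optical_depth(tau_retrieved, ice_fraction,
                            slope=0.6, intercept=0.0):
    """Remove a fractional error that grows linearly with ice fraction.

    slope/intercept are assumed placeholders: an all-ice cloud
    (ice_fraction=1) then carries a +60 % fractional error.
    """
    fractional_error = slope * ice_fraction + intercept
    return tau_retrieved / (1.0 + fractional_error)

# Example: a retrieval of 30 for a cloud whose optical depth is half ice.
print(corrected_optical_depth(30.0, ice_fraction=0.5))  # ~23.1
```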


2020 ◽  
Author(s):  
Franz-Josef Lübken ◽  
Gerd Baumgarten

Some of the earliest observations in the transition region between the Earth's atmosphere and space (roughly at 80-120 km) come from so-called 'noctilucent clouds' (NLC), which are located around 83 km altitude and consist of water ice particles. They owe their existence to the very cold summer mesopause region (~130 K) at mid and high latitudes. There is a long-standing dispute over whether NLC are indicators of climate change in the middle atmosphere. We use model simulations of the background atmosphere and of ice particle formation over a time period of 138 years to show that an increase in NLC appearance is expected for recent decades, due to the increased anthropogenic release of methane, which is oxidized to water vapor in the middle atmosphere. Since the beginning of industrialization, the water vapor concentration at NLC heights has presumably increased by about 40 percent (1 ppmv). The water vapor increase leads to a large enhancement of NLC brightness. Increased cooling by enhanced carbon dioxide alone (assuming no water vapor increase) would, counter-intuitively, lead to a decrease of NLC brightness. NLC have presumably existed for centuries, but the chance of observing them with the naked eye was very small before the 20th century, whereas sighting an NLC is likely in the modern era. The eruption of the Krakatoa volcano in 1883 seemingly triggered the first observation of an NLC in 1885. In this presentation we extend our analysis from middle to polar latitudes and expand the comparison with observations.


2020 ◽  
Author(s):  
Francois Costard ◽  
José Alexis Palmero Rodriguez ◽  
Antoine Séjourné ◽  
Anthony Lagain ◽  
Steve Clifford ◽  
...  

The duration and timing of a northern ocean is a key issue in understanding the past geological and climatic evolution of Mars. Mars experienced its greatest loss of H2O between the Noachian and Late Hesperian (~10 m Global Equivalent Layer, Jakosky et al., 2017), roughly the same amount that is thought to have been added to the global inventory by extrusive volcanism over the same time period (Carr and Head, 2015). Thus, the total inventory of water was probably similar during these two epochs. However, the Late Hesperian ocean was smaller in extent than the Noachian one, with significant implications for the potential origin and survival of life. Here we examine the implications of the existence of a Late Hesperian/Early Amazonian ocean for the planet's inventory of water (and especially liquid water) and its variation with time. Our previous work (Rodriguez et al., 2016; Costard et al., 2017) concluded that the most plausible explanation for the origin of the Thumbprint Terrain (TT) lobate deposits, with run-ups, found along the dichotomy boundary, especially in Arabia Terra, is that they are tsunami deposits. This supports the hypothesis that an ocean occupied the northern plains of Mars as recently as ~3 billion years ago. Furthermore, Costard et al. (2017) produced a tsunami numerical model showing that the TT deposits exhibit fine-scale textural patterns due to wave interference resulting from interactions with the coastal topography. More recently, we suggested that the unusual characteristics of Lomonosov crater (50.52° N, 16.39° E) in the northern plains are best explained by the presence of a shallow ocean at the time of the impact (Costard et al., 2019). Interestingly, the apparent agreement between the age of the Lomonosov impact and that of the TT unit (~3 Ga) strongly suggests that this impact was the source of the tsunami (Costard et al., 2019). Our preliminary assessment indicates that this impact-generated tsunami required a mostly liquid ocean, and because of the high-latitude location of the Lomonosov crater site, our results strongly imply relatively warm paleoclimatic conditions. Our conclusions highlight the need for more sophisticated climate models.


2012 ◽  
Vol 263-266 ◽  
pp. 125-130
Author(s):  
Yan Ping Wang

Short-term load forecasting is one of the most important routine tasks for power dispatch departments. The accuracy of load forecasting has a direct effect on the safety, economy and stability of power system operation. Longitudinal (same period across past days) and transverse (adjacent periods) comparisons are employed to identify and correct bad load data, while wavelet analysis and multiple-time-period analysis are used to remove the long-term growth component, thus reducing the impact of rapid load growth on the accuracy of load forecasting.
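
A minimal sketch of the wavelet step, assuming PyWavelets is available: decompose the load series, keep only the coarsest approximation as the long-term growth component, and subtract it before fitting a short-term model. The load series is synthetic.

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
hours = np.arange(24 * 28)                         # four weeks, hourly
load = (500 + 0.05 * hours                         # long-term growth
        + 80 * np.sin(2 * np.pi * hours / 24)      # daily cycle
        + 10 * rng.standard_normal(hours.size))    # noise, MW

coeffs = pywt.wavedec(load, "db4", level=5)
# Zero all detail coefficients -> reconstruct only the smooth trend.
trend_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(trend_coeffs, "db4")[:load.size]
detrended = load - trend       # input for the short-term forecasting model
print(f"trend range: {trend.min():.0f}-{trend.max():.0f} MW")
```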


2021 ◽  
Author(s):  
Marina Martinez-Garcia ◽  
Alejandro Rabasa ◽  
Xavier Barber ◽  
Kristina Polotskaya ◽  
Kristof Roomp ◽  
...  

Population confinements have been one of the most widely adopted non-pharmaceutical interventions (NPIs) implemented by governments across the globe to help contain the spread of the SARS-CoV-2 virus. While confinement measures have proven effective in reducing the number of infections, they entail significant economic and social costs. Thus, different policy makers and social groups have exhibited varying levels of acceptance of this type of measure. In this context, understanding the factors that determine the willingness of individuals to be confined during a pandemic is of paramount importance, particularly to policy- and decision-makers. In this paper, we study the factors that influence the unwillingness to be confined during the COVID-19 pandemic by means of a large-scale online population survey deployed in Spain. We apply both quantitative (logistic regression) and qualitative (automatic pattern discovery) methods and consider socio-demographic, economic and psychological factors, together with the 14-day cumulative incidence per 100,000 inhabitants. Our analysis of 109,515 answers to the survey covers a 5-month time period, shedding light on the impact of the passage of time. We find evidence of pandemic fatigue, as the percentage of those who report an unwillingness to be in confinement increases over time; we identify significant gender differences, with women being generally less likely than men to be able to sustain a long-term confinement of at least 6 months; and we uncover that the psychological impact was the most important factor determining the willingness to be in confinement at the beginning of the pandemic, replaced by the economic impact as the most important variable towards the end of our period of study. Our results highlight the need to design gender- and age-specific public policies, to implement psychological and economic support programs, and to address the evident pandemic fatigue, as the success of potential future confinements will depend on the population's willingness to comply with them.
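
A minimal sketch of the quantitative side of such an analysis: a logistic regression of unwillingness to be confined on the factor groups named above. The feature names follow the abstract, but the data are synthetic, not the survey responses.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 10_000
X = np.column_stack([
    rng.integers(0, 2, n),        # gender (1 = female)
    rng.integers(18, 80, n),      # age
    rng.standard_normal(n),       # economic-impact score
    rng.standard_normal(n),       # psychological-impact score
    rng.uniform(0, 500, n),       # 14-day incidence per 100,000
])
# Synthetic outcome: unwillingness driven by psychological and economic
# impact, with a gender effect (illustrative coefficients only).
logit = -1 + 0.5 * X[:, 3] + 0.4 * X[:, 2] - 0.3 * X[:, 0]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(["female", "age", "economic", "psychological",
                       "incidence"], model.coef_[0]):
    print(f"{name:13s}: {coef:+.3f}")
```

Fitting the same model on successive monthly subsets of the data would reveal the shift in the dominant coefficient over time that the abstract reports.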


2016 ◽  
Vol 12 (2) ◽  
pp. 28-33
Author(s):  
M. Eugenia Pérez-Pons ◽  
Alfonso González-Briones ◽  
Juan M. Corchado

The following work presents a methodology for determining the economic value of the data owned by a company in a given time period. The ability to determine the value of data at any point of its lifecycle would make it possible to study the added value that data give a company in the long term. Not only should external data be considered, but also the impact that internal data can have on company revenues. The project focuses on data-driven companies, which are different from data-oriented ones, as explained below. Since some studies affirm that data-driven companies are more profitable, the indirect costs of using those data must be allocated somewhere in order to understand their financial value [14] and to present a possible alternative for measuring the financial impact of data on the revenue of companies.

