Planetary waves in ozone and temperature in the Northern Hemisphere winters of 2002/2003 and early 2005

2009 ◽  
Vol 27 (3) ◽  
pp. 1189-1206 ◽  
Author(s):  
A. Belova ◽  
S. Kirkwood ◽  
D. Murtagh

Abstract. Temperature and ozone data from the sub-millimetre radiometer (SMR) aboard the Odin satellite have been examined to study the relationship between temperature and ozone concentration in the lower and upper stratosphere in wintertime. The retrieved ozone and temperature profiles have been considered over the range 24–46 km during the Northern Hemisphere (NH) winters of December 2002 to March 2003 and January to March 2005. A comparison between the ozone mixing ratio and temperature fields has been made for the zonal means, wavenumber-one variations and 5-day planetary waves. The amplitudes of the temperature variations are ~5 K in wavenumber one and 0.5–1 K in the 5-day wave. In the ozone mixing ratio, the amplitudes reach ~0.5 ppmv in wavenumber one and 0.05–0.1 ppmv in the 5-day wave. Several stratospheric warming events were observed during the NH winters of 2002/2003 and early 2005. Along with these warming events, amplification has been detected in wavenumber one (up to 30 K in temperature and 1.25 ppmv in ozone) and partly in the 5-day perturbation (up to 2 K in temperature and 0.2 ppmv in ozone). In general, the results show the expected in-phase behavior between the temperature and ozone fields in the lower stratosphere due to dynamical effects, and an out-of-phase pattern in the upper stratosphere, which is expected as a result of photochemical effects. However, these relationships do not hold for the zonal means and wavenumber-one components when the wave amplitudes change dramatically during the strongest stratospheric warming event (at the end of December 2002/beginning of January 2003). Also, for several shorter intervals, the 5-day perturbations in ozone and temperature are not well correlated at lower heights, particularly when conditions change rapidly.
Odin's basic observation schedule provides stratosphere-mode data every third day, and to validate the reliability of the 5-day waves extracted from the Odin measurements, additional independent data have been analysed in this study: temperature assimilation data from the European Centre for Medium-Range Weather Forecasts (ECMWF) for the NH winter of 2002/2003, and satellite measurements of temperature and ozone by the Microwave Limb Sounder (MLS) on board the Aura satellite for the NH winter in early 2005. Good agreement between the temperature fields from the Odin and ECMWF data is found at middle latitudes, where, in general, the 5-day perturbations from the two data sets coincide in both phase and amplitude throughout the examined interval. Analysis of the wavenumber-one and 5-day wave perturbations in the temperature and ozone fields from Odin and from Aura demonstrates that, for the largest part of the examined period, quite similar characteristics are found in the spatial and temporal domains, with slightly larger amplitudes seen by Aura. Hence, the comparison between the Odin data, sampled every third day, and the daily data from Aura and ECMWF shows that the Odin data are sufficiently reliable to estimate the properties of the 5-day oscillations, at least for locations and time intervals with strong wave activity.
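The zonal-mean and wavenumber-one decomposition used in this kind of analysis can be sketched with a discrete Fourier transform around a latitude circle. A minimal illustration with synthetic data (the values below are hypothetical, not from the Odin retrievals):

```python
import numpy as np

# Synthetic field: zonal mean of 220 K plus a 5 K wavenumber-one wave
# sampled at 36 equally spaced longitudes around a latitude circle.
lons = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
temps = 220.0 + 5.0 * np.cos(lons - 1.0)  # crest at 1 rad east

coeffs = np.fft.rfft(temps)
n = temps.size
zonal_mean = coeffs[0].real / n        # wavenumber 0: the zonal mean
amp_w1 = 2.0 * np.abs(coeffs[1]) / n   # wavenumber-one amplitude
phase_w1 = -np.angle(coeffs[1])        # longitude (rad) of the wave crest

print(zonal_mean, amp_w1, phase_w1)    # recovers 220 K, 5 K, 1 rad
```

Tracking `amp_w1` day by day, and band-pass filtering the resulting time series near a 5-day period, is the standard way such planetary-wave amplitudes are isolated.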

2021 ◽  
Author(s):  
Elisabeth Blanc ◽  
Patrick Hupe ◽  
Bernd Kaifler ◽  
Natalie Kaifler ◽  
Alexis Le Pichon ◽  
...  

<p>The uncertainties in the infrasound technology arise from middle atmospheric disturbances, which are partly underrepresented in atmospheric models such as the European Centre for Medium-Range Weather Forecasts (ECMWF) products used for infrasound propagation simulations. In the framework of the ARISE (Atmospheric dynamics Research InfraStructure in Europe) project, multi-instrument observations are performed to provide new data sets for model improvement and future assimilations. Unexpectedly, new observations using the autonomous CORAL lidar showed significant differences between ECMWF analysis fields and observations in Argentina in the period range between 0.1 and 10 days. The model underestimates the wave activity, especially in summer. During the same season, the infrasound bulletins of the IS02 station in Argentina indicate two prevailing detection directions, which are not reflected by the simulations. Observations at the Haute Provence Observatory (OHP) are used for comparison under different geophysical conditions. The origins of the observed anomalies are discussed in terms of planetary wave effects on infrasound propagation.</p>


2012 ◽  
Vol 8 (3) ◽  
pp. 2409-2444 ◽  
Author(s):  
O. Bothe ◽  
J. H. Jungclaus ◽  
D. Zanchettin ◽  
E. Zorita

Abstract. Are simulations and reconstructions of past climate and its variability comparable with each other? We assess whether simulations and reconstructions are consistent under the paradigm of a statistically indistinguishable ensemble. Ensemble consistency is assessed for Northern Hemisphere mean temperature, Central European mean temperature and global temperature fields for the climate of the last millennium. Reconstructions available for these regions are evaluated against the simulation data from the community simulations of the climate of the last millennium performed at the Max Planck Institute for Meteorology. The distributions of ensemble-simulated temperatures are generally too wide at most locations and on most time-scales relative to the employed reconstructions. Similarly, an ensemble of reconstructions is too wide when evaluated against the simulation ensemble mean. Probabilistic and climatological ensemble consistency is limited to sub-domains and sub-periods. Only the ensemble-simulated and reconstructed annual Central European mean temperatures for the second half of the last millennium demonstrate consistency. The lack of consistency found in our analyses implies that, on the basis of the studied data sets, no status of truth can be assumed for climate evolutions on the considered spatial and temporal scales; thus, assessing the accuracy of reconstructions and simulations in pre-instrumental periods is so far of limited feasibility.
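The "statistically indistinguishable ensemble" paradigm is commonly tested with a rank (Talagrand) histogram. A minimal synthetic sketch (this illustrates the general diagnostic, not the authors' exact procedure):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a 15-member ensemble over 500 time steps, and a
# verification series drawn from the same distribution as the members.
ensemble = rng.normal(0.0, 1.0, size=(500, 15))
obs = rng.normal(0.0, 1.0, size=500)

# Rank of the observation among the members at each time step (0..15).
ranks = (ensemble < obs[:, None]).sum(axis=1)
hist = np.bincount(ranks, minlength=16)

# A flat histogram indicates consistency; a dome shape means the
# ensemble is too wide (as found here), a U shape means it is too narrow.
print(hist)
```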


2017 ◽  
Author(s):  
Sonja Molnos ◽  
Stefan Petri ◽  
Jascha Lehmann ◽  
Erik Peukert ◽  
Dim Coumou

Abstract. Climate and weather conditions in the mid-latitudes are strongly driven by the large-scale atmospheric circulation. Observational data indicate that important components of the large-scale circulation have changed in recent decades, including the strength of the Hadley cell, jet streams, storm tracks and planetary waves. The associated impacts cover a broad range, including changes in the frequency and nature of weather extremes and shifts of fertile habitats, with implications for biodiversity and agriculture. Dynamical theories have been proposed that link the shift of the poleward edge of the northern Hadley cell to changes in the meridional temperature gradient. Moreover, model simulations have been carried out to analyse the causes of observed and projected changes in the large-scale atmospheric circulation. However, the question of the underlying drivers, and particularly the possible role of global warming, is still debated. Here, we use a statistical-dynamical atmosphere model (SDAM) to analyse the sensitivity of the Northern Hemisphere Hadley cell, storm tracks, jet streams and planetary waves to changes in temperature fields by systematically altering the zonal and meridional temperature gradients as well as the global mean surface temperature.


Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 484 ◽ 
Author(s):  
Claudiu Vințe ◽  
Marcel Ausloos ◽  
Titus Felix Furtună

Grasping the historical volatility of stock market indices and accurately estimating it are two of the major focuses of those involved in the financial securities industry and derivative instruments pricing. This paper presents the results of employing the intrinsic entropy model as a substitute for estimating the volatility of stock market indices. Diverging from the widely used volatility models that take into account only the elements related to the traded prices, namely the open, high, low, and close prices of a trading day (OHLC), the intrinsic entropy model takes into account the volumes traded during the considered time frame as well. We adjust the intraday intrinsic entropy model that we introduced earlier for exchange-traded securities in order to connect daily OHLC prices with the ratio of the corresponding daily volume to the overall volume traded in the considered period. The intrinsic entropy model conceptualizes this ratio as an entropic probability, or market credence, assigned to the corresponding price level. The intrinsic entropy is computed using historical daily data for traded market indices (S&P 500, Dow 30, NYSE Composite, NASDAQ Composite, Nikkei 225, and Hang Seng Index). We compare the results produced by the intrinsic entropy model with the volatility estimates obtained for the same data sets using widely employed industry volatility estimators. The intrinsic entropy model proves to consistently deliver reliable estimates for various time frames, while showing peculiarly high values for the coefficient of variation, with the estimates falling in a significantly lower interval range compared with those provided by the other advanced volatility estimators.
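For context, one of the widely employed OHLC volatility estimators of the kind the intrinsic entropy results are benchmarked against is the Garman-Klass estimator. A minimal sketch with hypothetical prices (this is the standard Garman-Klass formula, not the paper's intrinsic entropy model):

```python
import math

def garman_klass_vol(days, trading_days=252):
    """Annualized Garman-Klass volatility from (open, high, low, close) tuples."""
    var_sum = 0.0
    for o, h, l, c in days:
        # Daily variance: 0.5*ln(H/L)^2 - (2*ln2 - 1)*ln(C/O)^2
        var_sum += (0.5 * math.log(h / l) ** 2
                    - (2.0 * math.log(2.0) - 1.0) * math.log(c / o) ** 2)
    return math.sqrt(var_sum / len(days) * trading_days)

# Hypothetical OHLC quotes for four trading days
sample = [(100.0, 102.0, 99.0, 101.0),
          (101.0, 103.5, 100.5, 102.5),
          (102.5, 103.0, 100.0, 100.5),
          (100.5, 101.5, 99.5, 101.0)]
print(garman_klass_vol(sample))
```

The intrinsic entropy model differs from range-based estimators like this one by weighting each day's price information with its share of the total traded volume.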


2021 ◽  
Author(s):  
Martin H. Trauth ◽  
Asfawossen Asrat ◽  
Nadine Berner ◽  
Faysal Bibi ◽  
Verena Foerster ◽  
...  

<p>The hypothesis of a connection between the onset (or intensification) of Northern Hemisphere Glaciation (NHG), the stepwise increase in African aridity (and climate variability) and an important mammalian (including hominin) species turnover is a textbook example of the initiation of a scientific idea and its propagation in science. It is, however, also an example of the persistent popularity of a hypothesis despite mounting evidence against it. The first part of our work analyzes the history of the scientific idea by seeking its roots, including coincidental meetings and exchanges between scientists at project meetings, conferences and workshops. The consequences of this idea are examined, together with its influence on subsequent scientific investigations both before and after it was falsified. In the second part of our investigation, we examine why the idea that the high latitudes exert a major control on the climate of the low latitudes, and thus on early human evolution, persists. For this purpose, an attempt is made to understand the original interpretation of the data, with special consideration of the composition of the scientific team and their scientific backgrounds and persuasions. Some of the key records in support of the hypothesis of a step-wise transition are statistically re-analyzed by fitting change-point models to the time series to determine the midpoint and duration of the transition – in case such a transition is found in the data. 
A critical review of key publications in support of such a connection and a statistical re-analysis of key data sets lead to three conclusions: (1) Northern Hemisphere Glaciation was a gradual process between ~3.5 and 2.5 Ma, not an abrupt onset at ~2.5 Ma, ~2.8 Ma, or any other time in the Late Cenozoic Era; (2) the trend towards greater aridity in Africa during this period was also gradual, not stepwise in the sense of a consistent transition with a duration of ≤0.2 Ma; and (3) accordingly, a step-wise change in environmental conditions cannot be used to explain an important mammalian (including hominin) species turnover.</p>
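Change-point fitting of the kind described above, recovering the midpoint and duration of a transition, can be sketched with a brute-force least-squares ramp fit on a synthetic series (a generic illustration, not the authors' specific change-point model):

```python
import numpy as np

def fit_ramp(t, y):
    """Fit y = a before t1, a linear ramp from a to b between t1 and t2,
    and b after t2, by brute-force least squares over breakpoint pairs."""
    best_sse, best_params = np.inf, None
    for i in range(1, len(t) - 2):
        for j in range(i + 1, len(t) - 1):
            t1, t2 = t[i], t[j]
            a, b = y[:i].mean(), y[j:].mean()
            pred = np.interp(t, [t[0], t1, t2, t[-1]], [a, a, b, b])
            sse = ((y - pred) ** 2).sum()
            if sse < best_sse:
                best_sse, best_params = sse, (t1, t2, a, b)
    return best_params

# Synthetic record: a gradual transition between x = 2.5 and x = 3.5
t = np.linspace(2.0, 4.0, 81)
y = np.interp(t, [2.5, 3.5], [0.0, 1.0])
t1, t2, a, b = fit_ramp(t, y)
print(t1, t2, (t1 + t2) / 2, t2 - t1)  # breakpoints, midpoint, duration
```

On noisy real records one would additionally compare the ramp model against an abrupt-step model (duration ≈ 0) via an information criterion, which is the essence of the step-versus-gradual question posed above.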


Author(s):  
Abou_el_ela Abdou Hussein

Advances in web technologies have led, day by day, to tremendous growth in the volumes of data generated. This mountain of huge and spread-out data sets leads to the phenomenon called big data: a collection of massive, heterogeneous, unstructured, enormous and complex data sets. The big data life cycle can be represented as collecting (capturing), storing, distributing, manipulating, interpreting, analyzing, investigating and visualizing big data. Traditional techniques such as Relational Database Management Systems (RDBMS) cannot handle big data because of their inherent limitations, so advances in computing architecture are required to handle both the data storage requisites and the heavy processing needed to analyze huge volumes and varieties of data economically. There are many technologies for manipulating big data; one of them is Hadoop. Hadoop can be understood as an open-source distributed data processing framework that is one of the prominent and well-known solutions to the problem of handling big data. Apache Hadoop is based on the Google File System and the MapReduce programming paradigm. In this paper we survey big data characteristics, starting from the first three V's, which researchers have extended over time to more than fifty-six V's, and compare the various formulations to reach the best representation and a precise clarification of all the big data V characteristics. We highlight the challenges that face big data processing and show how to overcome them using Hadoop, and we discuss its use in processing big data sets as a solution for resolving various problems in a distributed cloud-based environment. This paper mainly focuses on the different components of Hadoop, such as Hive, Pig and HBase. We also give a complete description of Hadoop's pros and cons, and of improvements that address Hadoop's problems, including a proposed cost-efficient scheduler algorithm for heterogeneous Hadoop systems.
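The MapReduce paradigm that Hadoop implements can be illustrated with a minimal in-memory word count: a map phase emits key-value pairs, a shuffle/sort groups them by key, and a reduce phase aggregates each group. This is plain Python mimicking the data flow, not the actual Hadoop API:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def reduce_phase(pairs):
    """Shuffle/sort the pairs by key, then reduce: sum the counts per word."""
    for word, group in groupby(sorted(pairs, key=itemgetter(0)),
                               key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

docs = ["big data needs big tools", "hadoop handles big data"]
print(dict(reduce_phase(map_phase(docs))))  # e.g. 'big' -> 3, 'data' -> 2
```

In a real Hadoop job the map tasks run in parallel across the cluster on blocks of the distributed file system, and the framework performs the shuffle/sort between the map and reduce stages.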


2013 ◽  
Vol 13 (22) ◽  
pp. 11221-11234 ◽  
Author(s):  
F. Arfeuille ◽  
B. P. Luo ◽  
P. Heckendorn ◽  
D. Weisenstein ◽  
J. X. Sheng ◽  
...  

Abstract. In terms of atmospheric impact, the volcanic eruption of Mt. Pinatubo (1991) is the best-characterized large eruption on record. We investigate here the model-derived stratospheric warming following the Pinatubo eruption as derived from SAGE II extinction data, including recent improvements in the processing algorithm. This method, termed SAGE_4λ, makes use of the four wavelengths (385, 452, 525 and 1024 nm) of the SAGE II data when available, and uses a data-filling procedure in the opacity-induced "gap" regions. Using SAGE_4λ, we derived aerosol size distributions that properly reproduce the extinction coefficients also at much longer wavelengths. This provides a good basis for calculating the absorption of terrestrial infrared radiation and the resulting stratospheric heating. However, we also show that the use of this data set in a global chemistry–climate model (CCM) still leads to stronger aerosol-induced stratospheric heating than observed, with temperatures in places even higher than the already-too-high values found by many models in recent general circulation model (GCM) and CCM intercomparisons. This suggests that the overestimation of the stratospheric warming after the Pinatubo eruption may be ascribed not to an insufficient observational database, but rather to the use of outdated data sets, to deficiencies in the implementation of the forcing data, or to radiative or dynamical model artifacts. Conversely, the SAGE_4λ approach reduces the infrared absorption in the tropical tropopause region, resulting in significantly better agreement with the post-volcanic temperature record at these altitudes.


2017 ◽  
Vol 13 (S335) ◽  
pp. 58-64 ◽  
Author(s):  
Hebe Cremades

Abstract. Sophisticated instrumentation dedicated to studying and monitoring our Sun's activity has proliferated in the past few decades, together with an increasing demand for specialized space weather forecasts that address the needs of commercial and government systems. As a result, theoretical and empirical models and techniques of increasing complexity have been developed, aimed at forecasting the occurrence of solar disturbances, their evolution, and their time of arrival at Earth. Here we review groundbreaking and recent methods to predict the propagation and evolution of coronal mass ejections and their driven shocks. The methods rely on a wealth of data sets provided by ground- and space-based observatories, involving remote-sensing observations of the corona and the heliosphere, as well as detections of radio waves.


2021 ◽  
Author(s):  
Beatrix Izsák ◽  
Mónika Lakatos ◽  
Rita Pongrácz ◽  
Tamás Szentimrey ◽  
Olivér Szentes

<p>Climate studies, in particular those related to climate change, require long, high-quality, controlled data sets that are representative both spatially and temporally. Changes in the conditions under which the measurements were taken, for example relocating the station, a change in the frequency and time of measurements, or a change in the instruments used, may result in a fractured time series. To avoid these problems, data errors and inhomogeneities are eliminated for Hungary and data gaps are filled in by using the MASH (Multiple Analysis of Series for Homogenization; Szentimrey) homogenization procedure. Homogenization of the data series raises the problem of how to homogenize long and short data series together within the same process, since the meteorological observation network was upgraded significantly in the last decades. The MASH method makes it possible to solve these problems thanks to its mathematical principles, which are adequate for such purposes. The solution includes the synchronization of the common parts’ inhomogeneities within three (or more) different MASH processings of the three (or more) data sets with different lengths. Then, the homogenized station data series are interpolated to the whole area of Hungary, on a 0.1-degree regular grid. For this purpose, the MISH (Meteorological Interpolation based on Surface Homogenized Data Basis; Szentimrey and Bihari) program system is used. The MISH procedure was developed specifically for the interpolation of various meteorological elements. Hungarian time series of daily average temperature and daily precipitation sums for the period 1870-2020 were used in this study, thus providing the longest homogenized, gridded daily data sets in the region with up-to-date information already included.</p><p><em>Supported by the ÚNKP-20-3 New National Excellence Program of the Ministry for Innovation and Technology from the source of the National Research, Development and Innovation Fund.</em></p>
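Interpolating station values onto a regular 0.1-degree grid can be illustrated with a generic inverse-distance-weighting sketch. The stations and values below are hypothetical, and MISH itself uses a more sophisticated method built on modelled climate statistics rather than simple distance weighting:

```python
import numpy as np

def idw(st_lon, st_lat, st_val, grid_lon, grid_lat, power=2.0):
    """Inverse-distance-weighted interpolation of station values to a grid."""
    out = np.empty((grid_lat.size, grid_lon.size))
    for i, la in enumerate(grid_lat):
        for j, lo in enumerate(grid_lon):
            d2 = (st_lon - lo) ** 2 + (st_lat - la) ** 2
            if d2.min() < 1e-12:          # grid point coincides with a station
                out[i, j] = st_val[d2.argmin()]
            else:
                w = 1.0 / d2 ** (power / 2.0)
                out[i, j] = (w * st_val).sum() / w.sum()
    return out

# Hypothetical stations (longitude, latitude, mean temperature in deg C)
st_lon = np.array([17.0, 19.0, 21.0])
st_lat = np.array([46.5, 47.5, 48.0])
st_val = np.array([11.2, 10.4, 9.8])

grid_lon = np.arange(16.0, 23.0, 0.1)    # 0.1-degree regular grid
grid_lat = np.arange(45.7, 48.6, 0.1)
field = idw(st_lon, st_lat, st_val, grid_lon, grid_lat)
print(field.shape, field.min(), field.max())
```

A useful property of this weighting scheme is that every gridded value stays within the range of the station values, which makes gross errors easy to spot.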

