The sdB pulsating star V391 Peg and its putative giant planet revisited after 13 years of time-series photometric data

2018 ◽  
Vol 611 ◽  
pp. A85 ◽  
Author(s):  
R. Silvotti ◽  
S. Schuh ◽  
S.-L. Kim ◽  
R. Lutz ◽  
M. Reed ◽  
...  

V391 Peg (alias HS 2201+2610) is a subdwarf B (sdB) pulsating star that shows both p- and g-modes. By studying the arrival times of the p-mode maxima and minima through the O–C method, a previous article inferred the presence of a planet with an orbital period of 3.2 years and a minimum mass of 3.2 M_Jup. Here we present an updated O–C analysis using a larger data set of 1066 h of photometric time series (~2.5× larger in terms of the number of data points), which covers the period between 1999 and 2012 (compared with 1999–2006 in the previous analysis). Up to the end of 2008, the new O–C diagram of the main pulsation frequency (f1) is compatible with (and improves) the previous two-component solution representing the long-term variation of the pulsation period (parabolic component) and the giant planet (sine-wave component). Since 2009, the O–C trend of f1 has changed, and the time derivative of the pulsation period (ṗ) has passed from positive to negative; the reason for this change of regime is not clear and could be related to nonlinear interactions between different pulsation modes. With the new data, the O–C diagram of the secondary pulsation frequency (f2) continues to show two components (parabola and sine wave), as in the previous analysis. Various solutions are proposed to fit the O–C diagrams of f1 and f2, but in all of them the sinusoidal components of f1 and f2 differ, or at least agree less well than before. The close agreement found previously was a coincidence due to various small effects, which are carefully analyzed here. Now, with a larger data set, the presence of a planet is more uncertain and would require confirmation with an independent method. The new data allow us to improve the measurement of ṗ for f1 and f2: using only the data up to the end of 2008, we obtain ṗ1 = (1.34 ± 0.04) × 10⁻¹² and ṗ2 = (1.62 ± 0.22) × 10⁻¹². The long-term variation of the two main pulsation periods (and the change of sign of ṗ1) is also visible in direct measurements made over several years. The absence of peaks near f1 in the Fourier transform and the secondary peak close to f2 confirm the previous identification of these modes as l = 0 and l = 1, respectively, and suggest a stellar rotation period of about 40 days. The new data also allow us to constrain the main g-mode pulsation periods of the star.
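For readers wanting to experiment with the approach, the two-component O–C solution described above (a parabola for the secular period change plus a sinusoid for the light-travel-time effect of a companion) can be reproduced with a simple least-squares fit. The sketch below is illustrative only, with invented cycle numbers, O–C values, and an assumed pulsation period; it is not the authors' code.

```python
import numpy as np
from scipy.optimize import curve_fit

def oc_model(E, t0, dP, c, A, Norb, phi):
    """Two-component O-C model as a function of pulsation cycle number E:
    a parabola (secular period change) plus a sinusoid (companion)."""
    return t0 + dP * E + c * E**2 + A * np.sin(2 * np.pi * E / Norb + phi)

# Invented cycle numbers and O-C values (seconds) for illustration only
rng = np.random.default_rng(1)
E = np.linspace(0, 1.2e6, 300)
oc_obs = oc_model(E, 0.0, 1e-6, 2e-13, 5.0, 2.9e5, 0.3) + rng.normal(0, 1.0, E.size)

popt, pcov = curve_fit(oc_model, E, oc_obs,
                       p0=[0.0, 1e-6, 1e-13, 4.0, 3.0e5, 0.0])

# The quadratic coefficient c relates to the period derivative via
# pdot = 2 * c / P, with P the pulsation period in seconds (assumed value here).
P = 349.5
print("pdot =", 2 * popt[2] / P)
```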

2021 ◽  
Author(s):  
Annette Dietmaier ◽  
Thomas Baumann

The European Water Framework Directive (WFD) commits EU member states to achieve a good qualitative and quantitative status of all their water bodies. The WFD provides a list of actions to be taken to achieve the goal of good status. However, this list disregards the specific conditions under which deep (> 400 m b.g.l.) groundwater aquifers form and exist. In particular, deep groundwater fluid composition is influenced by interaction with the rock matrix and other geofluids, and may assume a bad status even without anthropogenic influences. Thus, a new concept for monitoring and modelling this specific kind of aquifer is needed, and their status evaluation must be based on the effects induced by their exploitation. Here, we analyze long-term real-life production data series to detect changes in the hydrochemical deep groundwater characteristics that might be triggered by balneological and geothermal exploitation. We aim to use these insights to design a set of criteria with which the status of deep groundwater aquifers can be determined quantitatively and qualitatively. Our analysis is based on a unique long-term hydrochemical data set from 8 balneological and geothermal sites in the molasse basin of Lower Bavaria, Germany, and Upper Austria, and is focused on a predefined set of annual hydrochemical concentration values. The record dates back to 1937. Our methods include the development of threshold corridors, within which a good status can be assumed, as well as cluster, correlation, and Piper diagram analyses. We observed strong fluctuations in the hydrochemical characteristics of the molasse basin deep groundwater during the last decades. Special interest is placed on fluctuations that have a clear start and end date and appear to be correlated with other exploitation activities in the region. For example, between 1990 and 2020, bicarbonate and sodium values at site F displayed a clear increase, followed by a distinct dip to below-average values and a subsequent return to average values; during the same period, these values showed striking irregularities at site B. Furthermore, we observed fluctuations at several locations that come close to disqualifying quality thresholds commonly used in German balneology. Our preliminary results demonstrate the importance of long-term (multi-decade) time series analysis for quality and quantity assessments of deep groundwater bodies: most fluctuations would remain undetected within a < 5-year time series window, but become distinct irregularities when viewed in the context of multiple decades. In the next steps, a quality assessment matrix and threshold corridors will be developed that take methods to identify these fluctuations into account. This will ultimately aid in assessing the sustainability of deep groundwater exploitation and reservoir management for balneological and geothermal uses.
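The threshold corridor idea mentioned above can be prototyped in a few lines: flag annual values that leave a band around a reference-period baseline. The corridor definition used here (mean ± 2 standard deviations of a reference period) and all concentration values are assumptions for illustration, not the study's actual criteria.

```python
import pandas as pd

def flag_corridor_exceedances(series: pd.Series, ref_start: int, ref_end: int,
                              k: float = 2.0) -> pd.Series:
    """Flag annual values outside a corridor of mean +/- k * std,
    where the corridor is defined from a reference period (years)."""
    ref = series.loc[ref_start:ref_end]
    lower, upper = ref.mean() - k * ref.std(), ref.mean() + k * ref.std()
    return (series < lower) | (series > upper)

# Hypothetical annual sodium concentrations (mg/L), indexed by year
na = pd.Series({1990: 410, 1995: 415, 2000: 470, 2005: 430,
                2010: 380, 2015: 412, 2020: 418})
print(flag_corridor_exceedances(na, 1990, 2020))
```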


Open Physics ◽  
2009 ◽  
Vol 7 (3) ◽  
Author(s):  
Shahriar Shadkhoo ◽  
Fakhteh Ghanbarnejad ◽  
Gholam Jafari ◽  
Mohammad Tabar

Abstract. In this paper, we investigate the statistical and scaling properties of the inter-event times of California earthquakes over the past 40 years. To detect long-term correlation behavior, we apply detrended fluctuation analysis (DFA), which can systematically detect and overcome nonstationarities in the data set at all time scales. We calculate the Hurst exponent for events with magnitudes larger than a given threshold M. The results indicate that the Hurst exponent decreases with increasing M, following H = 0.34 + 1.53/M, so that for events with very large magnitudes M the Hurst exponent decreases towards 0.50, the value expected for independent events.
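Detrended fluctuation analysis itself is straightforward to implement. The sketch below is a minimal DFA1 (linear detrending) estimate of the scaling exponent on synthetic, uncorrelated data (which should give H close to 0.5); it does not reproduce the authors' preprocessing of the earthquake inter-event times.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis (DFA1): fluctuation function F(s)
    for each window size s, using linear detrending per window."""
    y = np.cumsum(x - np.mean(x))                 # profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)          # local linear trend
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

# Synthetic series standing in for inter-event times
x = np.random.normal(size=10000)
scales = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(x, scales)
H = np.polyfit(np.log(scales), np.log(F), 1)[0]   # scaling (Hurst) exponent
print(f"Estimated Hurst exponent: {H:.2f}")
```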


2007 ◽  
Vol 7 (4) ◽  
pp. 11761-11796 ◽  
Author(s):  
S. Mieruch ◽  
S. Noël ◽  
H. Bovensmann ◽  
J. P. Burrows

Abstract. Global water vapour total column amounts have been retrieved from spectral data provided by the Global Ozone Monitoring Experiment (GOME), flying on ERS-2, which was launched in April 1995, and the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) onboard ENVISAT, launched in March 2002. For this purpose, the Air Mass Corrected Differential Optical Absorption Spectroscopy (AMC-DOAS) approach has been used. The combination of the data from both instruments provides us with a long-term global data set spanning more than 11 years, with the potential of extension up to 2020 by GOME-2 data on MetOp. Using linear and non-linear methods from time series analysis and standard statistics, the trends of H2O contents and their errors have been calculated. In this study, factors affecting the trend, such as the length of the time series, the magnitude of the variability of the noise, and the autocorrelation of the noise, are investigated. Special emphasis has been placed on the calculation of the statistical significance of the observed trends, which reveal significant local changes of water vapour columns distributed over the whole globe.
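The dependence of trend significance on record length, noise variability, and noise autocorrelation can be made concrete with the widely used first-order autoregressive approximation for the standard error of a linear trend. The sketch below uses assumed numbers and is not the authors' implementation.

```python
import numpy as np

def trend_std_error(sigma_noise, phi, n_years):
    """Approximate 1-sigma uncertainty of a linear trend (per year) for a
    monthly time series, with sigma_noise the standard deviation of the
    monthly noise and phi its lag-1 autocorrelation:
    sigma_trend ~ sigma_noise / n^(3/2) * sqrt((1 + phi) / (1 - phi))."""
    return sigma_noise / n_years**1.5 * np.sqrt((1 + phi) / (1 - phi))

# Assumed values: noise of 5 (e.g. % of column), moderate autocorrelation,
# an 11-year record as in the combined GOME/SCIAMACHY data set
print(trend_std_error(sigma_noise=5.0, phi=0.3, n_years=11.0))
```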


2018 ◽  
Author(s):  
Coline Mollaret ◽  
Christin Hilbich ◽  
Cécile Pellet ◽  
Adrian Flores-Orozco ◽  
Reynald Delaloye ◽  
...  

Abstract. Mountain permafrost is sensitive to climate change and is expected to gradually degrade in response to the ongoing atmospheric warming trend. Long-term monitoring of the permafrost thermal state is a key task, but it is problematic where temperatures are close to 0 °C, because the energy exchange is then often dominated by latent heat effects associated with phase change (ice/water) rather than by ground warming or cooling. Consequently, it is difficult to detect the significant spatio-temporal variations of ground properties (e.g. ice-water ratio) that occur during the freezing/thawing process with point-scale temperature monitoring alone. Hence, electrical methods have become popular in permafrost investigations, as the resistivities of ice and water differ by several orders of magnitude, theoretically allowing a clear distinction between frozen and unfrozen ground. In this study, we present an assessment of mountain permafrost evolution using long-term electrical resistivity tomography monitoring (ERTM) from a network of permanent sites in the Central Alps. The time series consist of more than 1000 data sets from six sites, where resistivities have been measured on a regular basis for up to twenty years. We identify systematic sources of error and apply automatic filtering procedures during data processing. To constrain the interpretation of the results, we analyse the inversion results and long-term resistivity changes in comparison with existing borehole temperature time series. Our results show that the resistivity data set provides the most valuable insights at the melting point. A prominent permafrost degradation trend is evident for the longest time series (19 years), but it is also detectable for shorter time series (about a decade) at most sites. In spite of the wide range of morphological, climatological, and geological differences between the sites, the observed inter-annual resistivity changes and long-term tendencies are similar for all sites of the network.
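A common way to inspect ERTM time series like these is to express each inverted survey as a percentage change relative to a baseline survey. The few lines below illustrate only that bookkeeping step on invented resistivities; they are not the processing and filtering chain used in the study.

```python
import numpy as np

def relative_change(rho, baseline_index=0):
    """Resistivity change (%) of each survey relative to a baseline survey.
    rho: 2-D array of inverted resistivities, shape (n_surveys, n_model_cells)."""
    baseline = rho[baseline_index]
    return 100.0 * (rho - baseline) / baseline

# Invented inverted resistivities (ohm.m) for 3 surveys and 4 model cells
rho = np.array([[1.0e4, 2.0e4, 5.0e3, 8.0e3],
                [0.9e4, 1.8e4, 5.1e3, 7.5e3],
                [0.8e4, 1.6e4, 5.2e3, 7.0e3]])
print(relative_change(rho))
```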


2021 ◽  
Vol 39 (4) ◽  
pp. 627-640
Author(s):  
Attila Buzás ◽  
Veronika Barta ◽  
Tamás Horváth ◽  
József Bór

Abstract. In 2003, a decreasing trend was reported in the long-term (1962–2001) fair-weather atmospheric electric potential gradient (PG) measured at the Széchenyi István Geophysical Observatory (NCK; 47°38′ N, 16°43′ E), Hungary, Central Europe. The origin of this reduction has been the subject of a long-standing debate, because a group of trees near the measurement site has grown to a significant height since the measurements started. Those trees have contributed to the lowering of the ambient vertical electric field through their electrostatic shielding effect. In the present study, we attempt to reconstruct the true long-term variation of the vertical atmospheric electric field at NCK. The time-dependent shielding effect of the trees at the measurement site was calculated to remove the corresponding bias from the recorded time series. A numerical model based on electrostatic theory was set up to take into account the electrostatic shielding of the local environment, and the validity of the model was verified by on-site measurement campaigns. The height of the trees in each year between 1962 and 2017 was derived from national-average age–height diagrams. Modelling the time-dependent electrical shielding effect of the trees at NCK revealed that local effects played a pivotal role in the long-term decrease. The results suggest that earlier attempts could not quantify the shielding effect of the trees at NCK accurately. In this work, it is found that the reconstructed PG time series at NCK exhibits an increase between 1962 and 1997, followed by a decaying trend since 1997. It is pointed out that long-term variations in summertime and wintertime PG averages should be analysed separately, as these may contribute rather differently to trends in the annual mean values.
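The reconstruction step described above amounts to dividing the measured PG by a time-dependent shielding factor derived from the tree height in each year. The sketch below illustrates that bookkeeping with a purely invented shielding model and invented PG values; the actual factors in the study come from the electrostatic numerical model and the national-average growth curves.

```python
import numpy as np

def reconstruct_pg(pg_measured, tree_height, shielding_model):
    """Remove the electrostatic shielding bias of nearby trees from measured
    potential gradient (PG) values. shielding_model maps tree height (m) to a
    dimensionless reduction factor (0 < f <= 1)."""
    f = np.array([shielding_model(h) for h in tree_height])
    return pg_measured / f

# Purely illustrative shielding model: taller trees shield more strongly
toy_model = lambda h: max(0.2, 1.0 - 0.02 * h)

years = np.arange(1962, 1968)
height = 0.5 * (years - 1962)                              # assumed growth (m)
pg_meas = np.array([120., 118., 117., 115., 114., 112.])   # V/m, invented
print(reconstruct_pg(pg_meas, height, toy_model))
```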


2020 ◽  
Author(s):  
Simone T. Andersen ◽  
Lucy J. Carpenter ◽  
Beth S. Nelson ◽  
Luis Neves ◽  
Katie A. Read ◽  
...  

Abstract. Atmospheric nitrogen oxides (NO + NO2 = NOx) have been measured at the Cape Verde Atmospheric Observatory (CVAO) in the tropical Atlantic (16° 51' N, 24° 52' W) since October 2006. These measurements represent a unique time series of NOx in the background remote troposphere. Nitrogen dioxide (NO2) is measured via photolytic conversion to nitric oxide (NO) by ultraviolet light-emitting diode arrays, followed by chemiluminescence detection. Since the measurements began, a blue light converter (BLC) has been used for NO2 photolysis, with a maximum spectral output of 395 nm from 2006 to 2015 and of 385 nm from 2015 onwards. The original BLC was constructed with a Teflon-like material and appeared to cause an overestimation of NO2 when illuminated. To avoid such interferences, an additional photolytic converter (PLC) with a quartz photolysis cell (maximum spectral output also 385 nm) was implemented in March 2017. Once corrections are made for the NO2 artefact from the original BLC, the two NO2 converters are shown to give comparable NO2 mixing ratios (PLC = 0.92 × BLC, R² = 0.92), giving confidence in the quantitative measurement of NOx at very low levels. Data analysis methods for the NOx measurements made at CVAO have been developed and applied to the entire time series to produce an internally consistent, high-quality long-term data set. NO has a clear diurnal pattern, with a maximum mixing ratio of 2–10 pptV during the day, depending on the season, and ~0 pptV during the night. NO2 shows a fairly flat diurnal signal, although a small increase in daytime NOx is evident in some months. Monthly average mixing ratios of NO2 vary between 5 and 30 pptV depending on the season. Clear seasonal trends in NO and NO2 levels can be observed, with a maximum in autumn/winter and a minimum in spring/summer.
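The comparability figure quoted above (PLC = 0.92 × BLC, R² = 0.92) is the kind of result obtained from a simple regression of coincident measurements from the two converters. A minimal sketch with invented mixing ratios, not the observatory's actual comparison data:

```python
import numpy as np

# Invented coincident NO2 mixing ratios (pptV) from the two converters
blc = np.array([12.0, 18.5, 25.0, 9.5, 30.2, 15.1, 22.4])
plc = np.array([11.0, 17.2, 23.1, 8.6, 27.9, 13.8, 20.5])

# Slope of a regression forced through the origin (PLC = slope * BLC)
slope = np.sum(blc * plc) / np.sum(blc ** 2)

# Coefficient of determination of an ordinary least-squares fit
p = np.polyfit(blc, plc, 1)
resid = plc - np.polyval(p, blc)
r2 = 1.0 - np.sum(resid ** 2) / np.sum((plc - plc.mean()) ** 2)
print(f"slope through origin = {slope:.2f}, R^2 = {r2:.2f}")
```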


2020 ◽  
Author(s):  
Maria Staudinger ◽  
Stefan Seeger ◽  
Barbara Herbstritt ◽  
Michael Stoelzle ◽  
Jan Seibert ◽  
...  

Abstract. The stable isotopes of oxygen and hydrogen, 2H and 18O, provide information on water flow pathways and hydrologic catchment functioning. Here we present CH-IRP, a data set of precipitation and streamflow isotope composition time series for Swiss medium-sized catchments that is unique in terms of its long-term multi-catchment coverage along an alpine to pre-alpine gradient. The data set comprises fortnightly time series of δ2H and δ18O as well as deuterium excess in streamflow for 23 sites in Switzerland, together with summary statistics of the sampling at each station. Furthermore, time series of δ18O and δ2H in precipitation are provided for each catchment, derived from interpolated data sets of the NISOT, GNIP and ANIP networks. For each station we compiled relevant metadata describing both the sampling conditions and the catchment characteristics and climate information. Lab standards and errors are provided, and potentially problematic measurements are flagged to help users decide on the applicability of the data for individual study purposes. It is planned that the measurements will be continued at 14 stations as a long-term isotopic measurement network, so that the CH-IRP data set will be continuously extended. The data set can be downloaded from the data repository Zenodo at https://doi.org/10.5281/zenodo.3659679 (Staudinger et al., 2020).
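Deuterium excess, included in the streamflow time series, follows directly from the two delta values (d = δ2H − 8 δ18O). A minimal sketch with invented sample values and assumed column names:

```python
import pandas as pd

def deuterium_excess(d2h: pd.Series, d18o: pd.Series) -> pd.Series:
    """Deuterium excess (per mil): d = delta2H - 8 * delta18O."""
    return d2h - 8.0 * d18o

# Invented fortnightly streamflow samples (per mil, VSMOW); column names assumed
df = pd.DataFrame({"d2H": [-72.5, -70.1, -68.9],
                   "d18O": [-10.2, -9.8, -9.5]})
df["d_excess"] = deuterium_excess(df["d2H"], df["d18O"])
print(df)
```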


2020 ◽  
Author(s):  
Kai-Lan Chang ◽  
Owen R Cooper ◽  
Audrey Gaudel ◽  
Irina Petropavlovskikh ◽  
Valerie Thouret

Abstract. Detecting a tropospheric ozone trend from sparsely sampled ozonesonde profiles (typically once per week) is challenging due to the noise in the time series resulting from ozone's high temporal variability. To enhance trend detection, we have developed a sophisticated statistical approach that utilizes a geoadditive model to assess ozone variability across a time series of vertical profiles. Treating the profile time series as a set of individual time series on discrete pressure surfaces, a class of smoothing-spline ANOVA (analysis of variance) models is used to jointly model multiple correlated time series (on separate pressure surfaces) through their associated seasonal and interannual variabilities. This integrated fit method filters out the unstructured noise through a statistical regularization (i.e. a roughness penalty), by taking advantage of the additional correlated data points available on the pressure surfaces above and below the surface of interest. We have applied this technique to the trend analysis of the vertically correlated time series of tropospheric ozone observations from (1) IAGOS (In-service Aircraft for a Global Observing System) commercial aircraft profiles above Europe and China, and (2) NOAA Global Monitoring Division (GMD) ozonesonde records at Hilo, Hawaii, and Trinidad Head, California. We illustrate the ability of this technique to produce a consistent trend estimate and its effectiveness in reducing the uncertainty associated with noisy, sparsely sampled profile data. We also conducted a sensitivity analysis of the frequent IAGOS profiles above Europe (approximately 120 profiles per month) to determine how many profiles per month are required for reliable long-term trend detection. When the vertical correlation is ignored, we found that a typical sampling strategy of 4 profiles per month results in 7 % of sampled trends falling outside the 2-sigma uncertainty interval derived from the full data set, with an associated mean absolute percentage error of 10 %. We determined that the optimal sampling frequency is 14 profiles per month when the integrated fit method is used for calculating trends; when the integrated fit method is not applied, the sampling frequency has to be increased to 18 profiles per month to achieve the same result. While our method improves trend detection from sparse data sets, the key to substantially reducing the uncertainty is to increase the sampling frequency.
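The sampling-frequency experiment described above can be emulated in a brute-force way by repeatedly subsampling a dense profile record and checking how often the resulting trend falls outside the 2-sigma interval of the full-data trend. The sketch below does this for a single pressure level with entirely invented data; it does not use the geoadditive (integrated fit) model itself.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Invented 20-year record of ~120 profiles per month at one pressure level
n_months, per_month = 240, 120
months = np.arange(n_months)
true_trend = 0.02                          # ppb per month, assumed
full = true_trend * months[:, None] + rng.normal(0, 5, (n_months, per_month))

full_fit = stats.linregress(months, full.mean(axis=1))
ci = 2 * full_fit.stderr                   # 2-sigma interval of the full-data trend

def outside_fraction(k, n_trials=200):
    """Fraction of trials in which a k-profiles-per-month trend falls
    outside the 2-sigma interval of the full-data trend."""
    hits = 0
    for _ in range(n_trials):
        sub = np.array([rng.choice(full[m], k, replace=False).mean()
                        for m in range(n_months)])
        slope = stats.linregress(months, sub).slope
        hits += abs(slope - full_fit.slope) > ci
    return hits / n_trials

for k in (4, 14, 18):
    print(k, outside_fraction(k))
```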


2021 ◽  
Author(s):  
Ole Einar Tveito

Many applications, including the estimation of climate normals, require long, continuous, and preferably homogeneous time series. Many observation series do not meet these requirements, especially due to the modernisation and automation of the observation network. Despite the lack of long series, there is still a need to provide climate parameters representing a longer period than is available. A current problem is the calculation of new standard climate normals for the 1991-2020 period, where normal values also need to be assigned to observation series that do not meet the WMO requirements for estimating climate normals from observations.

One possible approach to estimating monthly time series is to extract values from gridded climate anomaly fields. In this study, this approach is applied to complete time series that will form the basis for the calculation of long-term reference values.

The calculation of the long-term time series is a two-step procedure. First, monthly anomaly grids based on homogenised data series are produced. The homogenised series provide more stable and reliable spatial estimates than non-homogenised data. The homogenised data set is also complete, ensuring a spatially consistent input throughout the analysis period 1991-2020.

The monthly anomalies for the location of the series to be completed are then extracted from the gridded fields. By combining the interpolated anomalies with the observations, the long-term mean value can be estimated. The study shows that this approach provides reliable estimates of long-term values, even with just a few events for calibration. The precision of the estimates depends more on the representativity of the grid estimates than on the length of the observation series. At locations where the anomaly grids represent the spatial climate variability well, stable estimates are achieved. On the other hand, at locations where the anomaly grids are less accurate due to sparse data coverage or steep climate gradients, the estimates show larger variability and are thus more uncertain.
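The core of the procedure, once the gridded anomalies have been extracted at the station location, is a simple combination of the short observation series with the corresponding anomalies. A minimal sketch, with invented January temperatures and anomalies:

```python
import numpy as np

def estimate_normal(observations, grid_anomalies):
    """Estimate a long-term monthly mean from a short observation series.
    observations: station values for the available months;
    grid_anomalies: anomalies (relative to the target normal period) extracted
    from the gridded fields at the station location for the same months.
    The normal is estimated as the mean of (observation - anomaly)."""
    observations = np.asarray(observations)
    grid_anomalies = np.asarray(grid_anomalies)
    return np.mean(observations - grid_anomalies)

# Invented January temperatures (deg C) for a station with only 5 years of data
obs = [-3.1, -6.4, -2.0, -4.8, -5.5]
anom = [0.9, -2.3, 2.1, -0.6, -1.4]    # gridded anomalies relative to 1991-2020
print(estimate_normal(obs, anom))       # estimated 1991-2020 January normal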

