Closeness-centrality-correlation for detecting interdependency between coupled systems

2021 ◽  
pp. 2150216
Author(s):  
Guangyu Yang ◽  
Daolin Xu ◽  
Haicheng Zhang

Generalized synchronization is a common form of interdependency between coupled systems that arises in many branches of the life, social and physical sciences. In this paper, a novel method, called closeness-centrality-correlation, is proposed for the detection of this interdependency. The proposed method is based on a global network measure (i.e., closeness centrality) of recurrence networks constructed from time series. We illustrate the feasibility of the proposed method using a paradigmatic coupled model and compare its performance to other commonly used interdependency methods. The numerical results show that the proposed method detects interdependency reliably and outperforms the existing joint-probability-of-recurrence method, especially when the dynamics of the two coupled subsystems are significantly different. Moreover, by analyzing time series contaminated with white noise, we demonstrate that our method is robust against white noise. Finally, an application to recorded electroencephalogram data shows that the proposed measure detects transitions of the interdependencies among the noisy electroencephalogram time series more reliably and thus provides a longer pre-warning time for the onset of epilepsy.
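The sketch below is a minimal, hedged illustration of the general idea rather than the authors' reference implementation: build an epsilon-recurrence network from each series, compute the closeness centrality of every node, and correlate the two centrality sequences as a proxy for interdependency. The threshold `eps`, the omission of time-delay embedding, and the use of Pearson correlation are simplifying assumptions made here for brevity.

```python
import numpy as np
import networkx as nx
from scipy.spatial.distance import pdist, squareform
from scipy.stats import pearsonr

def closeness_sequence(x, eps):
    """Closeness centrality of each time point in the recurrence network of x."""
    d = squareform(pdist(x.reshape(-1, 1)))   # pairwise distances (no embedding, for brevity)
    adj = (d < eps).astype(int)
    np.fill_diagonal(adj, 0)                  # no self-loops
    g = nx.from_numpy_array(adj)
    cc = nx.closeness_centrality(g)
    return np.array([cc[i] for i in range(len(x))])

def closeness_centrality_correlation(x, y, eps_x, eps_y):
    """Correlate the closeness sequences of two coupled systems."""
    cx = closeness_sequence(x, eps_x)
    cy = closeness_sequence(y, eps_y)
    r, _ = pearsonr(cx, cy)
    return r
```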

2020 ◽  
Author(s):  
Yener Turen ◽  
Dogan Ugur Sanli

In this study, we assess the accuracy of deformation rates derived from GNSS campaign measurements sampled at different frequencies. The ideal sampling frequency appears to be one measurement per month; however, this is usually found to be cumbersome, so sampling of three measurements per year was used instead and time series analyses were carried out. We used continuous GPS time series from JPL, NASA for a global network of IGS stations and decimated the data to four-monthly synthetic GNSS campaign time series. The minimum data period was taken to be 4 years, following suggestions from the literature. Furthermore, the effect of antenna set-up errors in campaign measurements on the estimated trend was taken into account. The accuracy of the deformation rates was then determined, taking the site velocities from the ITRF14 solution as the truth. The RMS of monthly velocities agreed well with the white-noise error from global studies previously reported in the literature. The RMS of four-monthly deformation rates for horizontal positioning was 0.45 and 0.50 mm/yr for the north and east components respectively, whereas the accuracy of vertical deformation rates was found to be 1.73 mm/yr. This is slightly greater than the average level of the white-noise error from a previously produced global solution, in which antenna set-up errors were not considered. Antenna set-up errors in campaign measurements raised the error level to 0.75 and 0.70 mm/yr for the north and east horizontal components respectively, whereas the accuracy of the vertical component shifted slightly to 1.79 mm/yr.
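As an illustrative sketch of the decimation-and-trend step described above (not the authors' processing chain), the snippet below keeps one epoch roughly every four months from a daily position series, fits a linear trend by least squares, and compares the slope with a reference velocity such as an ITRF14 value. The synthetic data, the 120-day spacing, and the noise level are assumptions for illustration only.

```python
import numpy as np

def decimate_to_campaigns(t_years, pos_mm, every_n_days=120):
    """Keep one daily epoch every `every_n_days` days (about 3 campaigns per year)."""
    idx = np.arange(0, len(t_years), every_n_days)
    return t_years[idx], pos_mm[idx]

def velocity_mm_per_yr(t_years, pos_mm):
    """Least-squares linear trend (slope) of the position series."""
    slope, _intercept = np.polyfit(t_years, pos_mm, 1)
    return slope

# Example: 4 years of synthetic daily data with a 2 mm/yr trend plus white noise
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / 365.25)
pos = 2.0 * t + rng.normal(0, 3, t.size)        # white noise, sigma = 3 mm
tc, pc = decimate_to_campaigns(t, pos)
vel = velocity_mm_per_yr(tc, pc)
error_vs_truth = abs(vel - 2.0)                 # deviation from the "true" rate
```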


2012 ◽  
Vol 8 (1) ◽  
pp. 89-115 ◽  
Author(s):  
V. K. C. Venema ◽  
O. Mestre ◽  
E. Aguilar ◽  
I. Auer ◽  
J. A. Guijarro ◽  
...  

Abstract. The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed using both the individual station series and the network-average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.
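The following is a hedged sketch of two of the performance metrics named above, the centered root mean square error against the true homogeneous series and the error in the fitted linear trend, in a simplified station-scale form; the benchmark's exact averaging scales and conventions are not reproduced here.

```python
import numpy as np

def centered_rmse(homogenized, truth):
    """RMSE after removing each series' own mean (simplified station-scale version)."""
    a = homogenized - homogenized.mean()
    b = truth - truth.mean()
    return np.sqrt(np.mean((a - b) ** 2))

def trend_error(homogenized, truth, t):
    """Difference between the linear trends of the homogenized and true series."""
    slope_h = np.polyfit(t, homogenized, 1)[0]
    slope_t = np.polyfit(t, truth, 1)[0]
    return slope_h - slope_t
```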


1998 ◽  
Vol 28 (1) ◽  
pp. 77-93 ◽  
Author(s):  
Terence Chan

Abstract. This paper presents a continuous time version of a stochastic investment model originally due to Wilkie. The model is constructed via stochastic differential equations. Explicit distributions are obtained in the case where the SDEs are driven by Brownian motion, which is the continuous time analogue of the time series with white noise residuals considered by Wilkie. In addition, the cases where the driving "noise" is a stable process or a Gamma process are considered.
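For illustration only (this is not the full Wilkie model): the continuous-time analogue of an AR(1) series with white-noise residuals is an Ornstein-Uhlenbeck SDE, dX_t = -a (X_t - mu) dt + sigma dW_t, which the sketch below simulates with a plain Euler-Maruyama scheme. The parameter values are arbitrary assumptions.

```python
import numpy as np

def simulate_ou(x0, a, mu, sigma, dt, n_steps, rng=None):
    """Euler-Maruyama simulation of dX = -a (X - mu) dt + sigma dW."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))    # Brownian increment
        x[k + 1] = x[k] - a * (x[k] - mu) * dt + sigma * dw
    return x

# Example path: daily steps over roughly ten "years"
path = simulate_ou(x0=0.04, a=0.6, mu=0.04, sigma=0.01, dt=1 / 252, n_steps=2520)
```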


2017 ◽  
Author(s):  
Miao Jing ◽  
Falk Heße ◽  
Wenqing Wang ◽  
Thomas Fischer ◽  
Marc Walther ◽  
...  

Abstract. Most current large-scale hydrological models do not contain a physically based groundwater flow component. The main difficulties in large-scale groundwater modeling include the efficient representation of unsaturated-zone flow, the characterization of dynamic groundwater-surface water interaction, and numerical stability while preserving complex physical processes and high resolution. To address these problems, we propose a highly scalable coupled hydrologic and groundwater model (mHM#OGS) based on the integration of two open-source modeling codes: the mesoscale Hydrologic Model (mHM) and the finite element simulator OpenGeoSys (OGS). mHM#OGS is coupled using a boundary-condition-based coupling scheme that dynamically links the surface and subsurface parts. Nested time stepping allows smaller time steps for the typically faster surface runoff routing in mHM and larger time steps for the slower subsurface flow in OGS. mHM#OGS features a coupling interface that transfers the groundwater recharge and river baseflow rate between mHM and OpenGeoSys. Verification of the coupled model was conducted using time series of observed streamflow and groundwater levels. Moreover, we force the transient model with groundwater recharge in two scenarios: (1) spatially variable recharge based on the mHM simulations, and (2) spatially homogeneous groundwater recharge. The modeling result in the first scenario has a slightly higher correlation with the groundwater head time series, which further validates the plausibility of the spatial groundwater recharge distribution calculated by mHM at the mesoscale. The statistical analysis of model predictions shows a promising predictive ability of the model. The offline coupling method implemented here can reproduce reasonable groundwater head time series while keeping a desired level of detail in the subsurface model structure at little extra computational cost. Our exemplary calculations show that the coupled model mHM#OGS can be a valuable tool to assess the effects of variability in land surface heterogeneity, meteorological and topographical forcings, and geological zonation on groundwater flow dynamics.
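As a conceptual sketch of the nested time stepping and boundary-condition-based coupling described above (not the mHM#OGS interface itself): several small surface-routing steps are taken per larger subsurface step, the accumulated recharge is passed to the groundwater model as a boundary condition, and the returned baseflow is fed back to the surface model. The objects `surface_model` and `subsurface_model`, their methods, and the flux names are hypothetical placeholders.

```python
def run_coupled(surface_model, subsurface_model, n_outer, inner_per_outer):
    """Offline coupling loop with nested time stepping (illustrative only)."""
    for _ in range(n_outer):                          # large subsurface steps
        recharge_sum = 0.0
        for _ in range(inner_per_outer):              # small surface steps (runoff routing)
            fluxes = surface_model.step()             # e.g. an mHM-like surface step
            recharge_sum += fluxes["groundwater_recharge"]
        # Pass mean recharge as a boundary condition to the groundwater model
        # (e.g. an OGS-like subsurface step) and feed the baseflow back.
        baseflow = subsurface_model.step(recharge_bc=recharge_sum / inner_per_outer)
        surface_model.set_baseflow(baseflow)
```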


2001 ◽  
Vol 38 (A) ◽  
pp. 105-121
Author(s):  
Robert B. Davies

A time-series consisting of white noise plus Brownian motion sampled at equal intervals of time is exactly orthogonalized by a discrete cosine transform (DCT-II). This paper explores the properties of a version of spectral analysis based on the discrete cosine transform and its use in distinguishing between a stationary time-series and an integrated (unit root) time-series.
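A minimal sketch of this idea is given below: a regularly sampled series of white noise plus Brownian motion is transformed with the DCT-II, whose coefficients the paper shows to be exactly orthogonalized for this model, and the excess power in the lowest-frequency coefficients relative to the flat white-noise floor signals the integrated (unit-root) component. The synthetic data, the coefficient ranges, and the threshold are illustrative assumptions, not the paper's test statistic.

```python
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(1)
n = 1024
white = rng.normal(0, 1, n)
brownian = np.cumsum(rng.normal(0, 0.2, n))      # integrated (unit-root) component
series = white + brownian

coeffs = dct(series, type=2, norm="ortho")       # DCT-II "spectrum"
power = coeffs ** 2
# A unit-root component inflates the lowest-frequency DCT coefficients
# relative to the roughly flat white-noise floor at high frequencies.
low = power[1:17].mean()
high = power[n // 2:].mean()
looks_integrated = low / high > 10               # illustrative decision rule
```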


NeuroImage ◽  
2007 ◽  
Vol 36 (2) ◽  
pp. 282-288 ◽  
Author(s):  
Andrew T. Smith ◽  
Krishna D. Singh ◽  
Joshua H. Balsters
