Quantifying Statistical Interdependence, Part III: N > 2 Point Processes

2012 ◽  
Vol 24 (2) ◽  
pp. 408-454 ◽  
Author(s):  
Justin Dauwels ◽  
Theophane Weber ◽  
François Vialatte ◽  
Toshimitsu Musha ◽  
Andrzej Cichocki

Stochastic event synchrony (SES) is a recently proposed family of similarity measures. First, “events” are extracted from the given signals; next, one tries to align events across the different time series. The better the alignment, the more similar the N time series are considered to be. The similarity measures quantify the reliability of the events (the fraction of “nonaligned” events) and the timing precision. So far, SES has been developed for pairs of one-dimensional (Part I) and multidimensional (Part II) point processes. In this letter (Part III), SES is extended from pairs of signals to N > 2 signals. The alignment and SES parameters are again determined through statistical inference, more specifically, by alternating two steps: (1) estimating the SES parameters from a given alignment and (2) refining the alignment with the resulting estimates. The SES parameters are computed by maximum a posteriori (MAP) estimation (step 1), in analogy to the pairwise case. The alignment (step 2) is solved by linear integer programming. In order to test the robustness and reliability of the proposed N-variate SES method, it is first applied to synthetic data. We show that N-variate SES results in more reliable estimates than bivariate SES. Next, N-variate SES is applied to two problems in neuroscience: quantifying the firing reliability of Morris-Lecar neurons and detecting anomalies in EEG synchrony of patients with mild cognitive impairment. Those problems were also considered in Parts I and II, respectively. In both cases, the N-variate SES approach yields a more detailed analysis.
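The alternating scheme described above can be illustrated on a toy pairwise example. The sketch below is not the authors' algorithm (which uses MAP estimation and integer programming); it substitutes closed-form mean/variance estimates for step 1 and a greedy nearest-neighbor matching for step 2, purely to show the two-step iteration. The function names and the `max_dist` gating parameter are illustrative assumptions.

```python
# Illustrative sketch (NOT the published SES implementation) of alternating
# (1) parameter estimation from an alignment and (2) alignment refinement.

def estimate_params(x, y, pairs):
    """Step 1: estimate the delay and jitter variance from aligned pairs."""
    offsets = [y[j] - x[i] for i, j in pairs]
    delay = sum(offsets) / len(offsets)
    var = sum((o - delay) ** 2 for o in offsets) / len(offsets)
    return delay, var

def align(x, y, delay, max_dist):
    """Step 2: greedily pair each event in x with the nearest unmatched
    event in y after removing the estimated delay."""
    pairs, used = [], set()
    for i, xi in enumerate(x):
        best, best_d = None, max_dist
        for j, yj in enumerate(y):
            if j in used:
                continue
            d = abs(yj - xi - delay)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            pairs.append((i, best))
            used.add(best)
    return pairs

def ses_pairwise(x, y, max_dist=1.0, n_iter=5):
    """Alternate the two steps; return delay, jitter variance, and the
    fraction of nonaligned events."""
    pairs = align(x, y, 0.0, max_dist)
    for _ in range(n_iter):
        delay, var = estimate_params(x, y, pairs)
        pairs = align(x, y, delay, max_dist)
    frac_unaligned = 1 - 2 * len(pairs) / (len(x) + len(y))
    return delay, var, frac_unaligned
```

On two spike trains offset by 0.5 with one spurious event, the loop recovers the delay, near-zero jitter, and the unaligned fraction.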

2009 ◽  
Vol 21 (8) ◽  
pp. 2152-2202 ◽  
Author(s):  
J. Dauwels ◽  
F. Vialatte ◽  
T. Weber ◽  
A. Cichocki

We present a novel approach to quantify the statistical interdependence of two time series, referred to as stochastic event synchrony (SES). The first step is to extract “events” from the two given time series. The next step is to try to align events from one time series with events from the other. The better the alignment, the more similar the two time series are considered to be. More precisely, the similarity is quantified by the following parameters: time delay, variance of the timing jitter, fraction of noncoincident events, and average similarity of the aligned events. The pairwise alignment and SES parameters are determined by statistical inference. In particular, the SES parameters are computed by maximum a posteriori (MAP) estimation, and the pairwise alignment is obtained by applying the max-product algorithm. This letter deals with one-dimensional point processes; the extension to multidimensional point processes is considered in a companion letter in this issue. By analyzing surrogate data, we demonstrate that SES is able to quantify both timing precision and event reliability more robustly than classical measures can. As an illustration, neuronal spike data generated by the Morris-Lecar neuron model are considered.


2009 ◽  
Vol 21 (8) ◽  
pp. 2203-2268 ◽  
Author(s):  
J. Dauwels ◽  
F. Vialatte ◽  
T. Weber ◽  
T. Musha ◽  
A. Cichocki

Stochastic event synchrony is a technique to quantify the similarity of pairs of signals. First, events are extracted from the two given time series. Next, one tries to align events from one time series with events from the other. The better the alignment, the more similar the two time series are considered to be. In Part I, the companion letter in this issue, one-dimensional events are considered; this letter concerns multidimensional events. Although the basic idea is similar, the extension to multidimensional point processes involves a significantly more difficult combinatorial problem and therefore is nontrivial. Also in the multidimensional case, the problem of jointly computing the pairwise alignment and SES parameters is cast as a statistical inference problem. This problem is solved by coordinate descent, more specifically, by alternating the following two steps: (1) estimate the SES parameters from a given pairwise alignment; (2) with the resulting estimates, refine the pairwise alignment. The SES parameters are computed by maximum a posteriori (MAP) estimation (step 1), in analogy to the one-dimensional case. The pairwise alignment (step 2) can no longer be obtained through dynamic programming, since the state space becomes too large. Instead it is determined by applying the max-product algorithm on a cyclic graphical model. In order to test the robustness and reliability of the SES method, it is first applied to surrogate data. Next, it is applied to detect anomalies in EEG synchrony of mild cognitive impairment (MCI) patients. Numerical results suggest that SES is significantly more sensitive to perturbations in EEG synchrony than a large variety of classical synchrony measures.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Els Weinans ◽  
Rick Quax ◽  
Egbert H. van Nes ◽  
Ingrid A. van de Leemput

Various complex systems, such as the climate, ecosystems, and physical and mental health, can show large shifts in response to small changes in their environment. These ‘tipping points’ are notoriously hard to predict based on trends. However, in the past 20 years several indicators pointing to a loss of resilience have been developed. These indicators use fluctuations in time series to detect the critical slowing down that precedes a tipping point. Most of the existing indicators are based on models of one-dimensional systems. However, complex systems generally consist of multiple interacting entities. Moreover, because of technological developments and wearables, multivariate time series are becoming increasingly available in different fields of science. In order to apply the framework of resilience indicators to multivariate time series, various extensions have been proposed. Not all multivariate indicators have been tested on the same types of systems, so a systematic comparison between the methods is lacking. Here, we evaluate the performance of the different multivariate indicators of resilience loss in different scenarios. We show that no single method outperforms the others; which method is best to use depends on the type of scenario the system is subject to. We propose a set of guidelines to help future users choose the multivariate indicator of resilience best suited to their particular system.
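A minimal sketch of the kind of indicator being compared: lag-1 autocorrelation, a standard univariate signal of critical slowing down, extended to multivariate data by the simplest possible aggregation, averaging across variables. This averaging is one illustrative choice among the many extensions the abstract refers to, not a specific published method.

```python
# Sketch of a critical-slowing-down indicator and a naive multivariate
# extension (averaging across variables). Illustrative only.

def lag1_autocorr(x):
    """Lag-1 autocorrelation; it rises toward 1 as recovery from
    perturbations slows near a tipping point."""
    n = len(x)
    mu = sum(x) / n
    num = sum((x[t] - mu) * (x[t + 1] - mu) for t in range(n - 1))
    den = sum((v - mu) ** 2 for v in x)
    return num / den

def multivariate_indicator(series_list):
    """Average the univariate indicator over all observed variables."""
    acs = [lag1_autocorr(s) for s in series_list]
    return sum(acs) / len(acs)
```

More elaborate extensions in the literature instead track, e.g., the dominant eigenvalue of the covariance matrix, which is exactly why a systematic comparison across scenarios is needed.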


Author(s):  
Ruqiang Yan ◽  
Robert X. Gao ◽  
Kang B. Lee ◽  
Steven E. Fick

This paper presents a noise reduction technique for vibration signal analysis in rolling bearings, based on local geometric projection (LGP). LGP is a nonlinear filtering technique that reconstructs a one-dimensional time series in a high-dimensional phase space using time-delayed coordinates, based on the Takens embedding theorem. For each point in the phase space, its neighborhood (a local subspace of the whole phase space) is examined to identify the best subspace onto which the point should be orthogonally projected. Since the signal subspace is spanned by the most significant eigen-directions of the neighborhood, while the less significant ones define the noise subspace, noise can be reduced by projecting each point onto the signal subspace and converting the projected points back into a new one-dimensional time series. The improvement in signal-to-noise ratio enabled by LGP is first evaluated using a chaotic system and an analytically formulated synthetic signal. Analysis of bearing vibration signals is then carried out as a case study. The LGP-based technique is shown to be effective in reducing noise and enhancing the extraction of weak, defect-related features, as manifested by the multifractal spectrum of the signal.
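The first step named above, reconstruction in phase space via time-delayed coordinates, can be sketched in a few lines. This shows only the Takens embedding; the subsequent local eigen-decomposition and orthogonal projection are omitted, and the parameter names (`dim`, `tau`) are generic conventions, not taken from the paper.

```python
# Delay-coordinate (Takens) embedding: the reconstruction step of LGP.
# Each point in the reconstructed phase space is a vector of time-lagged
# samples of the original one-dimensional series.

def delay_embed(x, dim, tau):
    """Return delay vectors (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    span = (dim - 1) * tau
    return [tuple(x[t + k * tau] for k in range(dim))
            for t in range(len(x) - span)]
```

The filtering step would then, for each embedded point, find its nearest neighbors, diagonalize the local covariance, and keep only the dominant eigen-directions before mapping back to a scalar series.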


2019 ◽  
Vol 88 ◽  
pp. 506-517 ◽  
Author(s):  
Izaskun Oregi ◽  
Aritz Pérez ◽  
Javier Del Ser ◽  
Jose A. Lozano

Author(s):  
Shibnath Mukherjee ◽  
Aryya Gangopadhyay ◽  
Zhiyuan Chen

While data mining has been widely acclaimed as a technology that can bring potential benefits to organizations, such efforts may be negatively impacted by the possibility of discovering sensitive patterns, particularly in patient data. In this article the authors present an approach to identify the optimal set of transactions that, if sanitized, would hide sensitive patterns while minimizing both the accidental hiding of legitimate patterns and the damage done to the database. Their methodology allows the user to adjust the weights assigned to the benefit (the number of restrictive patterns hidden), the cost (the number of legitimate patterns hidden), and the damage to the database (the difference between the marginal frequencies of items in the original and sanitized databases). Most approaches to this problem found in the literature are purely heuristic, without a formal treatment of optimality. While integer linear programming (ILP) has been used previously in a few works as a formal optimization approach, the novelty of this method is its extremely low-complexity model in contrast to the others. The authors implement their methodology in C and C++ and run several experiments on synthetic data generated with the IBM synthetic data generator. The experiments show excellent results when compared to those in the literature.
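The optimization being described can be made concrete on a toy database. The sketch below brute-forces the choice instead of solving an ILP, and simply deletes the sensitive items from chosen transactions; it is an assumption-laden illustration of the objective (hide a sensitive itemset with least damage to legitimate ones), not the authors' formulation.

```python
# Toy brute-force illustration of pattern-hiding sanitization:
# choose transactions to sanitize so a sensitive itemset's support
# drops below min_sup, minimizing support loss for legitimate itemsets.

from itertools import combinations

def support(db, itemset):
    """Number of transactions containing all items of `itemset`."""
    return sum(1 for t in db if itemset <= t)

def optimal_sanitization(db, sensitive, legit, min_sup):
    """Return (damage, indices) for the least-damaging set of transactions
    from which removing `sensitive` items hides the sensitive itemset."""
    candidates = [i for i, t in enumerate(db) if sensitive <= t]
    excess = support(db, sensitive) - (min_sup - 1)  # pairs to break
    best = None
    for chosen in combinations(candidates, excess):
        new_db = [t - sensitive if i in chosen else t
                  for i, t in enumerate(db)]
        damage = sum(abs(support(db, l) - support(new_db, l)) for l in legit)
        if best is None or damage < best[0]:
            best = (damage, chosen)
    return best
```

An ILP formulation replaces this exponential search with binary decision variables per (transaction, item) and a linear objective, which is where the cost-complexity of the model matters.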


Author(s):  
Yuliya Tanasyuk ◽  
Petro Burdeinyi

This paper is devoted to the software development of a block cipher based on reversible one-dimensional cellular automata and the study of its statistical properties. The software implementation of the proposed encryption algorithm is performed in the C# programming language in Visual Studio 2017. The paper presents a specially designed approach for key generation. To ensure the desired cryptographic strength, the shared secret parameters can be adjusted to contain the information needed for creating substitution tables, defining reversible rules, and hiding the final data. For the first time, it is suggested to create substitution tables based on iterations of a cellular automaton initialized with the key data.
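The reversibility such a cipher relies on can be demonstrated with a second-order cellular automaton (the Fredkin construction), where each new state is the rule applied to the current configuration XORed with the previous one; any rule built this way is invertible, so iterating backward recovers the plaintext. This is a hedged toy in Python, assuming rule 90 and periodic boundaries, and is not the paper's C# implementation or its key schedule.

```python
# Second-order (reversible) elementary cellular automaton:
#   state[t+1][i] = rule(state[t] neighborhood at i) XOR state[t-1][i]

def step(prev, cur, rule=90):
    """One second-order CA step over bit lists, periodic boundaries."""
    n = len(cur)
    out = []
    for i in range(n):
        idx = (cur[(i - 1) % n] << 2) | (cur[i] << 1) | cur[(i + 1) % n]
        out.append(((rule >> idx) & 1) ^ prev[i])
    return out

def encrypt(block, key, rounds=8):
    """Run the CA forward; the key seeds the 'previous' state."""
    prev, cur = key, block
    for _ in range(rounds):
        prev, cur = cur, step(prev, cur)
    return prev, cur  # ciphertext is the final state pair

def decrypt(prev, cur, rounds=8):
    """Run the CA backward: state[t-1] = rule(state[t]) XOR state[t+1]."""
    for _ in range(rounds):
        prev, cur = step(cur, prev), prev
    return cur  # the original block
```

The XOR with the two-steps-back state is what makes every rule, even a non-invertible one like rule 90, yield a bijective update, which is the property the cipher's "reversible rules" exploit.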


2005 ◽  
Vol 97 (1) ◽  
pp. 309-320 ◽  
Author(s):  
Martin E. Arendasy ◽  
Andreas Hergovich ◽  
Markus Sommer ◽  
Bettina Bognar

The study at hand reports first results on the dimensionality and construct validity of a newly developed objective, video-based personality test that assesses the willingness to take risks in traffic situations. On the basis of the theory of risk homeostasis developed by Wilde, different traffic situations with varying amounts of objective danger were filmed. These mainly consisted of passing maneuvers, speed choice, and situations at intersections. Each traffic situation describes an action that should be carried out. The videos of the traffic situations are presented twice. Before the first presentation, a short written explanation of the preceding traffic situation and a situation-contingent reaction is provided. The respondents may obtain an overview of the given situations during the first presentation of each traffic situation. During the second presentation, the respondents are asked to indicate the point at which the action contingent on the described situation would become too dangerous to carry out. Latencies for items were recorded as a measure of the magnitude of the person's subjectively accepted willingness to take risks in the sense of Wilde's risk homeostasis theory. In a study of 243 people of different educational backgrounds and both sexes, the one-dimensionality of the test according to the latency model by Scheiblechner was investigated. Analysis indicated that the new measure assesses a one-dimensional latent personality trait that can be interpreted as the subjectively accepted amount of risk (target risk value). First indicators of the construct validity of the test are given by a significant correlation with the construct-related secondary scale Adventurousness of the Eysenck Personality Profiler, together with nonsignificant correlations with the two secondary scales not linked to the construct, Extroversion and Emotional Stability.

