Statistical Complexity Analysis of Spatiotemporal Dynamics

Author(s):  
José M. Angulo ◽  
Francisco J. Esquivel
Entropy ◽  
2013 ◽  
Vol 15 (12) ◽  
pp. 4084-4104 ◽  
Author(s):  
Moyocoyani Molina-Espíritu ◽  
Rodolfo Esquivel ◽  
Juan Angulo ◽  
Jesús Dehesa

2015 ◽  
Vol 14 (04) ◽  
pp. 1550040 ◽  
Author(s):  
Qingju Fan ◽  
Dan Li

In this study, we investigate the subtle temporal dynamics of the California 1999–2000 spot price series using permutation min-entropy (PME) and the complexity-entropy causality plane (CECP). The recently introduced PME captures the dynamical transitions of the price series and discriminates its temporal correlations. Moreover, utilizing the CECP, we provide a refined classification of the monthly price dynamics and gain insight into the stochastic nature of the price series. The results reveal that the spot price signal presents diverse temporal correlations and exhibits more stochastic behavior during periods of crisis.
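For concreteness, permutation entropies are built from the relative frequencies of ordinal patterns in the series. The following is a minimal sketch of the normalized permutation (Shannon) entropy and permutation min-entropy, not the authors' implementation; parameter defaults are illustrative:

```python
from collections import Counter
from math import factorial, log

def permutation_entropies(series, order=3, delay=1):
    """Normalized permutation (Shannon) entropy and permutation
    min-entropy of a scalar series, from ordinal-pattern frequencies."""
    counts = Counter()
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = [series[i + j * delay] for j in range(order)]
        # ordinal pattern: the argsort of the window values
        counts[tuple(sorted(range(order), key=window.__getitem__))] += 1
    probs = [c / n for c in counts.values()]
    norm = log(factorial(order))          # log of the pattern count, order!
    h_shannon = -sum(p * log(p) for p in probs) / norm
    h_min = -log(max(probs)) / norm       # min-entropy: most probable pattern
    return h_shannon, h_min
```

On a monotone series every window maps to the same ordinal pattern, so both entropies vanish; an i.i.d. random series drives both toward 1, which is why the min-entropy variant is sensitive to temporal correlations.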


2014 ◽  
Vol 529 ◽  
pp. 675-678
Author(s):  
Zheng Xia Zhang ◽  
Si Qiu Xu ◽  
Er Ning Zhou ◽  
Xiao Lin Huang ◽  
Jun Wang

This article adopts the multiscale Jensen-Shannon divergence (JSD) method for EEG complexity analysis. The study found that this method can distinguish EEG time series acquired under three different states (eyes closed, counting, and idling), showing that the three states differ significantly. For each of the three states, we compared the statistical complexity of the original EEG time series with that of its shuffled surrogate data, and found substantial nonlinear structure in the EEG signals. These results demonstrate that the multiscale JSD algorithm can be used to analyze attention-related EEG signals, and that the multiscale JSD statistical complexity can serve as a measure of brain function, with potential application to auxiliary clinical brain function evaluation.
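As a rough illustration, JSD-based statistical complexity is commonly defined as the product of a normalized Shannon entropy and the Jensen-Shannon divergence between the observed distribution and the uniform one. This is a sketch under that standard definition (the multiscale variant applies it per coarse-grained scale), not the authors' code:

```python
from math import log

def shannon(p):
    """Shannon entropy in nats, skipping zero-probability bins."""
    return -sum(x * log(x) for x in p if x > 0)

def jsd_statistical_complexity(p):
    """Statistical complexity C = Q_J[P, P_e] * H[P]/log(N), where Q_J is
    the Jensen-Shannon divergence between P and the uniform distribution
    P_e, normalized so that Q_J lies in [0, 1]."""
    n = len(p)
    pe = [1.0 / n] * n
    mix = [(a + b) / 2 for a, b in zip(p, pe)]
    jsd = shannon(mix) - (shannon(p) + shannon(pe)) / 2
    # normalization constant: the maximum JSD, reached for a degenerate P
    q0 = -2.0 / ((n + 1) / n * log(n + 1) - 2 * log(2 * n) + log(n))
    return q0 * jsd * shannon(p) / log(n)
```

Both the uniform distribution and a fully concentrated one give zero complexity; structured intermediate distributions score highest, which is what makes the quantifier useful for separating EEG states.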


Entropy ◽  
2020 ◽  
Vol 22 (1) ◽  
pp. 105
Author(s):  
Jorge M. Silva ◽  
Eduardo Pinho ◽  
Sérgio Matos ◽  
Diogo Pratas

Sources that generate symbolic sequences with an algorithmic nature may differ in statistical complexity because they create structures that follow algorithmic schemes, rather than generating symbols from a probabilistic function assuming independence. In the case of Turing machines, this means that machines with the same algorithmic complexity can create tapes with different statistical complexity. In this paper, we use a compression-based approach to measure the global and local statistical complexity of specific Turing machine tapes with the same number of states and alphabet size. Both measures are estimated using the best-order Markov model. For the global measure, we use the Normalized Compression (NC), while, for the local measures, we define and use normal and dynamic complexity profiles to quantify and localize regions of lower and higher statistical complexity. We assessed the validity of our methodology on synthetic and real genomic data, showing that it is tolerant to increasing rates of edits and block permutations. Regarding the analysis of the tapes, we localize patterns of higher statistical complexity in two regions, for different numbers of machine states. We show that these patterns are generated by a decrease of the tape’s amplitude, given the setting of small rule cycles. Additionally, we performed a comparison with a measure that uses both algorithmic and statistical approaches (BDM) for the analysis of the tapes. Naturally, BDM is efficient given the algorithmic nature of the tapes. However, for a higher number of states, BDM is progressively approximated by our methodology. Finally, we provide a simple algorithm to increase the statistical complexity of a Turing machine tape while retaining the same algorithmic complexity. We supply a publicly available implementation of the algorithm in C++ under the GPLv3 license. All results can be reproduced in full with the scripts provided at the repository.
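As an illustration of the global measure, Normalized Compression divides the compressed size in bits by the sequence length times the per-symbol information. The sketch below substitutes zlib for the best-order Markov compressor used in the paper, so the numbers are only indicative:

```python
import zlib
from math import log2

def normalized_compression(x: bytes) -> float:
    """NC(x) = C(x) / (|x| * log2(|A|)), with C(x) the compressed size in
    bits and |A| the alphabet size; zlib stands in here for the paper's
    Markov-model compressor."""
    a = len(set(x))
    if a < 2:
        return 0.0        # a constant sequence carries no information
    c_bits = 8 * len(zlib.compress(x, 9))
    return c_bits / (len(x) * log2(a))
```

Highly regular tapes compress well and sit near 0; incompressible ones approach (or, due to header overhead, slightly exceed) 1, which is what makes NC usable as a global statistical-complexity score.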


2014 ◽  
Vol 884-885 ◽  
pp. 512-515
Author(s):  
Zheng Xia Zhang ◽  
Si Qiu Xu ◽  
Er Ning Zhou ◽  
Xiao Lin Huang ◽  
Jun Wang

This article adopts the Jensen-Shannon divergence (JSD) method for complexity analysis of alpha-wave EEG, quantifying the degree of coupling between EEG time series acquired under three different states (eyes closed, counting, and idling). The algorithm is used to calculate the statistical complexity of the alpha-wave EEG signals, followed by a t-test. The results show significant differences in the coupling between EEG time series for the eyes-closed versus idling, eyes-closed versus counting, and counting versus idling state pairs. Thus, the JSD algorithm can be used to analyze attention-related EEG signals, and statistical complexity can serve as a measure of brain function, with potential application to auxiliary clinical brain function evaluation.


2003 ◽  
Vol 311 (2-3) ◽  
pp. 180-191 ◽  
Author(s):  
A.M. Kowalski ◽  
M.T. Martin ◽  
A. Plastino ◽  
A.N. Proto ◽  
O.A. Rosso

2014 ◽  
Vol 574 ◽  
pp. 723-727
Author(s):  
Zheng Xia Zhang ◽  
Si Qiu Xu ◽  
Er Ning Zhou ◽  
Xiao Lin Huang ◽  
Jun Wang

This article adopts the multiscale Jensen-Shannon divergence (JSD) method for EEG complexity analysis. The study found that this method can distinguish beta-wave EEG time series acquired under three different states (eyes closed, counting, and idling), showing that the three states differ significantly. For each of the three states, we compared the statistical complexity of the original beta-wave EEG time series with that of its shuffled surrogate data, finding substantial nonlinear structure in the beta-wave EEG signals. These results demonstrate that the multiscale JSD algorithm can be used to analyze attention-related EEG signals, and that statistical complexity can serve as a measure of brain function, with potential application to auxiliary clinical brain function evaluation.


2016 ◽  
Vol 540 ◽  
pp. 1136-1145 ◽  
Author(s):  
Tatijana Stosic ◽  
Luciano Telesca ◽  
Diego Vicente de Souza Ferreira ◽  
Borko Stosic

Entropy ◽  
2019 ◽  
Vol 21 (12) ◽  
pp. 1220 ◽  
Author(s):  
Fernando Henrique Antunes de Araujo ◽  
Lucian Bejan ◽  
Osvaldo A. Rosso ◽  
Tatijana Stosic

Agricultural commodities are considered perhaps the most important commodities, as any abrupt increase in food prices has serious consequences for food security and welfare, especially in developing countries. In this work, we analyze the predictability of Brazilian agricultural commodity prices during the period after the 2007/2008 food crisis. We use the information-theory-based complexity-entropy causality plane (CECP) method, which has been shown to be successful in the analysis of market efficiency and predictability. By estimating the information quantifiers permutation entropy and statistical complexity, we associate each commodity with a position in the CECP and compare their efficiency (lack of predictability) using the deviation from a random process. The coffee market shows the highest efficiency (lowest predictability), while the pork market shows the lowest efficiency (highest predictability). By analyzing the temporal evolution of commodities in the complexity-entropy causality plane, we observe that during the analyzed period the efficiency of the cotton, rice, and cattle markets increases; the soybean market shows a decrease in efficiency until 2012, followed by an increase in efficiency (lower predictability); while most commodities (8 out of 12) exhibit relatively stable efficiency, indicating increased market integration in the post-crisis period.
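The efficiency comparison can be phrased as a distance in the causality plane: a fully random process sits at (H, C) = (1, 0), and markets are ranked by how far their (entropy, complexity) pair deviates from that point. A toy sketch with made-up coordinates (the real quantifiers come from ordinal-pattern statistics of the price series; these values are not from the paper):

```python
from math import hypot

def inefficiency(h, c):
    """Distance of a market's CECP position (H, C) from the ideal
    random point (1, 0); a larger distance means lower efficiency."""
    return hypot(h - 1.0, c - 0.0)

# hypothetical CECP positions for illustration only
markets = {"coffee": (0.98, 0.03), "soybeans": (0.93, 0.10), "pork": (0.85, 0.17)}

# rank commodities from most efficient (closest to random) to least
ranking = sorted(markets, key=lambda m: inefficiency(*markets[m]))
```

Tracking this distance over sliding windows of the price series is one way to reproduce the kind of temporal-evolution analysis described above.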

