Dynamic Complexity Measures: Definition and Calculation

Author(s):  
José Roberto C. Piqueira

This work is a generalization of the López-Ruiz, Mancini, and Calbet (LMC) and Shiner, Davison, and Landsberg (SDL) complexity measures, considering that the state of a system or process is represented by a dynamical variable during a certain time interval. As the two complexity measures are based on the calculation of informational entropy, an equivalent information source is defined and, as time passes, the individual information associated with the measured parameter is the seed to calculate instantaneous LMC and SDL measures. To show how the methodology works, an example with economic data is presented.

Complexity ◽  
2019 ◽  
Vol 2019 ◽  
pp. 1-8
Author(s):  
José Roberto C. Piqueira ◽  
Sérgio Henrique Vannucchi Leme de Mattos

This work is a generalization of the López-Ruiz, Mancini, and Calbet (LMC) and Shiner, Davison, and Landsberg (SDL) complexity measures, considering that the state of a system or process is represented by a continuous temporal series of a dynamical variable. As the two complexity measures are based on the calculation of informational entropy, an equivalent information source is defined by using partitions of the dynamical variable range. During the time intervals, the information associated with the measured dynamical variable is the seed to calculate instantaneous LMC and SDL measures. To show how the methodology works in generating indicators, two examples, one concerning meteorological data and the other concerning economic data, are presented and discussed.
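The methodology lends itself to a short illustration. The sketch below is a minimal, hypothetical Python implementation assuming the standard LMC and SDL definitions (normalized Shannon entropy H times disequilibrium D for LMC; H^α(1−H)^β for SDL) with a plain histogram partition of the variable's range as the equivalent information source; the random-walk series stands in for real data, and none of this is the authors' code.

```python
import numpy as np

def lmc_sdl(series, bins=10, alpha=1.0, beta=1.0):
    """LMC and SDL complexities for one window of a time series.

    Assumes the standard definitions:
        H = -sum(p * log p) / log(N)      (normalized entropy)
        D = sum((p - 1/N)**2)             (disequilibrium)
        C_LMC = H * D
        C_SDL = H**alpha * (1 - H)**beta
    where p are the cell frequencies of an N-cell partition of the range.
    """
    counts, _ = np.histogram(series, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                                    # convention: 0 * log 0 = 0
    h = -(p * np.log(p)).sum() / np.log(bins)
    # Disequilibrium over all N cells, including the empty ones.
    d = ((p - 1.0 / bins) ** 2).sum() + (bins - len(p)) * (1.0 / bins) ** 2
    return h * d, h ** alpha * (1.0 - h) ** beta

# Sliding windows over the series yield the instantaneous measures.
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(1000))            # stand-in dynamical variable
window = 200
for start in range(0, len(x) - window + 1, window):
    c_lmc, c_sdl = lmc_sdl(x[start:start + window])
    print(f"t0={start:4d}  LMC={c_lmc:.4f}  SDL={c_sdl:.4f}")
```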


2019 ◽  
Vol 70 (6) ◽  
pp. 2105-2107
Author(s):  
Gheorghita Popa ◽  
Olimpiu L. Karancsi ◽  
Maria Alexandra Preda ◽  
Marius Cristian Suta ◽  
Lavinia Stelea ◽  
...  

Our study aimed to determine pain levels and patient well-being associated with laser-based procedures in the treatment of patients diagnosed with uncontrolled glaucoma. The study group included 100 eyes of 100 patients diagnosed with glaucoma, 50 of them treated with micropulse transscleral laser cyclophotocoagulation and the other 50 treated with continuous transscleral laser cyclophotocoagulation. We used a visual analog scale to gather information from each patient. After analysing the individual information, the following results were obtained: the pain level for micropulse transscleral laser cyclophotocoagulation was 60.23 mm, signifying moderate pain, and the pain score for continuous transscleral laser cyclophotocoagulation was 76.34 mm, corresponding to moderate-intense pain. The pain level generated by minimally invasive laser procedures is discussed.


2017 ◽  
Vol 11 (4) ◽  
pp. 2018-2027 ◽  
Author(s):  
Jonathan Fischi ◽  
Roshanak Nilchiani ◽  
Jon Wade

Geophysics ◽  
1965 ◽  
Vol 30 (3) ◽  
pp. 363-368 ◽  
Author(s):  
T. W. Spencer

The formal solution for an axially symmetric radiation field in a multilayered, elastic system can be expanded in an infinite series. Each term in the series is associated with a particular raypath. It is shown that in the long‐time limit the individual response functions produced by a step input in particle velocity are given by polynomials in odd powers of the time. For rays which suffer m reflections, the degree of the polynomials is 2m+1. The total response is obtained by summing all rays which contribute in a specified time interval. When the rays are selected indiscriminately, the partial sum at an intermediate stage of computation may exceed the correct total sum by more orders of magnitude than the number of significant figures carried by the computer. A prescription is stated for arranging the rays into groups. Each group response function varies linearly in the long‐time limit and goes to zero when convolved with a physically realizable source function.
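The grouping prescription can be illustrated with a toy numerical sketch (Python; the coefficients are invented for illustration, whereas the real ones come from the layered-medium expansion). Each ray response is a polynomial in odd powers of t; summing rays indiscriminately lets the huge high-order terms swamp the small linear part, while summing the coefficient vectors of a group first cancels the high-order terms analytically.

```python
import numpy as np

# Hypothetical ray-response coefficients: each row holds the coefficients of
# t**1, t**3, t**5 for one ray.  The rays are constructed so that the cubic
# and quintic terms cancel within each consecutive pair (a stand-in for the
# grouping prescription in the abstract).
rays = np.array([
    [ 2.0,  1e12,  1e9],
    [ 3.0, -1e12, -1e9],
    [-1.0,  4e11,  2e8],
    [ 0.5, -4e11, -2e8],
])

t = 1e3  # long-time limit: large t makes the high-order terms dominate

def response(coeffs, t):
    """Evaluate a polynomial in odd powers of t: c1*t + c3*t**3 + c5*t**5."""
    return coeffs[0] * t + coeffs[1] * t**3 + coeffs[2] * t**5

# Indiscriminate summation: intermediate terms of order 1e24 cancel, and the
# linear part (4500) is far below their float64 round-off, so it is lost.
naive = sum(response(r, t) for r in rays)

# Grouped summation: add coefficient vectors within each pair first, so the
# cubic and quintic terms cancel exactly and only the linear part remains.
grouped = sum(response(rays[i] + rays[i + 1], t) for i in range(0, len(rays), 2))

print(naive, grouped)  # naive is typically 0.0 here; grouped recovers 4500.0
```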


1988 ◽  
Vol 32 (15) ◽  
pp. 985-989 ◽  
Author(s):  
T. Mihaly ◽  
P.A. Hancock ◽  
M. Vercruyssen ◽  
M. Rahimi

An experiment is reported which evaluated performance on a 10-sec time interval estimation task before, during, and after physical work on a cycle ergometer at intensities of 30 and 60% VO2max, scaled to the individual subject. Results from the eleven subjects tested indicate a significant increase in the variability of estimates during exercise compared to non-exercise phases. A similar trend was seen in the mean of estimates, where subjects significantly underestimated the target interval (10 seconds) during exercise. Subjects also performed more accurately with information feedback than without knowledge of results, but feedback was still not able to overcome the effects of exercise. As the experimental findings suggest, decreased estimation accuracy and increased variability can be expected during physical work; this is part of a body of evidence indicating that exercise and its severity have a substantive impact on perceptual and cognitive performance.


Author(s):  
Margarita Martínez-Díaz ◽  
Ignacio Pérez Pérez

Most algorithms that analyze or forecast road traffic rely on many inputs, but in practice, calculations are usually limited by the available data and measurement equipment. Generally, some of these inputs are substituted by raw or even inappropriate estimations, which in some cases conflict with the fundamentals of traffic flow theory. This paper refers to one common example of these bad practices. Many traffic management centres depend on the data provided by double loop detectors, which supply, among other measurements, vehicle speeds. The common treatment is to compute the arithmetic mean of these speeds over different aggregation periods (i.e., the time mean speeds). Time mean speed is not consistent with Edie’s generalized definitions of traffic variables, and therefore it is not the average speed that relates flow to density. This means that current practice begins with an error that can have negative effects in later studies and applications. The algorithm introduced in this paper makes it easy to estimate space mean speeds from the data provided by the loops. It is based on two key hypotheses: stationarity of traffic and a log-normal distribution of the individual speeds in each aggregation interval. It could also be used in the case of transient traffic as part of a data fusion methodology.
DOI: http://dx.doi.org/10.4995/CIT2016.2016.3208
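Under the paper's two hypotheses, a closed-form estimate follows: if spot speeds in an aggregation interval are log-normal with parameters μ and σ², the space mean speed (the harmonic mean of spot speeds) is exp(μ − σ²/2). The sketch below is a minimal illustration of that closed form, not the authors' published algorithm; the sample speeds are invented.

```python
import numpy as np

def space_mean_speed(spot_speeds):
    """Estimate space mean speed from loop-detector spot speeds.

    Assumes stationary traffic and log-normally distributed individual
    speeds within the aggregation interval; then the harmonic mean of
    spot speeds has the closed form exp(mu - sigma**2 / 2), with mu and
    sigma**2 the parameters of the fitted log-normal.
    """
    log_v = np.log(np.asarray(spot_speeds, dtype=float))
    mu, sigma2 = log_v.mean(), log_v.var()
    return np.exp(mu - sigma2 / 2.0)

# Example: spot speeds (km/h) from a double loop over one aggregation period.
speeds = [82.0, 95.0, 71.0, 88.0, 104.0, 77.0, 90.0]
v_t = np.mean(speeds)               # time mean speed (arithmetic mean)
v_s = space_mean_speed(speeds)      # space mean speed under the log-normal fit
print(f"time mean {v_t:.1f} km/h, space mean {v_s:.1f} km/h")  # v_s <= v_t
```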


Author(s):  
A. B. Sulin ◽  
A. A. Nikitin ◽  
T. V. Ryabova ◽  
S. S. Muraveinikov ◽  
...  

A method for controlling the flow characteristics of a ventilation system is considered, based on the principle of forming a predicted estimate of the air temperature and carbon dioxide concentration in a room from an analysis of the dynamics of changes in these parameters in the supply and exhaust ducts. A real-time predicted assessment of the expected microclimate parameters opens up the possibility of using control elements and algorithms for the ventilation and air-conditioning system that provide the required air quality with minimal energy consumption. The analysis calculates the probability of finding the measured parameter inside or outside the control zone after a specified time interval. The algorithm for actuating the control system's actuators on the temperature and carbon dioxide concentration channels is presented in the form of a block diagram. The decision-making logic for actuating the actuators is based on an analysis of the direction and intensity of changes in temperature and carbon dioxide concentration in the exhaust duct and of the temperature difference between the supply and exhaust ducts.
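A minimal sketch of the predictive step described above, under stated assumptions: the function name, thresholds, and the Gaussian-residual model are hypothetical, not the authors' implementation. It fits a linear trend to recent exhaust-duct CO2 readings (the direction and intensity of change), extrapolates over a horizon, and estimates the probability that the parameter leaves the control zone.

```python
import numpy as np
from scipy.stats import norm

def exceedance_probability(samples, dt, horizon, upper_limit):
    """Probability the parameter is above `upper_limit` after `horizon` seconds."""
    y = np.asarray(samples, dtype=float)
    t = np.arange(len(y)) * dt
    slope, intercept = np.polyfit(t, y, 1)              # linear trend fit
    resid_sd = np.std(y - (slope * t + intercept))      # scatter around the trend
    predicted = intercept + slope * (t[-1] + horizon)   # extrapolated value
    return norm.sf(upper_limit, loc=predicted, scale=max(resid_sd, 1e-9))

# Exhaust-duct CO2 readings (ppm), one per minute; limits are illustrative.
co2_ppm = [640, 655, 648, 670, 684, 701]
p = exceedance_probability(co2_ppm, dt=60.0, horizon=600.0, upper_limit=800.0)
print("increase supply airflow" if p > 0.5 else "hold current setting",
      f"(P(exceed) = {p:.2f})")
```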

