Dynamic Complexity Measures for Use in Complexity-Based System Design

2017 ◽  
Vol 11 (4) ◽  
pp. 2018-2027 ◽  
Author(s):  
Jonathan Fischi ◽  
Roshanak Nilchiani ◽  
Jon Wade

2015 ◽ 
Vol 76 (11) ◽  
Author(s):  
Siti Mariam Abdul Rahman ◽  
Clark Borst ◽  
Max Mulder ◽  
Rene van Paassen

Developing more advanced human-machine systems for future Air Traffic Management (ATM) concepts requires a deep understanding of what constitutes operator workload and how taskload and sector complexity affect it. Many efforts have been made in the past to measure and/or predict operator workload using sector complexity. However, most sector complexity metrics that include sector design are calculated according to a set of rules and subjective weightings, rendering them sector-dependent. This research compares the Solution Space Diagram (SSD) method with a widely accepted complexity metric: Dynamic Density (DD). In essence, the SSD method used in this research observes an aircraft's restrictions and opportunities for resolving traffic conflicts in both the speed and heading dimensions. It is hypothesized that the more area covered on the solution space, that is, the fewer options the controller has to resolve conflicts, the more difficult the task and the higher the workload experienced by the controller. To compare sector complexity measures in terms of their transferability in capturing dynamic complexity across different sectors, a human-in-the-loop experiment using two distinct sectors was designed and conducted. The experiments revealed that the SSD metric has a higher correlation with the controllers' workload ratings than the number of aircraft and the unweighted NASA DD metric. Although linear regression analysis improved the correlation between the workload ratings and the weighted DD metric relative to the SSD metric, the DD metric proved to be more sensitive to changes in sector layout than the SSD metric. This result indicates that the SSD metric is better able to capture controller workload than the DD metric when tuning for a specific sector layout is not feasible.
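The hypothesis above suggests a simple way to approximate the SSD "area coverage" numerically: sample candidate speed/heading commands for the controlled aircraft and count the fraction that would lead to a loss of separation with surrounding traffic. The sketch below only illustrates that idea; the separation minimum, look-ahead time, velocity limits, and function names are assumptions, not the authors' implementation.

```python
import numpy as np

SEP_MIN_NM = 5.0     # assumed horizontal separation minimum (nm)
LOOKAHEAD_S = 300.0  # assumed conflict look-ahead window (s)

def ssd_coverage(own_pos, traffic, v_range=(180.0, 280.0), n_speed=50, n_heading=180):
    """Fraction of the sampled speed/heading space that leads to a conflict.

    own_pos : (x, y) of the controlled aircraft in nm
    traffic : list of (position, velocity) tuples; positions in nm, velocities in nm/s
    """
    own_pos = np.asarray(own_pos, dtype=float)
    speeds_kts = np.linspace(*v_range, n_speed)
    headings = np.radians(np.linspace(0.0, 360.0, n_heading, endpoint=False))
    conflicts = 0
    for v_kts in speeds_kts:
        v = v_kts / 3600.0                               # kts -> nm/s
        for h in headings:
            own_vel = v * np.array([np.sin(h), np.cos(h)])   # heading measured from north
            if any(_loses_separation(own_pos, own_vel, p, u) for p, u in traffic):
                conflicts += 1
    return conflicts / (n_speed * n_heading)

def _loses_separation(p1, v1, p2, v2):
    """True if two linearly extrapolated tracks come within SEP_MIN_NM
    of each other inside the look-ahead window."""
    dp = np.asarray(p2, dtype=float) - p1
    dv = np.asarray(v2, dtype=float) - v1
    # time of closest point of approach, clipped to the look-ahead window
    t_cpa = 0.0 if np.dot(dv, dv) < 1e-12 else -np.dot(dp, dv) / np.dot(dv, dv)
    t_cpa = np.clip(t_cpa, 0.0, LOOKAHEAD_S)
    return np.linalg.norm(dp + dv * t_cpa) < SEP_MIN_NM
```

In such a sketch, the returned fraction plays the role of the SSD coverage metric: values near 1 mean almost no conflict-free speed/heading combinations remain, which the hypothesis associates with high controller workload.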


Author(s):  
José Roberto C. Piqueira

This work is a generalization of the Lopez-Ruiz, Mancini and Calbet (LMC) and the Shiner, Davison and Landsberg (SDL) complexity measures, considering that the state of a system or process is represented by a dynamical variable over a certain time interval. As the two complexity measures are based on the calculation of informational entropy, an equivalent information source is defined and, as time passes, the individual information associated with the measured parameter is the seed for calculating instantaneous LMC and SDL measures. To show how the methodology works, an example with economic data is presented.
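Both measures have standard closed forms built on the Shannon entropy of a discrete distribution: LMC complexity is the product of the normalized entropy and a "disequilibrium" term, while SDL complexity combines order and disorder as Delta^alpha * (1 - Delta)^beta. The sketch below illustrates an instantaneous, sliding-window estimate over a time series such as an economic indicator; the window length, bin count, and histogram-based probability estimate are illustrative assumptions, not the author's procedure.

```python
import numpy as np

def lmc_sdl(p, alpha=1.0, beta=1.0):
    """Instantaneous LMC and SDL complexities of a discrete distribution p.

    LMC: C = H_norm * D, with disequilibrium D = sum((p_i - 1/N)^2).
    SDL: Gamma = Delta^alpha * (1 - Delta)^beta, with Delta the normalized entropy.
    """
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    n = p.size
    nz = p[p > 0]
    h = -np.sum(nz * np.log(nz)) / np.log(n)     # normalized Shannon entropy in [0, 1]
    d = np.sum((p - 1.0 / n) ** 2)               # disequilibrium (distance from uniformity)
    return h * d, h ** alpha * (1.0 - h) ** beta

def windowed_complexity(series, window=100, bins=8):
    """Track the two measures over time using a sliding histogram of recent values."""
    series = np.asarray(series, dtype=float)
    out = []
    for t in range(window, series.size):
        hist, _ = np.histogram(series[t - window:t], bins=bins)
        out.append(lmc_sdl(hist + 1e-12))        # small offset avoids empty bins
    return np.array(out)                         # shape (T, 2): columns are LMC, SDL
```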


2019 ◽  
Vol 34 (4) ◽  
pp. 663-674 ◽  
Author(s):  
Valluvan Rangasamy ◽  
Teresa S. Henriques ◽  
Pooja A. Mathur ◽  
Roger B. Davis ◽  
Murray A. Mittleman ◽  
...  

2010 ◽  
Vol 24 (2) ◽  
pp. 131-135 ◽  
Author(s):  
Włodzimierz Klonowski ◽  
Pawel Stepien ◽  
Robert Stepien

Over 20 years ago, Watt and Hameroff (1987) suggested that consciousness may be described as a manifestation of deterministic chaos in the brain/mind. To analyze EEG-signal complexity, we used Higuchi's fractal dimension in the time domain and symbolic analysis methods. Our analyses of EEG signals under anesthesia, during physiological sleep, and during epileptic seizures lead to a conclusion similar to that of Watt and Hameroff: brain activity, measured by the complexity of the EEG signal, diminishes (becomes less chaotic) when consciousness is being "switched off". So, consciousness may be described as a manifestation of deterministic chaos in the brain/mind.
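Higuchi's fractal dimension has a standard time-domain formulation: for each scale k, the mean length of the k-decimated versions of the signal is computed, and the dimension is the slope of log L(k) against log(1/k). The sketch below is a minimal rendering of that algorithm; the choice of k_max and the least-squares fit are generic defaults, not the authors' code.

```python
import numpy as np

def higuchi_fd(signal, k_max=8):
    """Estimate Higuchi's fractal dimension of a 1-D signal in the time domain."""
    x = np.asarray(signal, dtype=float)
    N = x.size
    log_k, log_L = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):                        # k decimated series, offsets m = 0..k-1
            idx = np.arange(m, N, k)
            n_intervals = idx.size - 1
            if n_intervals < 1:
                continue
            # normalized curve length of the decimated series (Higuchi, 1988)
            lm = np.abs(np.diff(x[idx])).sum() * (N - 1) / (n_intervals * k) / k
            lengths.append(lm)
        log_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(lengths)))
    # the fractal dimension is the slope of log L(k) versus log(1/k)
    slope, _ = np.polyfit(log_k, log_L, 1)
    return slope
```

Applied epoch by epoch, lower estimated dimensions for EEG recorded under anesthesia or during seizures would correspond to the reduced complexity reported above.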


1993 ◽  
Vol 38 (1) ◽  
pp. 101-102 ◽ 
Author(s):  
Charles G. Halcomb
