large complex system
Recently Published Documents


TOTAL DOCUMENTS

34
(FIVE YEARS 7)

H-INDEX

8
(FIVE YEARS 1)

2021 ◽  
Vol 2021 (12) ◽  
pp. 123301
Author(s):  
Pierre Mergny ◽  
Satya N Majumdar

Abstract We study the probability of stability of a large complex system of size $N$ within the framework of a generalized May model, which assumes a linear dynamics of each population size $n_i$ (with respect to its equilibrium value): $\frac{\mathrm{d}n_i}{\mathrm{d}t} = -a_i n_i - T \sum_j J_{ij} n_j$. The $a_i > 0$'s are the intrinsic decay rates, $J_{ij}$ is a real symmetric $(N \times N)$ Gaussian random matrix and $T$ measures the strength of the pairwise interaction between different species. Our goal is to study how inhomogeneities in the intrinsic damping rates $a_i$ affect the stability of this dynamical system. As the interaction strength $T$ increases, the system undergoes a phase transition from a stable phase to an unstable phase at a critical value $T = T_c$. We reinterpret the probability of stability in terms of the hitting time of the level $b = 0$ of an associated Dyson Brownian motion (DBM), starting at the initial positions $a_i$ and evolving in 'time' $T$. In the large $N \to \infty$ limit, using this DBM picture, we are able to completely characterize $T_c$ for an arbitrary density $\mu(a)$ of the $a_i$'s. For a specific flat configuration $a_i = 1 + \sigma \frac{i-1}{N}$, we obtain an explicit parametric solution for the limiting (as $N \to \infty$) spectral density for arbitrary $T$ and $\sigma$. For finite but large $N$, we also compute the large deviation properties of the probability of stability on the stable side $T < T_c$ using a Coulomb gas representation.
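Since $J$ is symmetric, the Jacobian $-\mathrm{diag}(a) - T J$ has real eigenvalues, and the linearized system is stable exactly when its largest eigenvalue is negative. A minimal Monte Carlo sketch of this criterion for the flat configuration of decay rates is given below; it is not code from the paper, and the Gaussian normalization $J_{ij} \sim \mathcal{N}(0, 1/N)$, the sample sizes and the parameter values are illustrative assumptions.

```python
# Minimal sketch of the stability criterion for the generalized May model
# (assumed GOE-like normalization: off-diagonal variance 1/N).
import numpy as np

def stability_probability(N=200, T=0.5, sigma=1.0, trials=100, seed=None):
    """Estimate P(stable) for dn_i/dt = -a_i n_i - T * sum_j J_ij n_j."""
    rng = np.random.default_rng(seed)
    a = 1.0 + sigma * np.arange(N) / N          # flat configuration a_i = 1 + sigma*(i-1)/N
    stable = 0
    for _ in range(trials):
        g = rng.normal(size=(N, N)) / np.sqrt(N)
        J = (g + g.T) / np.sqrt(2)              # real symmetric Gaussian random matrix
        M = -np.diag(a) - T * J                 # Jacobian of the linearized dynamics
        if np.linalg.eigvalsh(M).max() < 0:     # symmetric => stable iff lambda_max < 0
            stable += 1
    return stable / trials

# Sweeping T through the transition drives the estimate from near 1 to near 0.
print(stability_probability(T=0.3), stability_probability(T=1.0))
```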


2021 ◽  
Vol 2087 (1) ◽  
pp. 012083
Author(s):  
FengKai Lin ◽  
YanRong Li ◽  
YaWei Liu ◽  
LingXiao Chen

Abstract The power system is a large, complex system with wide geographic distribution and stringent real-time requirements, and its safe, stable and reliable operation is inseparable from the power automation system. This paper presents an in-depth study of the key issues of cross-platform software integration, built around the EPRI graphic center method in combination with the power data interface standard. The dispatch center graphics system and the dispatch center power graphics application method are discussed in detail. In addition, the paper describes an online calibration system for relay protection ratings based on the smart grid dispatching technology support system. The methods and sample systems presented here have been applied in practice in large-scale power grid dispatching centers, providing a reliable guarantee for the safe and stable operation of the power grid.


2020 ◽  
Author(s):  
Baruch Barzel ◽  
Chandrakala Meena ◽  
Chittaranjan Hens ◽  
Simi Haber ◽  
Stefano Boccaletti

Abstract Will a large complex system be stable? This question, first posed by May in 1972, captures a long-standing challenge, fueled by a seeming contradiction between theory and practice. While empirical reality answers with an astounding yes, mathematical analysis based on linear stability theory seems to suggest the contrary, hence the diversity-stability paradox. Here we settle this dichotomy by considering the interplay between topology and dynamics. We show that this interplay leads to the emergence of non-random patterns in the system's stability matrix, leading us to relinquish the prevailing random-matrix-based paradigm. Instead, we offer a new matrix ensemble that captures the dynamic stability of real-world systems. This ensemble helps us analytically identify the relevant control parameters that predict a system's stability, exposing three broad dynamic classes. In the asymptotically unstable class, diversity indeed leads to instability, à la May's paradox. In the asymptotically stable class, however, in which most real systems reside, diversity not only does not prohibit but in fact enhances dynamic stability. Finally, in the sensitively stable class, diversity plays no role, and stability is driven by the system's microscopic parameters. Together, our theory uncovers the naturally emerging rules of complex system stability, helping us reconcile a paradox that has stood unresolved for decades.
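For readers unfamiliar with the linear-stability argument the abstract alludes to, the sketch below reproduces May's classical 1972 random-matrix setup, not the new ensemble proposed in this paper: a community matrix $M = -I + A$ with random interactions is stable when all eigenvalue real parts are negative, and the fraction of stable draws falls as diversity $N$ grows. The connectance and interaction-strength values are illustrative assumptions.

```python
# Hedged illustration of May's 1972 criterion: A_ij is nonzero with probability C
# and drawn from N(0, sigma^2); stability requires all Re(eigenvalues) < 0.
import numpy as np

def fraction_stable(N, C=0.2, sigma=0.3, trials=50, seed=None):
    rng = np.random.default_rng(seed)
    stable = 0
    for _ in range(trials):
        A = rng.normal(0.0, sigma, size=(N, N)) * (rng.random((N, N)) < C)
        np.fill_diagonal(A, 0.0)
        M = -np.eye(N) + A                      # self-regulation plus random interactions
        if np.linalg.eigvals(M).real.max() < 0:
            stable += 1
    return stable / trials

# May's criterion predicts stability roughly when sigma * sqrt(N * C) < 1,
# so increasing diversity N eventually destabilises this random ensemble.
for N in (10, 30, 100):
    print(N, fraction_stable(N))
```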


2020 ◽  
Vol 8 (3) ◽  
pp. 201 ◽  
Author(s):  
Irena Jurdana ◽  
Artem Krylov ◽  
Julia Yamnenko

The purpose of this article is to propose a solution to the sea freight transport problem using machine learning algorithms. An important aspect of sea transport is the organization of freight. In particular, the maritime freight network is a large complex system whose intricate route maps and variety of ship traffic make it difficult to model. When investigating the characteristics of the sea freight system as a whole, it is generally advisable to use rough models that introduce significant approximations and leave a number of details out of account. Conversely, an exact model is used for a detailed study of isolated areas of the network, where it is the area itself that is explored in detail rather than the connections between areas. In doing so, one must be careful not to overlook the deviations of the model from the real network in the first case, and the connections between areas in the second. Building a model that accurately accounts for every detail makes the design process excessively complicated, so in practice the simulation always relies on a number of assumptions that approximate the real characteristics of ship movement, depending on the specific task. Four models are used to build an optimal cargo transportation system: a transnational cargo model; a model with a dedicated initial port of cargo departure; a model with dedicated initial ports of departure and a final port of cargo distribution; and a model of cargo transportation along a circular chain of ports. The route conditions are given by the traveling wave equation, and on the basis of these calculations the optimal route of the cargo ship is derived; the conditions affecting freight traffic include the number of ports, fuel quantity, port of cargo destination, and the distances between ports and intermediate ports of call. The scientific contribution lies in the fact that the human role is reduced to that of a system observer, which simplifies the freight calculations and helps reduce the cost of fuel and human resources.
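As an illustration of the route-selection step only, the sketch below treats ports as graph nodes with edges weighted by distance as a stand-in for fuel cost. The port names, distances and the use of Dijkstra's algorithm are illustrative assumptions, not the paper's traveling-wave formulation or its four cargo models.

```python
# Minimal route-selection sketch: cheapest path between ports on a weighted graph.
import heapq

def cheapest_route(edges, start, destination):
    """Dijkstra over a port graph; edges = {port: [(neighbour, distance), ...]}."""
    best, prev = {start: 0.0}, {}
    queue = [(0.0, start)]
    while queue:
        cost, port = heapq.heappop(queue)
        if port == destination:
            break
        if cost > best.get(port, float("inf")):
            continue                             # stale queue entry
        for nxt, dist in edges.get(port, []):
            new_cost = cost + dist
            if new_cost < best.get(nxt, float("inf")):
                best[nxt], prev[nxt] = new_cost, port
                heapq.heappush(queue, (new_cost, nxt))
    path, node = [destination], destination      # reconstruct the route
    while node != start:
        node = prev[node]
        path.append(node)
    return best[destination], path[::-1]

# Illustrative placeholder ports and distances, not data from the paper.
ports = {"Rijeka": [("Trieste", 60), ("Venice", 200)],
         "Trieste": [("Venice", 80)],
         "Venice": []}
print(cheapest_route(ports, "Rijeka", "Venice"))  # (140.0, ['Rijeka', 'Trieste', 'Venice'])
```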


2019 ◽  
Author(s):  
Sarah I. Allec ◽  
Yijing Sun ◽  
Jianan Sun ◽  
Chia-En A. Chang ◽  
Bryan Wong

We introduce a new heterogeneous CPU+GPU-enhanced DFTB approach for the routine and efficient simulation of large chemical and biological systems. Compared to homogeneous computing with conventional CPUs, heterogeneous computing approaches deliver substantial performance gains with only a modest increase in power consumption, both of which are essential to upcoming exascale computing initiatives. We show that DFTB-based molecular dynamics is a natural candidate for heterogeneous computing since the computational bottleneck in these simulations is the diagonalization of the Hamiltonian matrix, which is performed several times during a single molecular dynamics trajectory. To thoroughly test and understand the performance of our heterogeneous CPU+GPU approach, we examine a variety of algorithmic implementations, benchmarks of different hardware configurations, and applications of this methodology to several large chemical and biological systems. Finally, to demonstrate the capability of our implementation, we conclude with a large-scale DFTB MD simulation of explicitly solvated HIV protease (3,974 atoms in total) as a proof-of-concept example of an extremely large and complex system; to the best of our knowledge, this is the first time that an entire explicitly solvated protein has been treated at a quantum-based MD level of detail.
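To make the diagonalization-offload idea concrete, the sketch below pushes the dense symmetric eigensolve to a GPU when one is available and falls back to the CPU otherwise. CuPy is used here only as a generic GPU stand-in and is not necessarily the library behind the authors' implementation; the matrix size is an arbitrary placeholder.

```python
# Hedged sketch of offloading the O(N^3) diagonalization bottleneck to a GPU.
import numpy as np

try:
    import cupy as cp
    HAS_GPU = True
except ImportError:
    HAS_GPU = False

def diagonalize(H):
    """Eigendecomposition of a dense symmetric Hamiltonian-like matrix."""
    if HAS_GPU:
        eps, C = cp.linalg.eigh(cp.asarray(H))   # GPU eigensolver
        return cp.asnumpy(eps), cp.asnumpy(C)
    return np.linalg.eigh(H)                     # CPU fallback via LAPACK

# Random symmetric matrix standing in for the DFTB Hamiltonian at one MD step.
N = 2000
A = np.random.rand(N, N)
H = (A + A.T) / 2
eps, C = diagonalize(H)
print(eps[:3])
```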


2019 ◽  
Vol 301 ◽  
pp. 00007
Author(s):  
Joseph T. Foley ◽  
Lindy Puik ◽  
Erik Puik ◽  
Joseph Smith ◽  
David S. Cochran

Axiomatic Design and Complexity theory are most often applied to highly complex technological systems, which provide educators with many engineering examples and case studies. Axiomatic Design is equally applicable outside these areas, yet few such examples exist, and as a result students often have trouble appreciating the breadth and impact of Axiomatic Design as a problem-solving approach. One large complex system that is often overlooked is the kitchen. In this paper, we present several food-preparation tasks that are inherently complex: cooking a turkey, baking an apple pie, reverse-engineering a recipe, and designing ecologically minded food packaging, while also discussing the environmental impact of prepared-food packaging approaches. The authors believe such examples demonstrate Axiomatic Design's applicability in a new domain that is approachable to a wide audience.


Author(s):  
Simanti Bhattacharya ◽  
Angshuman Bagchi

Cellular Automata (CAs) are spatially arranged systems composed of units, or cells, each carrying its own information and all subject to the same uniform conditions. The system evolves step by step in time, each cell updating on the basis of local information according to a specific rule, and ultimately reaches a configuration that reflects the global scenario. A CA thus efficiently mimics a large complex system segmented into local units, which keeps the calculations simple. The local changes are integrated in a synchronized manner to give the final outcome, with robustness arising from the simple, uniform functions applied throughout. This has allowed CAs to be applied successfully in many fields of scientific research. This book chapter describes the fundamental properties and structures of CAs, then advances to simple rule construction and the path from local rules to global configurations, and finally discusses the broad applicability of CAs in different branches of science, particularly in biological research.
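As a concrete illustration of how a simple local rule produces a global configuration, the sketch below runs a one-dimensional elementary CA. Rule 30 is chosen only because it is a familiar textbook example; it is not specific to this chapter.

```python
# Minimal 1-D elementary CA: every cell updates synchronously from its
# (left, self, right) neighbourhood via the bits of the rule number.
import numpy as np

def step(cells, rule=30):
    """One synchronous update with periodic boundaries."""
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    neighbourhood = 4 * left + 2 * cells + right      # encode (l, c, r) as 0..7
    return (rule >> neighbourhood) & 1                # look up the rule's output bit

cells = np.zeros(64, dtype=int)
cells[32] = 1                                         # single seed cell
for _ in range(20):                                   # local rule -> global pattern
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```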

