Big Data Solutions to Interpreting Complex Systems in the Environment

Author(s):  
Hongmei Chi ◽  
Sharmini Pitter ◽  
Nan Li ◽  
Haiyan Tian
Keyword(s):  
Big Data

2016 ◽ 
Vol 12 (S325) ◽ 
pp. 341-344 ◽ 
Author(s):  
Sergi Blanco-Cuaresma ◽  
Emeline Bolmont

Abstract The astrophysics community uses different tools for computational tasks such as complex system simulations, radiative transfer calculations or big data analysis. Programming languages like Fortran, C or C++ are commonly present in these tools and, generally, the language choice was made based on the need for performance. However, this comes at a cost: safety. For instance, a common source of error is access to invalid memory regions, which produces random execution behavior and affects the scientific interpretation of the results. In 2015, Mozilla Research released the first stable version of a new programming language named Rust. Many features make this new language attractive to the scientific community: it is open source, and it guarantees memory safety while offering zero-cost abstractions. We explore the advantages and drawbacks of Rust for astrophysics by re-implementing the fundamental parts of Mercury-T, a Fortran code that simulates the dynamical and tidal evolution of multi-planet systems.
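
To make the memory-safety point concrete, here is a minimal Rust sketch (not Mercury-T itself; the bodies, units and integration detail are illustrative assumptions) of the pairwise Newtonian acceleration update at the core of any such N-body code. An out-of-range access here would abort with a deterministic panic rather than silently corrupting memory and the scientific results.

```rust
#[derive(Debug, Clone, Copy)]
struct Body {
    mass: f64,     // solar masses (illustrative units)
    pos: [f64; 3], // AU
    acc: [f64; 3], // AU / day^2
}

// Gaussian gravitational constant squared: AU^3 / (Msun * day^2).
const G: f64 = 2.959_122_082_855_911e-4;

// Pairwise Newtonian accelerations. The borrow checker guarantees that
// `bodies` cannot be freed or mutably aliased elsewhere while this runs,
// and every index is bounds-checked.
fn update_accelerations(bodies: &mut [Body]) {
    let n = bodies.len();
    let mut acc = vec![[0.0_f64; 3]; n];
    for i in 0..n {
        for j in 0..n {
            if i == j {
                continue;
            }
            let mut dr = [0.0_f64; 3];
            let mut r2 = 0.0_f64;
            for k in 0..3 {
                dr[k] = bodies[j].pos[k] - bodies[i].pos[k];
                r2 += dr[k] * dr[k];
            }
            let inv_r3 = 1.0 / (r2 * r2.sqrt());
            for k in 0..3 {
                acc[i][k] += G * bodies[j].mass * dr[k] * inv_r3;
            }
        }
    }
    for (b, a) in bodies.iter_mut().zip(acc) {
        b.acc = a;
    }
}

fn main() {
    let mut system = vec![
        Body { mass: 1.0, pos: [0.0; 3], acc: [0.0; 3] },           // star
        Body { mass: 3.0e-6, pos: [1.0, 0.0, 0.0], acc: [0.0; 3] }, // Earth-like planet
    ];
    update_accelerations(&mut system);
    println!("planet acceleration: {:?} AU/day^2", system[1].acc);
}
```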


2020 ◽  
Vol 13 (7) ◽  
pp. 153 ◽  
Author(s):  
Paulo Ferreira ◽  
Éder J.A.L. Pereira ◽  
Hernane B.B. Pereira

Big data has become a very frequent research topic, due to the increase in data availability. In this introductory paper, we make the link between the use of big data and Econophysics, a research field that uses large amounts of data and deals with complex systems. Different approaches, such as power laws and complex networks, are discussed as possible frameworks to analyze complex phenomena that could be studied using Econophysics and resorting to big data.
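
As a concrete instance of the power-law approach mentioned above, the Rust sketch below (with invented sample data) implements the standard maximum-likelihood exponent estimate alpha = 1 + n / sum(ln(x_i / x_min)), often used when testing economic quantities such as firm sizes or returns for power-law tails.

```rust
// Hill / Clauset-Shalizi-Newman MLE for a power-law exponent.
// The data and x_min below are hypothetical.

fn power_law_alpha(data: &[f64], x_min: f64) -> Option<f64> {
    // Keep only the tail observations x_i >= x_min.
    let tail: Vec<f64> = data.iter().copied().filter(|&x| x >= x_min).collect();
    let log_sum: f64 = tail.iter().map(|&x| (x / x_min).ln()).sum();
    if tail.len() < 2 || log_sum <= 0.0 {
        return None; // too few (or degenerate) tail observations
    }
    Some(1.0 + tail.len() as f64 / log_sum)
}

fn main() {
    // Hypothetical firm sizes in arbitrary monetary units.
    let firm_sizes = [1.2, 3.4, 2.1, 15.0, 7.7, 50.3, 4.4, 9.9, 120.5, 2.8];
    match power_law_alpha(&firm_sizes, 2.0) {
        Some(alpha) => println!("estimated power-law exponent: {alpha:.3}"),
        None => println!("not enough tail data"),
    }
}
```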


Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 297
Author(s):  
Haoyu Niu ◽  
YangQuan Chen ◽  
Bruce J. West

Fractional-order calculus is about the differentiation and integration of non-integer orders. Fractional calculus (FC) is based on fractional-order thinking (FOT) and has been shown to help us understand complex systems better, improve the processing of complex signals, enhance the control of complex systems, increase the performance of optimization, and even extend the potential for creativity. In this article, the authors discuss fractional dynamics, FOT and rich fractional stochastic models. First, the use of fractional dynamics in big data analytics for quantifying big data variability stemming from the generation of complex systems is justified. Second, we show why fractional dynamics is needed in machine learning and optimal randomness when asking, “is there a more optimal way to optimize?” Third, an optimal randomness case study for a stochastic configuration network (SCN) machine-learning method with heavy-tailed distributions is discussed. Finally, views on big data and (physics-informed) machine learning with fractional dynamics for future research are presented with concluding remarks.
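
For readers unfamiliar with how a non-integer-order derivative is actually computed, the following Rust sketch implements the Grünwald-Letnikov approximation, a standard discretization in fractional dynamics. The test signal and step size are illustrative assumptions, not code from the article.

```rust
// Grünwald-Letnikov approximation of D^alpha f at the last sample:
//   D^alpha f(t) ≈ h^(-alpha) * sum_k w_k * f(t - k*h),
// with weights w_0 = 1 and w_k = w_{k-1} * (k - 1 - alpha) / k,
// i.e. w_k = (-1)^k * binomial(alpha, k).
fn gl_fractional_derivative(f: &[f64], h: f64, alpha: f64) -> f64 {
    let mut w = 1.0_f64;
    let mut acc = 0.0_f64;
    // Walk backwards in time: k = 0 is the newest sample.
    for (k, &fk) in f.iter().rev().enumerate() {
        if k > 0 {
            w *= (k as f64 - 1.0 - alpha) / k as f64;
        }
        acc += w * fk;
    }
    acc / h.powf(alpha)
}

fn main() {
    // f(t) = t on a uniform grid; for alpha = 1 this recovers f'(t) ≈ 1.
    let h = 0.01;
    let f: Vec<f64> = (0..1000).map(|i| i as f64 * h).collect();
    println!("D^0.5 f ≈ {:.4}", gl_fractional_derivative(&f, h, 0.5));
    println!("D^1.0 f ≈ {:.4}", gl_fractional_derivative(&f, h, 1.0));
}
```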


DIALOGO ◽  
2021 ◽  
Vol 8 (1) ◽  
pp. 43-54
Author(s):  
Catalin Nutu

The paper presents different dynamics of a complex economic system, based on certain assumptions made within each of the models presented in the paper. These assumptions allow estimating and forecasting the evolution of the respective complex economic system. This estimation can be made in the “classical” way, that is, without the use of big data, in which case the results are more prone to errors, or with the use of big data records, which delivers a much more accurate estimation of the evolution within the complex system. Throughout the paper, the term “with sustainable growth” is used with the meaning of “ecological growth” and refers to durable economic systems where environmental issues are taken into consideration, whereas the term “without sustainable growth” or “conventional growth” refers to economic systems where environmental issues are not taken into account.
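
The paper's own models are not reproduced here, but the contrast it draws can be illustrated with a deliberately simple, hypothetical Rust sketch: unconstrained exponential growth as a stand-in for “conventional growth” versus logistic growth with an environmental carrying capacity as a stand-in for “growth with sustainability”. All rates and the capacity are invented.

```rust
// Two toy growth regimes (hypothetical parameters, forward-Euler steps):
//   conventional: dx/dt = r * x
//   sustainable:  dx/dt = r * x * (1 - x / K)   (logistic, capacity K)
fn main() {
    let r = 0.03;        // annual growth rate (hypothetical)
    let capacity = 2.0;  // environmental carrying capacity (hypothetical)
    let mut conventional = 1.0;
    let mut sustainable = 1.0;
    for year in 0..=50 {
        if year % 10 == 0 {
            println!(
                "year {year:2}: conventional = {conventional:.3}, sustainable = {sustainable:.3}"
            );
        }
        conventional += r * conventional;
        sustainable += r * sustainable * (1.0 - sustainable / capacity);
    }
}
```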


2015 ◽  
Vol 57 (4) ◽  
Author(s):  
Ingo Scholtes

Abstract Better understanding and controlling complex systems has become a grand challenge not only for computer science, but also for the natural and social sciences. Many of these systems have in common that they can be studied from a network perspective. Consequently, methods from network science have proven instrumental in their analysis. In this article, I introduce the macroscopic perspective that is at the heart of network science. Summarizing my recent research activities, I discuss how a combination of this perspective with Big Data methods can improve our understanding of complex systems.
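
A minimal example of this macroscopic perspective: instead of inspecting individual elements, one summarizes a system by statistics such as its degree distribution. The Rust sketch below computes one from a hypothetical edge list.

```rust
use std::collections::HashMap;

// Degree histogram of an undirected graph given as an edge list:
// for each degree k, how many nodes have exactly k neighbors?
fn degree_distribution(edges: &[(u32, u32)]) -> HashMap<usize, usize> {
    let mut degree: HashMap<u32, usize> = HashMap::new();
    for &(u, v) in edges {
        *degree.entry(u).or_insert(0) += 1;
        *degree.entry(v).or_insert(0) += 1;
    }
    let mut hist: HashMap<usize, usize> = HashMap::new();
    for &k in degree.values() {
        *hist.entry(k).or_insert(0) += 1;
    }
    hist
}

fn main() {
    // Hypothetical edge list.
    let edges = [(0, 1), (0, 2), (0, 3), (1, 2), (3, 4)];
    let mut hist: Vec<_> = degree_distribution(&edges).into_iter().collect();
    hist.sort();
    for (k, count) in hist {
        println!("{count} node(s) of degree {k}");
    }
}
```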


2021 ◽  
Vol 1201 (1) ◽  
pp. 012088
Author(s):  
T Bankole-Oye ◽  
I El-Thalji ◽  
J Zec

Abstract Large companies are investing heavily in digitalization to be more competitive and economically viable. Hence, physical assets and maintenance operations have been digitally transformed to transmit a high volume of data, e.g., condition monitoring data. Such high-volume data can be useful to optimize maintenance operations and minimize maintenance and replacement costs. A tool to optimize maintenance using condition monitoring data is the proportional hazards model (PHM). However, it is challenging to implement PHM for industrial complex systems that generate big data, so machine learning algorithms are needed to support the PHM method in handling such a high volume of data. Thus, the purpose of this paper is to explore how to support PHM with Principal Component Analysis (PCA) for the maintenance optimization of complex industrial systems. A case study of a hydraulic power unit was purposefully selected to apply and validate the proposed analytical approach. The results show that PCA-supported PHM optimizes and extends the preventive maintenance interval by 79.27%, which might lead to maintenance cost reductions. This model enables PHM to handle complex systems where big data is collected.
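
The abstract does not reproduce the fitted model, but the proportional hazards form it relies on is standard: h(t | z) = h0(t) * exp(gamma · z), where z would be the PCA-reduced condition-monitoring features. A minimal Rust sketch with a Weibull baseline follows; every parameter value is a hypothetical placeholder, not a result from the case study.

```rust
// Weibull baseline hazard: h0(t) = (beta / eta) * (t / eta)^(beta - 1).
fn baseline_hazard(t: f64, beta: f64, eta: f64) -> f64 {
    (beta / eta) * (t / eta).powf(beta - 1.0)
}

// Proportional hazards: baseline scaled by exp(gamma . z), where z is
// the covariate vector (here imagined as PCA scores of sensor features).
fn hazard(t: f64, beta: f64, eta: f64, gamma: &[f64], z: &[f64]) -> f64 {
    let link: f64 = gamma.iter().zip(z).map(|(g, x)| g * x).sum();
    baseline_hazard(t, beta, eta) * link.exp()
}

fn main() {
    let (beta, eta) = (2.0, 1000.0); // hypothetical Weibull shape / scale (hours)
    let gamma = [0.8, 0.3];          // hypothetical covariate weights
    let healthy = [0.1, 0.0];        // PCA scores of a healthy unit
    let degraded = [2.5, 1.2];       // PCA scores of a degrading unit
    for t in [200.0, 500.0, 900.0] {
        println!(
            "t = {t:5} h: h_healthy = {:.5}, h_degraded = {:.5}",
            hazard(t, beta, eta, &gamma, &healthy),
            hazard(t, beta, eta, &gamma, &degraded),
        );
    }
}
```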


2021 ◽  
Vol 30 (4) ◽  
pp. 2-5
Author(s):  
Young-Ho EOM

The data revolution of the 21st century provides us with a huge amount of urban data. This featured article briefly surveys how physicists identify quantitative patterns in such data, make sense of those patterns using simple models, and reveal the underlying mechanisms.
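
One classic pattern of this kind is urban scaling, Y ≈ Y0 · N^β, linking a city-wide output Y to population N; β is obtained by least squares in log-log space. The Rust sketch below fits it to invented city data (none of these numbers come from the article).

```rust
// Ordinary least-squares slope and intercept of y against x.
fn ols(x: &[f64], y: &[f64]) -> (f64, f64) {
    let n = x.len() as f64;
    let mx = x.iter().sum::<f64>() / n;
    let my = y.iter().sum::<f64>() / n;
    let sxy: f64 = x.iter().zip(y).map(|(a, b)| (a - mx) * (b - my)).sum();
    let sxx: f64 = x.iter().map(|a| (a - mx) * (a - mx)).sum();
    let slope = sxy / sxx;
    (slope, my - slope * mx)
}

fn main() {
    // Hypothetical (population, GDP) pairs.
    let cities = [
        (1.0e5, 3.2e9),
        (5.0e5, 2.1e10),
        (1.0e6, 4.8e10),
        (5.0e6, 3.0e11),
        (1.0e7, 6.9e11),
    ];
    // Fit ln Y = ln Y0 + beta * ln N.
    let log_n: Vec<f64> = cities.iter().map(|c| c.0.ln()).collect();
    let log_y: Vec<f64> = cities.iter().map(|c| c.1.ln()).collect();
    let (beta, log_y0) = ols(&log_n, &log_y);
    println!("beta ≈ {beta:.2}, Y0 ≈ {:.3e}", log_y0.exp());
}
```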


Author(s):  
Sauro Succi ◽  
Peter V. Coveney

For it is not the abundance of knowledge, but the interior feeling and taste of things, which is accustomed to satisfy the desire of the soul. (Saint Ignatius of Loyola)

We argue that the boldest claims of big data (BD) are in need of revision and toning-down, in view of a few basic lessons learned from the science of complex systems. We point out that, once the most extravagant claims of BD are properly discarded, a synergistic merging of BD with big theory offers considerable potential to spawn a new scientific paradigm capable of overcoming some of the major barriers confronted by the modern scientific method originating with Galileo. These obstacles are due to the presence of nonlinearity, non-locality and hyperdimensions, which one encounters frequently in multi-scale modelling of complex systems.

This article is part of the theme issue ‘Multiscale modelling, simulation and computing: from the desktop to the exascale’.

