Design and Analysis of Computer Experiments

Author(s):  
Xinwei Deng ◽  
Ying Hung ◽  
C. Devon Lin

Computer experiments refer to the study of complex systems using mathematical models and computer simulations. Computer experiments have become a popular tool for studying complex systems in science and engineering, and their design and analysis have received broad attention in recent decades. In this chapter, we present several widely used statistical approaches for the design and analysis of computer experiments, including space-filling designs and Gaussian process modeling. Special emphasis is given to recently developed design and modeling techniques for computer experiments with quantitative and qualitative factors.
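
As a purely illustrative sketch of the two building blocks named above, the Python snippet below draws a space-filling Latin hypercube design and fits a Gaussian process surrogate to a cheap stand-in simulator; the test function, design size, and kernel settings are assumptions made for demonstration, not choices taken from the chapter.

```python
# Minimal sketch: Latin hypercube (space-filling) design + Gaussian process
# surrogate for a stand-in "computer model". All settings are illustrative.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulator(x):
    # Placeholder for an expensive computer model with two quantitative inputs.
    return np.sin(2 * np.pi * x[:, 0]) + 0.5 * np.cos(np.pi * x[:, 1])

# Space-filling design: 20 runs in [0, 1]^2 via Latin hypercube sampling.
sampler = qmc.LatinHypercube(d=2, seed=0)
X_train = sampler.random(n=20)
y_train = simulator(X_train)

# Gaussian process surrogate with a squared-exponential (RBF) kernel.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

# Predictions at new inputs come with an uncertainty estimate.
X_new = sampler.random(n=5)
mean, std = gp.predict(X_new, return_std=True)
print(np.c_[mean, std])
```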

2013 ◽  
Vol 10 (2) ◽  
pp. 115-124
Author(s):  
Philip L. Martin

Japan and the United States, the world's largest economies for most of the past half century, have very different immigration policies. Japan is the G7 economy most closed to immigrants, while the United States is the large economy most open to immigrants. Both Japan and the United States are debating how immigrants do and can contribute to the competitiveness of their economies in the 21st century. The papers in this special issue review the employment and impacts of immigrants in some of the key sectors of the Japanese and US economies, including agriculture, health care, science and engineering, and construction and manufacturing. For example, in Japanese agriculture migrant trainees are a fixed cost to farmers during the three years they are in Japan, while US farmers, who hire mostly unauthorized migrants, hire and lay off workers as needed, making labour a variable cost.


2020 ◽  
Vol 4 (8) ◽  
Author(s):  
Shion Takeno ◽  
Yuhki Tsukada ◽  
Hitoshi Fukuoka ◽  
Toshiyuki Koyama ◽  
Motoki Shiga ◽  
...  

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Els Weinans ◽  
Rick Quax ◽  
Egbert H. van Nes ◽  
Ingrid A. van de Leemput

Various complex systems, such as the climate, ecosystems, and physical and mental health, can show large shifts in response to small changes in their environment. These ‘tipping points’ are notoriously hard to predict from trends. However, over the past 20 years several indicators pointing to a loss of resilience have been developed. These indicators use fluctuations in time series to detect the critical slowing down that precedes a tipping point. Most of the existing indicators are based on models of one-dimensional systems, yet complex systems generally consist of multiple interacting entities. Moreover, because of technological developments and wearables, multivariate time series are becoming increasingly available in different fields of science. To apply the framework of resilience indicators to multivariate time series, various extensions have been proposed. Not all multivariate indicators have been tested on the same types of systems, so a systematic comparison between the methods is lacking. Here, we evaluate the performance of the different multivariate indicators of resilience loss in different scenarios. We show that no single method outperforms the others; rather, which method is best to use depends on the type of scenario the system is subject to. We propose a set of guidelines to help future users choose the multivariate indicator of resilience best suited to their particular system.
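
The abstract does not list the specific indicators compared, but the classical univariate indicators of critical slowing down are rising variance and lag-1 autocorrelation, and one commonly used multivariate extension tracks the dominant eigenvalue of the covariance matrix. The sketch below, with arbitrary toy data and window size, shows how such indicators can be computed in a sliding window; it is a generic illustration, not a reimplementation of the indicators evaluated in the paper.

```python
# Illustrative sliding-window resilience indicators: lag-1 autocorrelation and
# variance per variable, plus the dominant covariance eigenvalue as one
# possible multivariate extension. Data and window size are placeholders.
import numpy as np

def lag1_autocorrelation(x):
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def sliding_indicators(series, window=100):
    """series: array of shape (time, variables)."""
    ac1, var, dom_eig = [], [], []
    for t in range(window, series.shape[0]):
        w = series[t - window:t]
        ac1.append(np.mean([lag1_autocorrelation(w[:, j]) for j in range(w.shape[1])]))
        var.append(w.var(axis=0).mean())
        dom_eig.append(np.linalg.eigvalsh(np.cov(w, rowvar=False)).max())
    return np.array(ac1), np.array(var), np.array(dom_eig)

# Toy multivariate time series; rising indicators would suggest slowing down.
rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 3)).cumsum(axis=0) * 0.01 + rng.normal(size=(1000, 3))
ac1, var, dom_eig = sliding_indicators(data, window=200)
print(ac1[-1], var[-1], dom_eig[-1])
```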


Mathematics ◽  
2021 ◽  
Vol 9 (9) ◽  
pp. 1022
Author(s):  
Gianluca D’Addese ◽  
Martina Casari ◽  
Roberto Serra ◽  
Marco Villani

In many complex systems one observes the formation of medium-level structures whose detection would allow a high-level description of the dynamical organization of the system and thus a better understanding of it. We previously developed a powerful method to achieve this goal, which, however, carries a heavy computational cost in several real-world cases. In this work we introduce a modified version of our approach that reduces the computational burden. The design of the new algorithm allowed the realization of an original suite of methods able to work simultaneously at the micro level (the binary relationships among individual variables) and at the meso level (the identification of dynamically relevant groups). We apply this suite to a particularly relevant case, in which we look for the dynamic organization of a gene regulatory network subject to knock-outs. The approach combines information theory, graph analysis, and an iterated sieving algorithm in order to describe rather complex situations. Its application allowed us to derive some general observations on the dynamical organization of gene regulatory networks and to observe interesting characteristics in an experimental case.
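
As a simplified illustration of the micro-level ingredient mentioned above (information-theoretic relationships between pairs of binary variables), the sketch below computes pairwise mutual information on simulated binary states and keeps the strongest pairs as edges of a relevance graph. It is a generic sketch with placeholder data and an arbitrary threshold, not the authors' iterated sieving algorithm.

```python
# Generic micro-level sketch: pairwise mutual information between binary
# variables, thresholded to build a relevance graph. Placeholder data only.
import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """Mutual information (in bits) between two binary sequences."""
    joint = np.zeros((2, 2))
    for a, b in zip(x, y):
        joint[a, b] += 1
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    mi = 0.0
    for i in range(2):
        for j in range(2):
            if joint[i, j] > 0:
                mi += joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
    return mi

rng = np.random.default_rng(1)
states = rng.integers(0, 2, size=(500, 6))               # 500 observed states, 6 binary variables
states[:, 3] = states[:, 2] ^ (rng.random(500) < 0.1)    # variable 3 closely tracks variable 2

# Relevance graph: keep pairs whose mutual information exceeds a threshold.
edges = [(i, j) for i, j in combinations(range(states.shape[1]), 2)
         if mutual_information(states[:, i], states[:, j]) > 0.3]
print(edges)
```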


Acta Numerica ◽  
2021 ◽  
Vol 30 ◽  
pp. 765-851
Author(s):  
Wei Wang ◽  
Lei Zhang ◽  
Pingwen Zhang

Liquid crystals are a type of soft matter that is intermediate between crystalline solids and isotropic fluids. The study of liquid crystals has made tremendous progress over the past four decades; it is of great importance for fundamental scientific research and has widespread applications in industry. In this paper we review the mathematical models of liquid crystals and their connections, and survey developments in numerical methods for finding the rich configurations of liquid crystals.
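
For orientation, one of the classical continuum models in this area is the Oseen–Frank elastic energy of the nematic director field; the standard form below is included only as a representative example of the kind of model surveyed, not as a summary of the paper's scope.

```latex
% Oseen--Frank elastic energy of a unit-length director field n,
% with splay (K_1), twist (K_2) and bend (K_3) elastic constants.
F[\mathbf{n}] = \frac{1}{2}\int_{\Omega}\Big(
      K_1\,(\nabla\cdot\mathbf{n})^2
    + K_2\,(\mathbf{n}\cdot\nabla\times\mathbf{n})^2
    + K_3\,\lvert \mathbf{n}\times(\nabla\times\mathbf{n})\rvert^2
  \Big)\,\mathrm{d}x ,
\qquad \lvert\mathbf{n}\rvert = 1 .
```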


2009 ◽  
Vol 12 (3) ◽  
pp. 241-250 ◽  
Author(s):  
Petra Claeys ◽  
Ann van Griensven ◽  
Lorenzo Benedetti ◽  
Bernard De Baets ◽  
Peter A. Vanrolleghem

Mathematical models provide insight into numerous biological, physical and chemical systems. They can be used in process design, optimisation, control and decision support, as acknowledged in many different fields of scientific research. Mathematical models do not always yield reliable results, however, and uncertainty should be taken into account. At present it is possible to identify some of the factors contributing to uncertainty, and awareness of the necessity of uncertainty assessment is rising. In the fields of Environmental Modelling and Computational Fluid Dynamics, for instance, terminology related to uncertainty exists and is generally accepted. However, the uncertainty due to the choice of the numerical solver, and of the settings used to compute the solution of a model, has not received much attention in the past. A motivating example of the existence and effect of numerical uncertainty is provided and clearly shows that it can no longer be ignored. This paper introduces new terminology to support communication about uncertainty caused by numerical solvers, so that scientists become attentive to it.
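
To make the notion of numerical uncertainty concrete, the sketch below integrates the same model with different solver methods and tolerances and prints the resulting end states; the Lorenz system and the specific settings are assumptions chosen for demonstration and are not taken from the paper.

```python
# Illustrative sketch of numerical uncertainty: the same mathematical model
# integrated with different solvers and tolerances yields different results.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

y0 = [1.0, 1.0, 1.0]
t_span = (0.0, 40.0)

# Same model, different numerical solver settings.
for method, rtol in [("RK45", 1e-3), ("RK45", 1e-9), ("LSODA", 1e-9)]:
    sol = solve_ivp(lorenz, t_span, y0, method=method, rtol=rtol, atol=1e-12)
    print(f"{method:6s} rtol={rtol:g}  final state = {sol.y[:, -1]}")
```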


2012 ◽  
Vol 26 (29) ◽  
pp. 1230014 ◽  
Author(s):  
CHRISTOPHER C. BERNIDO ◽  
M. VICTORIA CARPIO-BERNIDO

The white noise calculus originated by T. Hida is presented as a powerful tool for investigating physical and social systems. Combined with Feynman's sum-over-all-histories approach, we parameterize paths with memory of the past and evaluate the corresponding probability density function. We discuss applications of this approach to problems in complex systems and biophysics. Examples in quantum mechanics with boundaries, where Markovian paths are considered, are also given.
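
One standard way to encode memory of the past in such path parameterizations is to drive the path by white noise filtered through a memory kernel; the form below is shown only as an illustrative assumption, and the paper's exact parameterization may differ.

```latex
% Illustrative path with memory: white noise \omega(s) filtered through a
% memory kernel f(t-s); f \equiv 1 recovers ordinary (Markovian) Brownian paths.
x(t) = x_0 + \int_0^{t} f(t-s)\,\omega(s)\,\mathrm{d}s .
```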

