Variational inference for infinite mixtures of sparse Gaussian processes through KL-correction

Author(s):  
T. N. A. Nguyen ◽  
A. Bouzerdoum ◽  
S. L. Phung
2021 ◽  
pp. 1-36
Author(s):  
Liwei Wang ◽  
Suraj Yerramilli ◽  
Akshay Iyer ◽  
Daniel Apley ◽  
Ping Zhu ◽  
...  

Abstract Scientific and engineering problems often require an inexpensive surrogate model to aid understanding and the search for promising designs. While Gaussian processes (GP) stand out as easy-to-use and interpretable learners in surrogate modeling, they have difficulties in accommodating big datasets, qualitative inputs, and multi-type responses obtained from different simulators, which has become a common challenge for a growing number of data-driven design applications. In this paper, we propose a GP model that utilizes latent variables and functions obtained through variational inference to address the aforementioned challenges simultaneously. The method is built upon the latent variable Gaussian process (LVGP) model, where qualitative factors are mapped into a continuous latent space to enable GP modeling of mixed-variable datasets. By extending variational inference to LVGP models, the large training dataset is replaced by a small set of inducing points to address the scalability issue. Output response vectors are represented by a linear combination of independent latent functions, forming a flexible kernel structure to handle multi-type responses. Comparative studies demonstrate that the proposed method scales well for large datasets with over 10^4 data points, while outperforming state-of-the-art machine learning methods without requiring much hyperparameter tuning. In addition, an interpretable latent space is obtained to draw insights into the effect of qualitative factors, such as those associated with "building blocks" of architectures and element choices in metamaterial and materials design. Our approach is demonstrated for machine learning of ternary oxide materials and topology optimization of a multiscale compliant mechanism with aperiodic microstructures and multiple materials.
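The inducing-point idea in the abstract above, replacing a large training set with a small variational summary, can be sketched in a few lines of NumPy. This is a minimal illustration of a Titsias-style sparse variational GP posterior, not the authors' LVGP method; the data, kernel settings, and inducing locations are all hypothetical.

```python
import numpy as np

def rbf(A, B, ls=1.0, var=1.0):
    # Squared-exponential kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))            # full training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # noisy observations
Z = np.linspace(-3, 3, 10)[:, None]               # 10 inducing points stand in for 200 data

noise = 0.1 ** 2
Kzz = rbf(Z, Z) + 1e-8 * np.eye(len(Z))
Kzx = rbf(Z, X)

# Optimal variational posterior over inducing outputs (Titsias 2009):
# q(u) has mean  m = (1/noise) * Kzz @ Sigma^{-1} @ Kzx @ y,
# with Sigma = Kzz + (1/noise) * Kzx @ Kzx^T.
Sigma = Kzz + Kzx @ Kzx.T / noise
mu_z = Kzz @ np.linalg.solve(Sigma, Kzx @ y) / noise

# Prediction at new points touches only the inducing summary, not the full data.
Xs = np.array([[0.0], [1.5]])
mean = rbf(Xs, Z) @ np.linalg.solve(Kzz, mu_z)
```

The key property is that both storage and prediction cost scale with the number of inducing points rather than with the dataset size, which is what makes the approach viable for the large designs of experiments the abstract targets.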


2020 ◽  
Vol 639 ◽  
pp. A138 ◽  
Author(s):  
R. H. Leike ◽  
M. Glatzle ◽  
T. A. Enßlin

Aims. Mapping the interstellar medium in 3D provides a wealth of insights into its inner workings. The Milky Way is the only galaxy for which detailed 3D mapping can be achieved in principle. In this paper, we reconstruct the dust density in and around the local super-bubble. Methods. The combined data from surveys such as Gaia, 2MASS, PANSTARRS, and ALLWISE provide the necessary information to make detailed maps of the interstellar medium in our surroundings. To this end, we used variational inference and Gaussian processes to model the dust extinction density, exploiting its intrinsic correlations. Results. We reconstructed a highly resolved dust map, showing the nearest dust clouds at a distance of up to 400 pc with a resolution of 1 pc. Conclusions. Our reconstruction provides insights into the structure of the interstellar medium. We compute summary statistics of the spectral index and the 1-point function of the logarithmic dust extinction density, which may constrain simulations of the interstellar medium that achieve a similar resolution.
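The core modeling move in this abstract, placing a correlated Gaussian process prior on the *logarithmic* dust extinction density so the reconstructed field is strictly positive, can be illustrated in one dimension. This is a toy sketch, not the authors' full 3D inference: the distances, kernel, correlation length, and stand-in density profile below are all hypothetical.

```python
import numpy as np

def sq_exp(a, b, corr_len=30.0):
    # Correlation kernel over distance (in pc); encodes the intrinsic
    # correlations of the log dust extinction density.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / corr_len**2)

rng = np.random.default_rng(1)
d_obs = rng.uniform(0, 400, size=40)          # distances (pc) with data, hypothetical
s_true = lambda d: -1.0 + np.sin(d / 60.0)    # stand-in log-density profile
noise = 0.1
y = s_true(d_obs) + noise * rng.normal(size=d_obs.size)

d_grid = np.linspace(0, 400, 401)             # grid at 1 pc resolution
K = sq_exp(d_obs, d_obs) + noise**2 * np.eye(d_obs.size)
mean_log = sq_exp(d_grid, d_obs) @ np.linalg.solve(K, y)  # GP posterior mean of log-density

# Exponentiating the Gaussian field yields a strictly positive density,
# the log-normal construction alluded to in the abstract.
rho = np.exp(mean_log)
```

Exact GP regression like this scales cubically with the number of data points; for survey-scale data, variational inference over the field, as the abstract describes, replaces the exact posterior with a tractable approximation.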


