A Training Set Subsampling Strategy for the Reduced Basis Method

2021 · Vol. 89 (3)
Author(s): Sridhar Chellappa, Lihong Feng, Peter Benner

Abstract
We present a subsampling strategy for the offline stage of the Reduced Basis Method. The approach aims to bring down the considerable offline costs associated with using a finely sampled training set. The proposed algorithm exploits the potential of the pivoted QR decomposition and the discrete empirical interpolation method (DEIM) to identify important parameter samples. It consists of two stages. In the first stage, we construct a low-fidelity approximation to the solution manifold over a fine training set. Then, for the available low-fidelity snapshots of the output variable, we apply the pivoted QR decomposition or the discrete empirical interpolation method to identify a set of sparse sampling locations in the parameter domain. These points reveal the structure of the parametric dependence of the output variable. The second stage proceeds with a subsampled training set containing a far smaller number of parameters than the initial training set. Different subsampling strategies inspired by recent variants of the empirical interpolation method are also considered. Tests on benchmark examples validate the new approach and show its potential to substantially speed up the offline stage of the Reduced Basis Method while generating reliable reduced-order models.
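The selection step described above lends itself to a compact illustration. The sketch below shows the two ingredients the abstract names applied to a matrix of low-fidelity output snapshots: column-pivoted QR, whose pivot order ranks parameter samples, and a textbook DEIM greedy applied to the dominant right singular vectors, which picks one sample per POD mode. All dimensions, names, and the random stand-in data are illustrative assumptions, not the paper's benchmarks or implementation.

```python
import numpy as np
from scipy.linalg import qr, svd

def deim_indices(U):
    """Textbook DEIM greedy: pick one interpolation index per basis column."""
    idx = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, U.shape[1]):
        # Interpolate column j at the indices chosen so far ...
        c = np.linalg.solve(U[np.ix_(idx, list(range(j)))], U[idx, j])
        # ... and place the next index where the residual is largest.
        r = U[:, j] - U[:, :j] @ c
        idx.append(int(np.argmax(np.abs(r))))
    return np.array(idx)

rng = np.random.default_rng(0)
n_out, n_train, n_sub = 200, 1000, 30   # illustrative sizes

# Low-fidelity output snapshots: column j belongs to parameter sample j
# of the fine training set (random placeholder data here).
Y = rng.standard_normal((n_out, n_train))

# Option 1: column-pivoted QR; the pivot vector orders parameter samples
# by how much new information each column contributes.
_, _, piv = qr(Y, mode='economic', pivoting=True)
qr_selected = np.sort(piv[:n_sub])

# Option 2: DEIM on the dominant right singular vectors; rows of Vt.T
# index parameter samples, so DEIM selects one sample per mode.
_, _, Vt = svd(Y, full_matrices=False)
deim_selected = np.sort(deim_indices(Vt.T[:, :n_sub]))

print("pivoted-QR subsample:", qr_selected)
print("DEIM subsample:      ", deim_selected)
```

Either index set then serves as the subsampled training set for the second stage; in this sketch the snapshots are random, so the two selections differ, whereas on structured parametric data both tend to concentrate samples where the output varies most.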

2014 · Vol. 36 (1) · pp. A168–A192
Author(s): Benjamin Peherstorfer, Daniel Butnaru, Karen Willcox, Hans-Joachim Bungartz
