Non-intrusive data-driven ROM framework for hemodynamics problems

Author(s):  
M. Girfoglio, L. Scandurra, F. Ballarin, G. Infantino, F. Nicolo, ...

Abstract
Reduced order modeling (ROM) techniques are numerical methods that approximate the solution of a parametric partial differential equation (PDE) by properly combining high-fidelity solutions of the problem obtained for several configurations, i.e. for several properly chosen values of the physical/geometrical parameters characterizing the problem. Starting from a database of high-fidelity solutions related to certain values of the parameters, we apply proper orthogonal decomposition with interpolation (PODI) and then reconstruct the variables of interest for new values of the parameters, i.e. values different from those included in the database. Furthermore, we present a preliminary web application through which one can run the ROM in a very user-friendly way, without needing expertise in numerical analysis or scientific computing. The case study chosen to test the efficiency of our algorithm is the aortic blood flow pattern in the presence of a left ventricular assist device (LVAD) when varying the pump flow rate.
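The PODI pipeline described above (a POD basis extracted from snapshots, then interpolation of modal coefficients at a new parameter value) can be sketched in a few lines of NumPy. The snapshot data, parameter values and mode count below are synthetic placeholders, not the paper's hemodynamics data:

```python
# Minimal PODI sketch on synthetic data (all sizes illustrative).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, np.pi, 50)                 # spatial grid (50 dofs)
params = np.array([1.0, 2.0, 3.0, 4.0])       # e.g. pump flow rates
snapshots = np.column_stack(
    [p * np.sin(x) + 0.1 * rng.standard_normal(50) for p in params]
)                                             # 50 dofs x 4 snapshots

# POD basis via thin SVD of the snapshot matrix
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :2]                              # retain 2 modes

# Modal coefficients of each high-fidelity snapshot
coeffs = basis.T @ snapshots                  # 2 x 4

# Interpolate each coefficient in the parameter, then reconstruct
p_new = 2.5
c_new = np.array([np.interp(p_new, params, c) for c in coeffs])
u_new = basis @ c_new                         # ROM prediction at p_new
print(u_new.shape)                            # (50,)
```

Here the interpolation is piecewise linear in a single parameter; in the paper's setting the pump flow rate plays the role of the parameter, and no new high-fidelity solve is needed at p_new.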

2019, Vol 9 (1)
Author(s):  
Roohollah Noori, Fuqiang Tian, Guangheng Ni, Rabin Bhattarai, Farhad Hooshyaripor, ...

Abstract
This study presents a novel tool, ThSSim, for simulation of thermal stratification (ThS) in reservoirs. ThSSim is a simple and flexible reduced-order model based on basis functions (RMBF) that combines CE-QUAL-W2 (W2) and proper orthogonal decomposition (POD). In a case study, it was used to simulate water temperature in the Karkheh Reservoir (KR), Iran, for the period 2019–2035. ThSSim consists of two space- and time-dependent components that add predictive ability to the RMBF, a major refinement that extends its practical applications. Water temperature simulations by the W2 model at three-hour time intervals for the KR were used as input data to the POD model to develop ThSSim. To add predictive ability to ThSSim, and considering that the space-dependent components are not a function of time, we extrapolated the first three time-dependent components to September 30, 2035. We checked the predictive ability of ThSSim against water temperature profiles measured during eight sampling campaigns. We then applied ThSSim to simulate water temperature in the KR for 2019–2035. Simulated water temperature values matched well with those measured and with those obtained by W2. ThSSim results showed an increasing trend for surface water temperature during the simulation period, with a reverse trend observed for water temperature in the bottom layers for three seasons (spring, summer and autumn). The results also indicated decreasing and increasing trends in the onset and breakdown of thermal stability, respectively, so that the duration of ThS increased from 278 days in 2019 to 293 days in 2035. ThSSim is thus useful for reservoir temperature simulations. Moreover, the approach used to develop ThSSim is widely applicable to other fields of science and engineering.
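The two-component structure described above (time-independent spatial modes plus extrapolated time coefficients) can be illustrated with a toy field. The data and the linear trend below are invented for the sketch; ThSSim itself works on W2 temperature output:

```python
# Hedged sketch of the ThSSim idea: POD splits a space-time field into
# fixed spatial modes and time coefficients; only the time coefficients
# are extrapolated forward. Synthetic data, not Karkheh Reservoir data.
import numpy as np

x = np.linspace(0, 1, 40)                    # depth coordinate
t = np.arange(100)                           # time steps (e.g. 3-hourly)
field = np.outer(np.sin(np.pi * x), 10 + 0.05 * t)   # warming trend

U, s, Vt = np.linalg.svd(field, full_matrices=False)
modes = U[:, :1]                             # space-dependent component
a = s[:1, None] * Vt[:1]                     # time-dependent component

# Extrapolate the leading time coefficient with a linear fit
coef = np.polyfit(t, a[0], deg=1)
t_future = np.arange(100, 120)
a_future = np.polyval(coef, t_future)

forecast = modes @ a_future[None, :]         # predicted future field
print(forecast.shape)                        # (40, 20)
```

Because the spatial modes are frozen, prediction reduces to extrapolating a handful of scalar time series, which is what makes the approach cheap.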


2010, Vol 132 (6)
Author(s):  
Arun P. Raghupathy, Urmila Ghia, Karman Ghia, William Maltz

This technical note presents an introduction to boundary-condition-independent reduced-order modeling of complex electronic components using the proper orthogonal decomposition (POD)-Galerkin approach. The current work focuses on how the POD methodology can be used along with the finite volume method to generate reduced-order models that are independent of their boundary conditions. The proposed methodology is demonstrated for the transient 1D heat equation, and preliminary results are presented.
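A minimal sketch of the POD-Galerkin idea for the transient 1D heat equation is shown below, with explicit-Euler finite differences standing in for the note's finite-volume discretization; the grid size, time step and mode count are illustrative:

```python
# POD-Galerkin sketch for u_t = alpha * u_xx with Dirichlet BCs
# (finite differences stand in for the finite-volume solver).
import numpy as np

n, dt, alpha, steps = 50, 1e-4, 1.0, 200
x = np.linspace(0, 1, n + 2)[1:-1]           # interior nodes, u = 0 at walls
dx = x[1] - x[0]
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / dx**2  # Dirichlet Laplacian

# Full-order solve (explicit Euler), collecting snapshots
u = np.sin(np.pi * x)
snaps = [u.copy()]
for _ in range(steps):
    u = u + dt * alpha * (A @ u)
    snaps.append(u.copy())
S = np.array(snaps).T

# POD basis and Galerkin-projected reduced operator
Phi = np.linalg.svd(S, full_matrices=False)[0][:, :3]
Ar = Phi.T @ A @ Phi                          # 3x3 reduced Laplacian

# Reduced-order time stepping reproduces the full trajectory cheaply
a = Phi.T @ S[:, 0]
for _ in range(steps):
    a = a + dt * alpha * (Ar @ a)
u_rom = Phi @ a
print(np.max(np.abs(u_rom - S[:, -1])))       # tiny reconstruction error
```

The reduced system evolves only 3 coefficients instead of 50 unknowns; boundary-condition independence, the note's actual focus, requires a richer basis than this single-trajectory sketch provides.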


Author(s):  
Nathan Rolander, Jeffrey Rambo, Yogendra Joshi, Farrokh Mistree

Thermal management difficulties in data centers, caused by the rapidly increasing power densities of modern computational equipment, are compounded by the less frequent upgrading of the facilities themselves, creating a lifecycle mismatch. This paper utilizes Proper Orthogonal Decomposition (POD) for reduced-order modeling of turbulent convection, integrated with the compromise Decision Support Problem (DSP) using robust design principles. In the illustrative case study considered, thermally efficient cabinet configurations that are insensitive to variations in operating conditions are determined. Results of applying this approach to an enclosed cabinet example show that the resulting thermally efficient configurations are capable of dissipating up to a 50% greater heat load using the same cooling infrastructure.


Author(s):  
Eivind Fonn, Adil Rasheed, Mandar Tabib, Trond Kvamsdal

High-fidelity simulations of flow can be quite demanding, because they involve up to O(10^6–10^9) degrees of freedom and several hours (or even days) of computational time, even on powerful parallel hardware architectures. Thus, high-fidelity techniques can become prohibitive when we expect them to deal quickly and efficiently with the repetitive solution of partial differential equations. One set of partial differential equations that we encounter on a regular basis is the Navier–Stokes equations, which are used to simulate flow around complex geometries like sub-sea structures. To address the issues associated with computational efficiency, the field of Reduced Order Modelling is evolving fast. In this paper we investigate Proper Orthogonal Decomposition as a potential method for constructing reduced bases for Reduced Order Models. In the case of flows around cylindrical bodies we found that only a few modes were sufficient to represent the dominant flow structures and the energies associated with them, making POD an attractive candidate for basis construction.
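The observation that a few modes carry most of the energy can be checked directly from the singular values of a snapshot matrix. The wake-like data below is synthetic (two travelling components plus weak noise), not a Navier–Stokes solution:

```python
# Modal energy content via singular values of a snapshot matrix.
import numpy as np

x = np.linspace(0, 2 * np.pi, 64)
t = np.linspace(0, 2 * np.pi, 100)
snaps = (np.outer(np.sin(x), np.cos(t))                   # dominant "shedding" mode
         + 0.3 * np.outer(np.sin(2 * x), np.sin(2 * t))   # weaker harmonic
         + 0.01 * np.random.default_rng(1).standard_normal((64, 100)))

s = np.linalg.svd(snaps, compute_uv=False)
energy = np.cumsum(s**2) / np.sum(s**2)       # cumulative modal energy
n_modes = int(np.searchsorted(energy, 0.99)) + 1
print(f"modes for 99% of the energy: {n_modes}")
```

For periodic wake flows the singular-value spectrum typically decays fast in exactly this way, which is what makes POD bases so compact.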


PLoS ONE, 2021, Vol 16 (12), pp. e0262145
Author(s):  
Olatunji Johnson, Claudio Fronterre, Peter J. Diggle, Benjamin Amoah, Emanuele Giorgi

User-friendly interfaces have been increasingly used to facilitate the learning of advanced statistical methodology, especially for students with only minimal statistical training. In this paper, we illustrate the use of MBGapp for teaching geostatistical analysis to population health scientists. Using a case-study on Loa loa infections, we show how MBGapp can be used to teach the different stages of a geostatistical analysis in a more interactive fashion. For wider accessibility and usability, MBGapp is available as an R package and as a Shiny web-application that can be freely accessed on any web browser. In addition to MBGapp, we also present an auxiliary Shiny app, called VariagramApp, that can be used to aid the teaching of Gaussian processes in one and two dimensions using simulations.
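As a rough sketch of what a variogram-teaching app visualises, the empirical semivariogram (half the mean squared difference of observations, binned by separation distance) can be computed as follows; the 1-D data and bin edges below are invented for illustration and are not MBGapp's implementation:

```python
# Empirical semivariogram on synthetic 1-D geostatistical data.
import numpy as np

rng = np.random.default_rng(2)
coords = rng.uniform(0, 10, 200)              # sampling locations
values = np.sin(coords) + 0.2 * rng.standard_normal(200)

# Pairwise distances and half squared differences (upper triangle only)
d = np.abs(coords[:, None] - coords[None, :])
g = 0.5 * (values[:, None] - values[None, :]) ** 2
iu = np.triu_indices(200, k=1)
d, g = d[iu], g[iu]

# Average within distance bins -> empirical semivariogram
bins = np.linspace(0, 5, 11)
idx = np.digitize(d, bins)
semivar = np.array([g[idx == i].mean() for i in range(1, len(bins))])
print(np.round(semivar, 3))
```

The resulting curve rises with distance until the spatial correlation dies out, which is the behaviour such apps let students explore interactively.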


2021, Vol 22 (1)
Author(s):  
S. Olaechea-Lázaro, I. García-Santisteban, J. R. Pineda, I. Badiola, S. Alonso, ...

Abstract
Background: Quantitative reverse transcription PCR (qRT-PCR) is currently the gold standard for SARS-CoV-2 detection, and it is also used for the detection of other viruses. Manual data analysis of a small number of qRT-PCR plates per day is a relatively simple task, but automated, integrative strategies are needed if a laboratory is dealing with hundreds of plates per day, as has been the case during the COVID-19 pandemic.
Results: Here we present shinyCurves, a free, online Shiny-based software tool to analyze qRT-PCR amplification data from multi-plate and multi-platform formats. Our Shiny application does not require any programming experience and is able to call samples Positive, Negative or Undetermined for viral infection according to a number of user-defined settings, apart from providing a complete set of melting and amplification curve plots for the visual inspection of results.
Conclusions: shinyCurves is a flexible, integrative and user-friendly software tool that speeds up the analysis of massive qRT-PCR data from different sources, with the possibility of automatically producing and evaluating melting and amplification curve plots.
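A toy sketch of the kind of threshold-based calling described above: a sample's cycle-threshold (Ct) value is compared against user-defined cutoffs. The cutoff values, well names and function below are hypothetical illustrations, not shinyCurves' actual API:

```python
# Hypothetical Ct-threshold calling, loosely in the spirit of the
# user-defined settings described in the abstract.
def call_sample(ct, positive_max=35.0, undetermined_max=40.0):
    """Return a qualitative call from a cycle-threshold (Ct) value."""
    if ct is None:                      # no amplification detected
        return "Negative"
    if ct <= positive_max:
        return "Positive"
    if ct <= undetermined_max:
        return "Undetermined"
    return "Negative"

plate = {"A1": 21.3, "A2": 37.8, "A3": None}
calls = {well: call_sample(ct) for well, ct in plate.items()}
print(calls)   # {'A1': 'Positive', 'A2': 'Undetermined', 'A3': 'Negative'}
```

Real pipelines additionally inspect melting curves and amplification-curve shape before accepting a call, as the abstract notes.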


2015, Vol 2015 (1-2), pp. 107-120
Author(s):  
Uwe C. Steiner

Simmel’s Sociology explores elementary processes of socialization or collectivization. Thus, the sociology of the senses examines how sight, hearing, feeling, smelling and tasting contribute to constituting societies. Though Simmel observes that modern refined civilization diminishes the depth of the senses while increasing their emphasis or enhancement with lust or aversion, the conclusion cannot be avoided that the artifacts and technologies of hearing have to be examined. Accordingly, this article can be regarded as a case study in the wake of Simmel: how do modern aural technologies, at the threshold between high fidelity and postfidelity, inform contemporary hearing?


2016, Vol 167 (5), pp. 294-301
Author(s):  
Leo Bont

Optimal layout of a forest road network

The road network is the backbone of forest management. When creating or redesigning a forest road network, one important question is how to shape the layout, that is, how to fix the spatial arrangement and the dimensioning standard of the roads. We consider two kinds of layout problems: first, a new forest road network in an area without any such development yet, and second, the redesign of an existing road network to meet current requirements. For each problem situation, we present a method that automatically detects the optimal road and harvesting layout. The method aims to identify a road network that concurrently minimizes the harvesting cost, the road network cost (construction and maintenance) and the hauling cost over the entire life cycle. Ecological issues can be considered as well. The method is presented and discussed with the help of two case studies. The main benefit of applying optimization tools is objective-based planning, which allows different scenarios and objectives to be checked and compared within a short time. The responses from the case study regions were highly positive: practitioners suggested making these methods standard practice and further developing the prototype into user-friendly expert software.
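The objective described above, choosing the layout that concurrently minimises harvesting, road and hauling costs over the life cycle, reduces in its simplest form to comparing total lifecycle costs across candidate layouts. The candidate names and cost figures below are entirely invented; the actual method searches a far larger design space:

```python
# Toy weighted-cost comparison of invented road-layout candidates.
candidates = {
    "dense network":  {"harvest": 120, "road": 300, "haul": 80},
    "sparse network": {"harvest": 200, "road": 150, "haul": 140},
    "cable-line mix": {"harvest": 160, "road": 180, "haul": 100},
}

def lifecycle_cost(c):
    # Total cost over the life cycle: harvesting + road construction
    # and maintenance + hauling (all in the same monetary unit).
    return c["harvest"] + c["road"] + c["haul"]

best = min(candidates, key=lambda name: lifecycle_cost(candidates[name]))
print(best, lifecycle_cost(candidates[best]))
```

The trade-off is visible even in this toy: denser networks cut harvesting and hauling costs but raise road costs, and the optimum balances the three terms.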


2020
Author(s):  
Christian Amor, José M Pérez, Philipp Schlatter, Ricardo Vinuesa, Soledad Le Clainche

Abstract
This article introduces some soft computing methods generally used for data analysis and flow pattern detection in fluid dynamics. These techniques decompose the original flow field into an expansion of modes, which can be orthogonal in time (variants of dynamic mode decomposition), in space (variants of proper orthogonal decomposition), or in both time and space (spectral proper orthogonal decomposition), or they can simply be selected using sophisticated statistical techniques (empirical mode decomposition). The performance of these methods is tested in the turbulent wake of a wall-mounted square cylinder. This highly complex flow is suitable for showing the ability of the aforementioned methods to reduce the degrees of freedom of the original data by retaining only the large scales in the flow. The main result is a reduced-order model of the original flow case, based on a low number of modes. An in-depth discussion is provided on how to choose the most computationally efficient method for obtaining suitable reduced-order models of the flow. The techniques introduced in this article are data-driven methods that could be applied to model any type of non-linear dynamical system, including numerical and experimental databases.
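As one concrete instance of the modal methods listed above, a minimal exact-DMD sketch is shown below: snapshot pairs yield a low-rank linear operator whose eigenvalues encode the temporal frequencies. The oscillatory field and truncation rank are synthetic choices, not the article's wake data:

```python
# Exact DMD on a synthetic rank-2 oscillatory field.
import numpy as np

x = np.linspace(0, 1, 64)
t = np.linspace(0, 8 * np.pi, 200)
dt = t[1] - t[0]
X = (np.outer(np.sin(np.pi * x), np.cos(t))
     + np.outer(np.sin(2 * np.pi * x), np.sin(t)))

# Fit a linear map X2 ~= A @ X1 in a rank-r subspace
X1, X2 = X[:, :-1], X[:, 1:]
U, s, Vt = np.linalg.svd(X1, full_matrices=False)
r = 2
Ur, sr, Vr = U[:, :r], s[:r], Vt[:r].T
Atilde = Ur.T @ X2 @ Vr / sr                   # reduced linear operator
eigvals, W = np.linalg.eig(Atilde)
modes = (X2 @ Vr / sr) @ W                     # DMD spatial modes

freq = np.abs(np.angle(eigvals)) / dt          # recovered angular frequency
print(np.round(freq, 4))
```

The recovered frequency matches the unit angular frequency built into the data; for real turbulent wakes, the same machinery isolates the dominant shedding frequencies.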


2021, Vol 22 (S2)
Author(s):  
Daniele D’Agostino, Pietro Liò, Marco Aldinucci, Ivan Merelli

Abstract
Background: High-throughput sequencing Chromosome Conformation Capture (Hi-C) allows the study of DNA interactions and 3D chromosome folding at the genome-wide scale. Usually, these data are represented as matrices describing the binary contacts among the different chromosome regions. On the other hand, a graph-based representation can be advantageous for describing the complex topology achieved by the DNA in the nucleus of eukaryotic cells.
Methods: Here we discuss the use of a graph database for storing and analysing data obtained by performing Hi-C experiments. The main issue is the size of the produced data and, when working with a graph-based representation, the consequent need to adequately manage the large number of edges (contacts) connecting nodes (genes), which represent the sources of information. Currently available graph visualisation tools and libraries fall short for Hi-C data. The use of graph databases, instead, supports both the analysis and the visualisation of the spatial patterns present in Hi-C data, in particular for comparing different experiments or for efficiently re-mapping omics data in a space-aware context. In particular, the possibility of describing graphs through statistical indicators and, even more, the capability of correlating them through statistical distributions allows similarities and differences to be highlighted among different Hi-C experiments, in different cell conditions or different cell types.
Results: These concepts have been implemented in NeoHiC, an open-source and user-friendly web application for the progressive visualisation and analysis of Hi-C networks based on the use of the Neo4j graph database (version 3.5).
Conclusion: With the accumulation of more experiments, the tool will provide invaluable support for comparing the neighbours of genes across experiments and conditions, helping to highlight changes in functional domains and to identify new co-organised genomic compartments.
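The graph view described above (regions as nodes, contacts as edges, simple statistics for comparing experiments) can be sketched without any database. The toy contact matrix and threshold below are invented; a real workflow would store the adjacency in Neo4j rather than in Python dictionaries:

```python
# Toy conversion of a Hi-C contact matrix to a graph, plus node degrees.
import numpy as np

def contacts_to_graph(matrix, threshold=1):
    """Contact matrix -> adjacency sets over region indices."""
    graph = {i: set() for i in range(len(matrix))}
    for i in range(len(matrix)):
        for j in range(i + 1, len(matrix)):
            if matrix[i][j] >= threshold:    # keep sufficiently strong contacts
                graph[i].add(j)
                graph[j].add(i)
    return graph

exp_a = np.array([[0, 3, 0, 1],
                  [3, 0, 2, 0],
                  [0, 2, 0, 4],
                  [1, 0, 4, 0]])              # invented contact counts
g = contacts_to_graph(exp_a, threshold=2)
degrees = {node: len(nbrs) for node, nbrs in g.items()}
print(degrees)   # {0: 1, 1: 2, 2: 2, 3: 1}
```

Comparing such degree (or other indicator) distributions between two experiments is the kind of statistical comparison the abstract describes, at a scale where a graph database becomes necessary.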

