Physical Interpretation of Machine Learning Models Applied to Film Cooling Flows


2018 ◽  
Vol 141 (1) ◽  
Author(s):  
Pedro M. Milani ◽  
Julia Ling ◽  
John K. Eaton

Current turbulent heat flux models fail to predict accurate temperature distributions in film cooling flows. The present paper focuses on a machine learning (ML) approach to this problem, in which the gradient diffusion hypothesis (GDH) is used in conjunction with a data-driven prediction for the turbulent diffusivity field αt. An overview of the model is presented, followed by validation against two film cooling datasets. Despite insufficiencies, the model shows some improvement in the near-injection region. The present work also attempts to interpret the complex ML decision process by analyzing the model features and determining their importance. These results show that the model relies heavily on the distance to the wall d and the eddy viscosity νt, while other features display localized prominence.
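For context, the closure at issue can be written compactly. The following is a minimal statement in standard notation (the symbols are conventional and not copied from the paper), contrasting the data-driven diffusivity with the usual fixed-Prandtl-number baseline.

```latex
% Gradient diffusion hypothesis (GDH) for the turbulent heat flux (standard notation)
\[ \overline{u_j' T'} \;=\; -\,\alpha_t \,\frac{\partial \bar{T}}{\partial x_j} \]
% Conventional fixed-Pr_t baseline that the data-driven \alpha_t field replaces
\[ \alpha_t \;=\; \frac{\nu_t}{\mathrm{Pr}_t}, \qquad \mathrm{Pr}_t \approx 0.85\text{--}0.9 \ \ \text{(constant)} \]
```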


Author(s):  
Christopher D. Ellis ◽  
Hao Xia ◽  
Gary J. Page

Abstract A novel data-driven approach is used to describe a spatially varying turbulent diffusivity coefficient for the Higher Order Generalised Gradient Diffusion Hypothesis (HOGGDH) closure of the turbulent heat flux, with the aim of improving RANS cooling predictions in film cooling flows. Machine learning algorithms are trained on two film cooling flows and tested on a case with different density and blowing ratios. The Random Forests and Neural Network algorithms successfully reproduced the LES-described coefficient and the magnitude of the turbulent heat flux vector. The Random Forests model was implemented in a steady RANS solver with a k-ω SST turbulence model and applied to four cases. All cases saw improvements in the predicted Adiabatic Cooling Effectiveness (ACE) over the cooled surface compared to the standard Gradient Diffusion Hypothesis (GDH) approach, but only minor improvements in the centreline and lateral spread are seen compared to a HOGGDH model with a constant cθ of 0.6. Potential further improvements to cooling predictions are highlighted by extending these data-driven approaches into turbulence modelling to improve flow-field predictions.
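For reference, a commonly cited form of the HOGGDH closure is sketched below in standard notation rather than the paper's exact symbols. Unlike the isotropic GDH, it weights the mean temperature gradient by a product of Reynolds stresses, and the data-driven model replaces the constant coefficient cθ with a spatially varying field.

```latex
% Higher Order Generalised Gradient Diffusion Hypothesis (HOGGDH), standard notation
\[ \overline{u_i' \theta'} \;=\; -\,c_\theta \,\frac{k}{\varepsilon}\,
   \frac{\overline{u_i' u_k'}\;\overline{u_k' u_j'}}{k}\,
   \frac{\partial \bar{\Theta}}{\partial x_j} \]
% Data-driven variant: the constant c_theta (e.g. 0.6 in the baseline comparison)
% is replaced by a field predicted from local flow features.
```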


2017 ◽  
Vol 140 (2) ◽  
Author(s):  
Pedro M. Milani ◽  
Julia Ling ◽  
Gonzalo Saez-Mischlich ◽  
Julien Bodart ◽  
John K. Eaton

In film cooling flows, it is important to know the temperature distribution resulting from the interaction between a hot main flow and a cooler jet. However, current Reynolds-averaged Navier–Stokes (RANS) models yield poor temperature predictions. A novel approach for RANS modeling of the turbulent heat flux is proposed, in which the simple gradient diffusion hypothesis (GDH) is assumed and a machine learning (ML) algorithm is used to infer an improved turbulent diffusivity field. This approach is implemented using three distinct data sets: two are used to train the model and the third is used for validation. The results show that the proposed method produces significant improvement compared to the common RANS closure, especially in the prediction of film cooling effectiveness.
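A minimal sketch of the general idea, using synthetic stand-in data rather than the paper's actual pipeline: a regressor (a random forest here, purely for illustration) maps local RANS features to a turbulent diffusivity, which then closes the GDH. All feature choices, model settings, and numbers below are assumptions.

```python
# Hypothetical sketch: learn a turbulent diffusivity from local flow features,
# then close the turbulent heat flux with the gradient diffusion hypothesis (GDH).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins: 5 local features per cell (e.g. wall distance, eddy
# viscosity, velocity-gradient invariants) and a "true" diffusivity from
# a high-fidelity dataset. Both are fabricated here for illustration only.
X_train = rng.random((2000, 5))
alpha_t_true = 1e-4 * (1.0 + 5.0 * X_train[:, 0] * X_train[:, 1])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, np.log(alpha_t_true))          # log-transform keeps the prediction positive

X_val = rng.random((500, 5))                      # held-out "validation flow"
alpha_t = np.exp(model.predict(X_val))            # predicted diffusivity per cell

grad_T = rng.standard_normal((500, 3))            # mean temperature gradient per cell
heat_flux = -alpha_t[:, None] * grad_T            # GDH: u_j'T' = -alpha_t * dT/dx_j
print(heat_flux.shape)
```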


2019 ◽  
Vol 142 (1) ◽  
Author(s):  
Pedro M. Milani ◽  
Julia Ling ◽  
John K. Eaton

Abstract The design of film cooling systems relies heavily on Reynolds-averaged Navier–Stokes (RANS) simulations, which solve for mean quantities and model all turbulent scales. Most turbulent heat flux models, which are based on isotropic diffusion with a fixed turbulent Prandtl number (Prt), fail to accurately predict heat transfer in film cooling flows. In the present work, machine learning models are trained to predict a non-uniform Prt field using various datasets as training sets. The ability of these models to generalize beyond the flows on which they were trained is explored. Furthermore, visualization techniques are employed to compare distinct datasets and to help explain the cross-validation results.
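One hedged sketch of how generalization across flows can be probed, again with synthetic stand-in data: hold out one dataset at a time, train a Prt regressor on the remaining flows, and score on the held-out case. The features, labels, and regressor below are illustrative assumptions, not the authors' actual procedure.

```python
# Leave-one-dataset-out cross-validation sketch (assumed setup, not the paper's
# exact procedure): train on all flows but one, test on the held-out flow.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X = rng.random((3000, 5))                                # local RANS features (illustrative)
prt = 0.5 + X[:, 0] + 0.3 * rng.standard_normal(3000)   # synthetic "true" Pr_t labels
groups = np.repeat([0, 1, 2], 1000)                      # three distinct film cooling datasets

for train_idx, test_idx in LeaveOneGroupOut().split(X, prt, groups):
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X[train_idx], prt[train_idx])
    score = r2_score(prt[test_idx], model.predict(X[test_idx]))
    print(f"held-out dataset {groups[test_idx][0]}: R^2 = {score:.2f}")
```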


Author(s):  
Fabíola Paula Costa ◽  
Rubén Bruno Díaz ◽  
Pedro M. Milani ◽  
Jesuíno Takachi Tomita ◽  
Cleverson Bringhenti

Abstract Film cooling is an important technique to ensure safe operation and performance of turbines. Its ultimate goal is to protect the axial turbine blades from high gas temperatures. An appropriate study is necessary in order to obtain a reliable representation of the flow characteristics involved in this phenomenon. Because of the high computational cost of high-fidelity simulations, the lower-fidelity Reynolds-Averaged Navier-Stokes (RANS) approach is commonly used in practical configurations. However, the majority of current turbulent heat flux models fail to accurately predict heat transfer in film cooling flows. Recent work suggests the use of machine learning models to improve turbulent closure in these flows. In the present work, a machine learning model for a spatially varying turbulent Prandtl number, previously described in the literature, is applied to a transverse film cooling flow consisting of a jet issuing from a square channel. The results obtained in the present work were compared to adiabatic effectiveness experimental data available in the literature to assess the performance of the machine learning model. The results show that for low blowing ratios (BR = 0.2 and BR = 0.4) the proposed machine learning model has poor performance. However, for the case with the highest blowing ratio (BR = 0.8), the proposed model presented better results. These results are then explained in terms of the resulting turbulent Prandtl number field and suggest that the training set is not appropriate for capturing the turbulent heat flux in fully attached jets in crossflow.
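For reference, the two quantities used in the comparison above, in standard film cooling notation (subscript c denotes coolant, ∞ the mainstream, and Taw the adiabatic wall temperature):

```latex
% Blowing ratio and adiabatic (film cooling) effectiveness, standard definitions
\[ BR \;=\; \frac{\rho_c U_c}{\rho_\infty U_\infty},
   \qquad
   \eta \;=\; \frac{T_\infty - T_{aw}}{T_\infty - T_c} \]
```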


1992 ◽  
Vol 114 (1) ◽  
pp. 33-38 ◽  
Author(s):  
J. C. Pan ◽  
W. J. Schmoll ◽  
D. R. Ballal

Turbulence properties were investigated in and around the recirculation zone produced by a 45 deg conical flame stabilizer of 25 percent blockage ratio confined in a pipe supplied with a turbulent premixed methane-air mixture at a Reynolds number of 5.7 × 10⁴. A three-component LDA system was used for measuring mean velocities, turbulence intensities, Reynolds stresses, skewness, kurtosis, and turbulent kinetic energy. It was found that wall confinement elongates the recirculation zone by accelerating the flow and narrows it by preventing mean streamline curvature. For confined flames, turbulence production is mainly due to the shear stress-mean strain interaction. In the region of maximum recirculation zone width and around the stagnation point, the outer stretched flame resembles a normal mixing layer and the gradient-diffusion closure for velocity holds. However, in the absence of turbulent heat flux data, countergradient diffusion cannot be ruled out. Finally, because of the suppression of mean streamline curvature by confinement, the production of turbulence in combusting flow is only up to 33 percent of its damping due to dilatation and dissipation.
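For reference, the production mechanism described above as the shear stress-mean strain interaction corresponds, in standard notation, to the turbulence kinetic energy production term:

```latex
% Production of turbulence kinetic energy by Reynolds stresses acting on the mean strain
\[ P_k \;=\; -\,\overline{u_i' u_j'}\,\frac{\partial \bar{U}_i}{\partial x_j} \]
```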

