CFD Predictions of Heat Transfer Coefficient Augmentation on a Simulated Film Cooled Turbine Blade Leading Edge
Many experimental studies of the augmentation of heat transfer coefficients due to film cooling jet injection have been performed with the coolant at the mainstream temperature, because this improves the accuracy of the measurements. However, at typical engine conditions the coolant is generally much colder than the mainstream, and consequently has a significantly higher density. It is generally presumed that the coolant density has a negligible effect on the augmentation of the heat transfer coefficient due to coolant injection. In this study, the effects of coolant density on heat transfer coefficient augmentation were investigated computationally. The focus was a simulated turbine blade leading edge, where augmentation of the heat transfer coefficient can be as much as a factor of two. The realizable k-ε turbulence model (RKE) and the shear stress transport k-ω turbulence model (SST) were used in the computational simulations. The RKE computations performed at a unity density ratio were found to agree well with previous experimental measurements, whereas the SST computations exhibited significant discrepancies. Simulations with coolant density ratios varying from 1.0 to 1.5 showed that heat transfer coefficient augmentation can be simulated using unity density ratio jets, but only when scaled with the momentum flux ratio of the coolant jets.
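For reference, the coolant-to-mainstream scaling parameters referred to above follow the standard film-cooling definitions (a sketch of the conventional relations, with c denoting coolant and ∞ denoting mainstream conditions; the specific nomenclature is not taken from this paper):

\[
DR = \frac{\rho_c}{\rho_\infty}, \qquad
M = \frac{\rho_c U_c}{\rho_\infty U_\infty}, \qquad
I = \frac{\rho_c U_c^{\,2}}{\rho_\infty U_\infty^{\,2}} = \frac{M^2}{DR}
\]

Under these definitions, matching the momentum flux ratio I of a unity density ratio jet to that of a higher density ratio jet requires adjusting the blowing ratio M accordingly, which is the sense in which unity density ratio results can be scaled to engine-representative conditions.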