Uncertainty Quantification of Molten Hafnium Infusion Into a B4C Packed-Bed

Author(s):  
Arturo Schiaffino,
V. M. Krushnarao Kotteda,
Vinod Kumar,
Arturo Bronson,
Sanjay Shantha-Kumar

Abstract In the manufacturing of metal matrix composites (MMCs), reactive infiltration of a liquid metal into a solid mesh or a bed of ceramic or metal particles may be used. The objective of this study is to quantify the uncertainty in the modeling of liquid hafnium infiltration, ultimately to expedite processing and improve the properties of MMCs. Uncertainty quantification (UQ) characterizes uncertainty scientifically, especially for high-performance computing, by combining the observed physics and/or chemistry of the phenomena with predictions based on estimated parameters. In this work, molten hafnium infiltrating a boron carbide (B4C) packed bed is modeled to optimize the manufacturing of components used in hypersonic vehicles. Accurately modeling the creation of metal matrix composites by molten-metal infiltration is a formidable challenge. First, the structural randomness of a porous medium complicates prediction of the flow passing through it. Second, the properties of the molten metal can vary inside the control volume, since the temperature within it is not constant. In addition, several chemical reactions and solidification processes occur during impregnation. Given recent advances in high-performance computing, an in-house pore-network simulator is coupled with Dakota, an open-source, exascale software toolkit, to determine the optimal parameters (e.g., porosity and temperature) and to quantify the uncertainty of the model.
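
To make the forward-propagation idea concrete, the following is a minimal Python sketch that samples uncertain inputs (porosity and melt temperature) and collects statistics on an infiltration-depth quantity of interest. It is not the authors' workflow: a simplified Washburn-type depth formula stands in for the in-house pore-network simulator, plain Monte Carlo sampling stands in for Dakota, and all distributions, property values, and the viscosity/infiltration_depth helpers are illustrative assumptions rather than hafnium/B4C data.

import numpy as np

# Monte Carlo sketch of uncertainty propagation for molten-metal infiltration
# into a packed bed. All numbers below are placeholders, not measured values.
rng = np.random.default_rng(seed=42)
n_samples = 5000

# Uncertain inputs (assumed distributions, purely illustrative)
porosity = rng.uniform(0.30, 0.45, n_samples)      # packed-bed porosity [-]
temperature = rng.normal(2500.0, 50.0, n_samples)   # melt temperature [K]

# Illustrative bed/fluid parameters
d_p = 50e-6        # particle diameter [m]
gamma = 1.5        # surface tension [N/m]
cos_theta = 0.7    # wetting term [-]
t_infil = 1.0      # infiltration time [s]

def viscosity(T):
    # Toy Arrhenius-type melt viscosity [Pa*s]; coefficients are assumed.
    return 5e-4 * np.exp(5000.0 / T)

def infiltration_depth(eps, T):
    # Washburn-type depth with an effective capillary radius estimated
    # from porosity and particle size (hydraulic-radius argument).
    r_eff = (eps / (1.0 - eps)) * d_p / 3.0
    mu = viscosity(T)
    return np.sqrt(gamma * cos_theta * r_eff * t_infil / (2.0 * mu))

depth = infiltration_depth(porosity, temperature)

print(f"mean infiltration depth : {depth.mean()*1e3:.2f} mm")
print(f"std  infiltration depth : {depth.std()*1e3:.2f} mm")
print(f"5th-95th percentile     : {np.percentile(depth, 5)*1e3:.2f}"
      f" - {np.percentile(depth, 95)*1e3:.2f} mm")

In the study itself, Dakota would drive the sampling (or a surrogate/optimization study) of porosity and temperature, and each sample would be evaluated by the pore-network simulator instead of a closed-form depth model.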

MRS Bulletin, 1997, Vol 22 (10), pp. 5-6
Author(s):  
Horst D. Simon

Recent events in the high-performance computing industry have raised concern among scientists and the general public about a crisis, or a lack of leadership, in the field. That concern is understandable considering the industry's history from 1993 to 1996. Cray Research, the historic leader in supercomputing technology, was unable to survive financially as an independent company and was acquired by Silicon Graphics. Two ambitious new companies that introduced new technologies in the late 1980s and early 1990s, Thinking Machines and Kendall Square Research, were commercial failures and went out of business. And Intel, which introduced its Paragon supercomputer in 1994, discontinued production only two years later.

During the same time frame, scientists who had finished the laborious task of writing scientific codes to run on vector parallel supercomputers learned that those codes would have to be rewritten if they were to run on the next-generation, highly parallel architecture. Scientists who are not yet involved in high-performance computing are understandably hesitant about committing their time and energy to such an apparently unstable enterprise.

However, beneath the commercial chaos of the last several years, a technological revolution has been occurring. The good news is that the revolution is over, leading to five to ten years of predictable stability, steady improvements in system performance, and increased productivity for scientific applications. It is time for scientists who were sitting on the fence to jump in and reap the benefits of the new technology.

