Metallic thin films heated by pulsed lasers. Numerical simulation of the thermal field and the melting kinetics
This modeling applies to pulsed-laser-induced heating and melting of a metallic film deposited on a substrate. The thermal field near a surface is usually studied under the 'semi-infinite medium' assumption. However, a thin film deposited on a rough substrate surface makes poor thermal contact, commonly described by a 'thermal contact resistance'. This interfacial thermal resistance affects the melting kinetics mainly when the film thickness (Z) is small compared with the heat diffusion length (ZT). In this work the heat conduction equation and the related boundary conditions are solved using the implicit finite-difference method. The heat source (i.e. the laser intensity) is treated as a surface boundary layer. The thermal contact resistance is introduced into the computation when the heat wave reaches the thin-film/substrate interface. It is then possible to calculate the critical temperatures and the melting threshold fluence for high and low contact resistance values. Under these conditions, the temperature profile and the melting depth are plotted for different film thicknesses. Finally, for an excimer laser fluence of 750 mJ/cm² and a thin-film apparent diffusivity of 0.1 cm²/s, the results show that for Z/ZT greater than 0.5 the thermal contact resistance has no significant effect on the melting kinetics.
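To illustrate the numerical approach described above, the following is a minimal sketch of an implicit (backward-Euler) finite-difference solution of the 1-D heat conduction equation with the laser treated as a surface flux boundary condition. All parameter values (grid size, pulse length, flux, conductivity) are illustrative placeholders, not the values used in the paper, and melting and contact resistance are omitted for brevity.

```python
import numpy as np

def implicit_heat_1d(n=50, L=1e-4, alpha=0.1, dt=1e-9, steps=200,
                     flux=1e7, pulse=1e-7, T0=300.0):
    """Backward-Euler finite-difference solution of dT/dt = alpha * d2T/dz2
    in a film of thickness L (cm), heated by a constant surface flux while
    the pulse is on. Parameter values are illustrative only."""
    dz = L / (n - 1)
    r = alpha * dt / dz**2
    # Tridiagonal matrix of the implicit scheme: (I - r*D2) T_new = T_old
    A = np.zeros((n, n))
    for i in range(1, n - 1):
        A[i, i - 1] = -r
        A[i, i] = 1 + 2 * r
        A[i, i + 1] = -r
    # Surface node (z = 0): flux boundary condition via a ghost node,
    # -k dT/dz = q  ->  T_ghost = T_1 + 2*dz*q/k
    A[0, 0], A[0, 1] = 1 + 2 * r, -2 * r
    # Back node (z = L): held at the substrate temperature T0
    A[-1, -1] = 1.0

    k = 1.0  # thermal conductivity (arbitrary units, illustrative)
    T = np.full(n, T0)
    for step in range(steps):
        b = T.copy()
        if step * dt < pulse:  # laser on: surface flux source term
            b[0] += 2 * r * dz * flux / k
        b[-1] = T0
        T = np.linalg.solve(A, b)
    return T

T = implicit_heat_1d()
print(T[0] > T[-1])  # surface is hottest, back node stays at T0
```

A contact resistance R at the film/substrate interface would be introduced, as in the paper's procedure, by replacing the perfect-contact condition at that node with a flux relation of the form q = (T_film - T_substrate)/R once the heat wave reaches the interface.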