Study of spatial frequency domain imaging technique for turbid media optical property estimation and application
Author(s): Xiaping Fu, Xu Jiang, Liyu Dai, Yifeng Luo
2018, Vol 9 (2), pp. 661

Author(s): Vivian Pera, Kavon Karrobi, Syeda Tabassum, Fei Teng, Darren Roblyer

2011
Author(s): John Quan Nguyen, Rolf B. Saager, David J. Cuccia, Kristen M. Kelly, David Hsiang, ...

Photonics, 2021, Vol 8 (8), pp. 310
Author(s): Ben O. L. Mellors, Hamid Dehghani

Spatial frequency domain imaging (SFDI) projects spatially modulated light patterns onto biological tissue to obtain optical property maps of absorption and reduced scattering. Conventionally, both forward modeling and optical property recovery are performed with pixel-independent models, calculated via analytical solutions or Monte-Carlo-based look-up tables, both of which assume a homogeneous medium. The resulting recovered maps are therefore limited for highly heterogeneous samples, where the homogeneous assumption is not valid. NIRFAST, an FEM-based image modeling and reconstruction tool, simulates complex heterogeneous tissue optical interactions for single- and multi-wavelength systems. Based on the diffusion equation, NIRFAST has been adapted to perform pixel-dependent forward modeling for SFDI. Validation is performed in the spatially resolved domain, along with homogeneous structured illumination simulations, with a recovery error of <2%. Heterogeneity is introduced through cylindrical anomalies of varying size, depth, and optical property values, with recovery errors of <10% observed across a variety of simulations. This work demonstrates the importance of pixel-dependent light interaction modeling for SFDI and its role in quantitative accuracy. Here, a full raw-image SFDI modeling tool is presented for heterogeneous samples, providing a mechanism towards a pixel-dependent SFDI image modeling and parameter recovery system.
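To make the conventional, pixel-independent workflow the abstract contrasts with the FEM approach concrete, the sketch below implements the standard homogeneous diffusion-approximation forward model for diffuse reflectance under sinusoidal illumination, R_d(f_x), and a simple two-frequency (DC/AC) look-up-table inversion applied independently at each pixel. This is a minimal illustration under stated assumptions, not NIRFAST code; the function names, spatial frequencies, and optical property grids are illustrative choices.

```python
# Minimal sketch of a pixel-independent SFDI workflow: a homogeneous
# diffusion-approximation forward model and a two-frequency LUT inversion.
# All names and parameter values are illustrative assumptions.
import numpy as np

def diffuse_reflectance(fx, mua, musp, n=1.4):
    """Diffuse reflectance of a homogeneous semi-infinite medium under
    sinusoidal illumination at spatial frequency fx (1/mm), from the
    standard diffusion-approximation SFDI model. mua, musp in 1/mm."""
    mutr = mua + musp                      # transport coefficient
    ap = musp / mutr                       # reduced albedo
    # Empirical effective reflection coefficient for the index mismatch
    Reff = 0.0636 * n + 0.668 + 0.710 / n - 1.440 / n**2
    A = (1.0 - Reff) / (2.0 * (1.0 + Reff))
    # Scalar attenuation coefficient at spatial frequency fx
    mueff = np.sqrt(3.0 * mua * mutr + (2.0 * np.pi * fx) ** 2)
    return 3.0 * A * ap / ((mueff / mutr + 1.0) * (mueff / mutr + 3.0 * A))

def build_lut(fx_pair, mua_grid, musp_grid):
    """Tabulate (Rd_DC, Rd_AC) over a grid of optical properties."""
    MUA, MUSP = np.meshgrid(mua_grid, musp_grid, indexing="ij")
    lut = np.stack([diffuse_reflectance(fx, MUA, MUSP) for fx in fx_pair], axis=-1)
    return MUA, MUSP, lut

def invert_pixelwise(rd_dc, rd_ac, MUA, MUSP, lut):
    """Nearest-neighbour LUT inversion, applied independently per pixel
    (the 'pixel-independent' recovery referred to in the abstract)."""
    meas = np.stack([rd_dc.ravel(), rd_ac.ravel()], axis=-1)   # (Npix, 2)
    table = lut.reshape(-1, 2)                                 # (Ngrid, 2)
    idx = np.argmin(((meas[:, None, :] - table[None, :, :]) ** 2).sum(-1), axis=1)
    return (MUA.ravel()[idx].reshape(rd_dc.shape),
            MUSP.ravel()[idx].reshape(rd_dc.shape))

if __name__ == "__main__":
    fx_pair = (0.0, 0.2)                       # DC and AC spatial frequencies (1/mm)
    mua_grid = np.linspace(0.001, 0.05, 200)   # absorption grid (1/mm)
    musp_grid = np.linspace(0.4, 2.5, 200)     # reduced scattering grid (1/mm)
    MUA, MUSP, lut = build_lut(fx_pair, mua_grid, musp_grid)

    # Synthetic homogeneous "measurement" with known ground truth
    mua_true, musp_true = 0.02, 1.1
    rd_dc = np.full((4, 4), diffuse_reflectance(fx_pair[0], mua_true, musp_true))
    rd_ac = np.full((4, 4), diffuse_reflectance(fx_pair[1], mua_true, musp_true))

    mua_map, musp_map = invert_pixelwise(rd_dc, rd_ac, MUA, MUSP, lut)
    print(mua_map[0, 0], musp_map[0, 0])       # recovers ~0.02 and ~1.1
```

Because each pixel is inverted against the same homogeneous table, lateral and depth-dependent interactions between regions of differing optical properties are ignored, which is precisely the limitation the pixel-dependent FEM forward model described above is intended to address.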

