Combination of 2D Compressive Sensing Spectral Domain Optical Coherence Tomography and Interferometric Synthetic Aperture Microscopy

2019 ◽  
Vol 9 (19) ◽  
pp. 4003
Author(s):  
Luying Yi ◽  
Liqun Sun ◽  
Xiangyu Guo ◽  
Bo Hou

Combining the advantages of compressive sensing spectral-domain optical coherence tomography (CS-SDOCT) and interferometric synthetic aperture microscopy (ISAM) in terms of data volume, imaging speed, and lateral resolution, we demonstrated how compressive sampling and ISAM can be used simultaneously to reconstruct an optical coherence tomography (OCT) image. Specifically, an OCT image is reconstructed from two-dimensional (2D) under-sampled spectral data dimension by dimension using a CS reconstruction algorithm. During the iterations of the CS algorithm, the deterioration of lateral resolution beyond the depth of focus (DOF) of a Gaussian beam is corrected. Ultimately, with less spectral data, we obtain an OCT image with spatially invariant lateral resolution throughout the imaging depth. The method was verified in this paper by imaging the cells of an orange. A 0.7 × 1.5 mm image of an orange was reconstructed using only 50% × 50% of the spectral data, in which the spreading of the structure was reduced by approximately 2.4 times at a depth of approximately 5.7 Rayleigh ranges from the focus. This result was consistent with that obtained with 100% of the data.
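The dimension-by-dimension CS recovery described in the abstract can be illustrated with a minimal sketch: recovering one under-sampled line of data by iterative soft-thresholding (ISTA) in a DCT sparsifying basis. This is not the authors' reconstruction code; the sparsifying basis, threshold `lam`, and iteration count are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dct, idct

def ista_recover(y, mask, n_iters=200, lam=0.02):
    """Recover a signal from under-sampled measurements y (zeros where
    mask is False) by iterative soft-thresholding, assuming the signal
    is sparse in the DCT domain."""
    x = y.copy()
    for _ in range(n_iters):
        # Enforce data consistency at the sampled locations.
        x[mask] = y[mask]
        # Promote sparsity: soft-threshold the DCT coefficients.
        c = dct(x, norm='ortho')
        c = np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)
        x = idct(c, norm='ortho')
    return x

# Example: a signal that is 5-sparse in the DCT basis, sampled at ~50%
# of its points, mimicking 50% spectral under-sampling along one axis.
rng = np.random.default_rng(0)
n = 256
coeffs = np.zeros(n)
coeffs[rng.choice(n, 5, replace=False)] = rng.normal(size=5)
signal = idct(coeffs, norm='ortho')
mask = rng.random(n) < 0.5
y = np.where(mask, signal, 0.0)
recovered = ista_recover(y, mask)
```

In a 2D scheme such as the one described, a recovery of this kind would be applied first along one dimension of the spectral data and then along the other.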

Sensors ◽  
2019 ◽  
Vol 19 (19) ◽  
pp. 4208
Author(s):  
Luying Yi ◽  
Xiangyu Guo ◽  
Liqun Sun ◽  
Bo Hou

In this paper, a full-depth 2D CS-SDOCT approach is proposed that combines two-dimensional (2D) compressive sensing spectral-domain optical coherence tomography (CS-SDOCT) with dispersion encoding (DE), and its applications in structural imaging and functional sensing of bio-tissues are studied. Specifically, by introducing a large dispersion mismatch between the reference arm and sample arm of the SD-OCT system, reconstruction of the under-sampled A-scan data and removal of the conjugate images can be achieved simultaneously in only two iterations. The under-sampled B-scan data are then reconstructed using the classic CS reconstruction algorithm. For a 5 mm × 3.2 mm fish-eye image, the conjugate image was suppressed by 31.4 dB using 50% × 50% sampled data (250 depth scans and 480 spectral sampling points per depth scan), and all A-scan data were reconstructed in only 1.2 s. In addition, we analyze the performance of CS-SDOCT in functional sensing of locally homogeneous tissue. Simulation and experimental results show that this method can correctly reconstruct the extinction coefficient spectrum under a reasonable number of iterations. When eight iterations were used to reconstruct the A-scan data in the fish-eye imaging experiment, the extinction coefficient spectrum calculated using 50% × 50% of the data was approximately consistent with that obtained with 100% of the data.
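The role of the deliberate dispersion mismatch can be sketched in one dimension: a quadratic phase added to the spectral fringes blurs both image terms in a plain FFT, but compensating with the conjugate phase sharpens the true peaks while the mirror (conjugate) terms acquire twice the dispersion and stay smeared. That contrast is what a DE-style iteration exploits. The reflector depths, chirp strength, and axis scaling below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

n = 1024
k = np.linspace(-1, 1, n, endpoint=False)  # detuned wavenumber axis
phi = 40 * np.pi * k**2                    # quadratic dispersion mismatch

# Interference fringes from two reflectors at depths z1, z2, measured
# through the dispersive mismatch phi(k).
z1, z2 = 60.0, 150.0
fringes = np.cos(2 * np.pi * z1 * k + phi) + 0.5 * np.cos(2 * np.pi * z2 * k + phi)

# Naive FFT: true peaks and mirror peaks are all blurred by phi.
naive = np.abs(np.fft.fft(fringes))

# Dispersion-compensated reconstruction: multiplying by exp(-i*phi)
# sharpens the true peaks, while the conjugate terms end up carrying
# 2*phi of dispersion and remain smeared across many bins.
compensated = np.abs(np.fft.fft(fringes * np.exp(-1j * phi)))
```

With these parameters the compensated spectrum shows a sharp peak at the bin corresponding to the stronger reflector, whereas the naive spectrum spreads that energy over many bins.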


2020 ◽  
Vol 104 (12) ◽  
pp. 1717-1723 ◽  
Author(s):  
Jinho Lee ◽  
Jin-Soo Kim ◽  
Haeng Jin Lee ◽  
Seong-Joon Kim ◽  
Young Kook Kim ◽  
...  

Background/aims: To assess the performance of a deep learning classifier for differentiation of glaucomatous optic neuropathy (GON) from compressive optic neuropathy (CON) based on ganglion cell–inner plexiform layer (GCIPL) and retinal nerve fibre layer (RNFL) spectral-domain optical coherence tomography (SD-OCT).

Methods: Eighty SD-OCT image sets from 80 eyes of 80 patients with GON, along with 81 SD-OCT image sets from 54 eyes of 54 patients with CON, were compiled for the study. The bottleneck features extracted from the GCIPL thickness map, GCIPL deviation map, RNFL thickness map and RNFL deviation map were used as predictors for the deep learning classifier. The area under the receiver operating characteristic curve (AUC) was calculated to validate the diagnostic performance. The AUC with the deep learning classifier was compared with those for conventional diagnostic parameters, including temporal raphe sign, SD-OCT thickness profile and standard automated perimetry.

Results: The deep learning system achieved an AUC of 0.990 (95% CI 0.982 to 0.999) with a sensitivity of 97.9% and a specificity of 92.6% in fivefold cross-validation testing, which was significantly larger than the AUCs with the other parameters: 0.804 (95% CI 0.737 to 0.872) with temporal raphe sign, 0.815 (95% CI 0.734 to 0.896) with superonasal GCIPL and 0.776 (95% CI 0.691 to 0.860) with superior GCIPL thicknesses (all p<0.001).

Conclusion: The deep learning classifier can outperform the conventional diagnostic parameters for discrimination of GON and CON on SD-OCT.
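The AUC figure of merit reported above can be computed directly as a Mann-Whitney U statistic: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch follows; it is not the study's pipeline (which used bottleneck CNN features and fivefold cross-validation), only an illustration of the metric itself.

```python
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs in which the positive
    case receives the higher score, with ties counted as half."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    pos, neg = scores[labels], scores[~labels]
    # Pairwise comparisons: wins plus half-credit for ties.
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Perfectly separated scores give an AUC of 1.0; one misranked pair
# out of four gives 0.75.
auc([1, 1, 0, 0], [0.9, 0.8, 0.7, 0.1])
auc([1, 1, 0, 0], [0.9, 0.4, 0.7, 0.1])
```

This pairwise form is equivalent to integrating the ROC curve and makes clear why an AUC of 0.990 indicates near-perfect ranking of GON over CON cases by the classifier's score.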


2007 ◽  
Vol 24 (9) ◽  
pp. 2527 ◽  
Author(s):  
Brynmor J. Davis ◽  
Simon C. Schlachter ◽  
Daniel L. Marks ◽  
Tyler S. Ralston ◽  
Stephen A. Boppart ◽  
...  

2010 ◽  
Vol 35 (10) ◽  
pp. 1683 ◽  
Author(s):  
Tyler S. Ralston ◽  
Steven G. Adie ◽  
Daniel L. Marks ◽  
Stephen A. Boppart ◽  
P. Scott Carney
