Data fusion for high resolution fluorescence lifetime imaging using deep learning.

Author(s):  
Valentin Kapitany ◽  
Alex Turpin ◽  
Jamie Whitelaw ◽  
Ewan McGhee ◽  
Robert Insall ◽  
...  
Author(s):  
Jason T. Smith ◽  
Ruoyang Yao ◽  
Sez-Jade Chen ◽  
Nattawut Sinsuebphon ◽  
Alena Rudkouskaya ◽  
...  

1998 ◽  
Author(s):  
Mark J. Dayel ◽  
Keith Dowling ◽  
Sam C. W. Hyde ◽  
Christopher Dainty ◽  
Paul M. W. French ◽  
...  

2019 ◽  
Author(s):  
Jason T. Smith ◽  
Ruoyang Yao ◽  
Nattawut Sinsuebphon ◽  
Alena Rudkouskaya ◽  
Joseph Mazurkiewicz ◽  
...  

Fluorescence lifetime imaging (FLI) provides unique quantitative information in biomedical and molecular biology studies, but relies on complex data-fitting techniques to derive the quantities of interest. Herein, we propose a novel fit-free approach to FLI image formation that is based on deep learning (DL) to quantify complex fluorescence decays simultaneously over a whole image and at ultra-fast speeds. Our deep neural network (DNN), named FLI-Net, is designed and trained on model-based simulated data to provide all lifetime-based parameters that are typically employed in the field. We demonstrate the accuracy and generalizability of FLI-Net by performing quantitative microscopic and preclinical experimental lifetime-based studies across the visible and NIR spectra, as well as across the two main data acquisition technologies. Our results demonstrate that FLI-Net is well suited to quantify complex fluorescence lifetimes accurately and in real time, in cells and intact animals, without any parameter settings. Hence, it paves the way to reproducible and quantitative lifetime studies at unprecedented speeds, for improved dissemination and impact of FLI in many important biomedical applications, especially in clinical settings.
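
As a rough illustration of what such model-based training data can look like, the sketch below simulates noisy bi-exponential decays convolved with an instrument response function (IRF). The decay model, parameter ranges, and function names are illustrative assumptions, not taken from the FLI-Net paper.

```python
import numpy as np

def simulate_decay(tau1, tau2, frac, irf, n_bins=256, bin_width_ps=39,
                   peak_counts=1000, rng=None):
    """Simulate one noisy time-resolved fluorescence decay (TPSF).

    A bi-exponential model I(t) = frac*exp(-t/tau1) + (1-frac)*exp(-t/tau2)
    is convolved with the instrument response function (irf) and corrupted
    with Poisson (shot) noise, mimicking time-domain FLI acquisition.
    """
    rng = rng or np.random.default_rng()
    t = np.arange(n_bins) * bin_width_ps * 1e-3          # time axis in ns
    ideal = frac * np.exp(-t / tau1) + (1 - frac) * np.exp(-t / tau2)
    decay = np.convolve(ideal, irf)[:n_bins]              # IRF convolution
    decay *= peak_counts / decay.max()                    # scale to photon budget
    return rng.poisson(decay).astype(np.float32)          # shot noise

# Example: one training pair (decay curve -> known lifetime parameters).
rng = np.random.default_rng(0)
irf = np.exp(-0.5 * ((np.arange(256) - 20) / 3.0) ** 2)   # toy Gaussian IRF
irf /= irf.sum()
tau1, tau2, frac = 0.4, 2.5, 0.6                           # ns, ns, fraction
curve = simulate_decay(tau1, tau2, frac, irf, rng=rng)
print(curve.shape, curve.max())
```

A network trained on a large set of such (decay, parameter) pairs never needs an explicit fit at inference time, which is what makes the approach "fit-free".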


2019 ◽  
Vol 116 (48) ◽  
pp. 24019-24030 ◽  
Author(s):  
Jason T. Smith ◽  
Ruoyang Yao ◽  
Nattawut Sinsuebphon ◽  
Alena Rudkouskaya ◽  
Nathan Un ◽  
...  

Fluorescence lifetime imaging (FLI) provides unique quantitative information in biomedical and molecular biology studies but relies on complex data-fitting techniques to derive the quantities of interest. Herein, we propose a fit-free approach to FLI image formation that is based on deep learning (DL) to quantify fluorescence decays simultaneously over a whole image and at fast speeds. We report on a deep neural network (DNN) architecture, named fluorescence lifetime imaging network (FLI-Net), that is designed and trained for different classes of experiments, including visible FLI and near-infrared (NIR) FLI microscopy (FLIM) and NIR gated macroscopy FLI (MFLI). FLI-Net quantitatively outputs the spatially resolved lifetime-based parameters that are typically employed in the field. We validate the utility of the FLI-Net framework by performing quantitative microscopic and preclinical lifetime-based studies across the visible and NIR spectra, as well as across the two main data acquisition technologies. These results demonstrate that FLI-Net is well suited to accurately quantify complex fluorescence lifetimes in cells and, in real time, in intact animals without any parameter settings. Hence, FLI-Net paves the way to reproducible and quantitative lifetime studies at unprecedented speeds, for improved dissemination and impact of FLI in many important biomedical applications ranging from fundamental discoveries in molecular and cellular biology to clinical translation.
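
The architecture below is not the published FLI-Net; it is a minimal PyTorch sketch of the general idea described in the abstract: temporal convolutions shared across pixels, followed by per-pixel regression of spatially resolved lifetime parameters. The layer sizes, pooling step, and three-parameter output are assumptions made for illustration only.

```python
import torch
import torch.nn as nn

class LifetimeNet(nn.Module):
    """Illustrative fit-free lifetime estimator (not the published FLI-Net).

    Input:  a stack of time-gated images, shape (batch, 1, T, H, W).
    Output: per-pixel parameter maps, shape (batch, 3, H, W),
            e.g. short lifetime, long lifetime, and amplitude fraction.
    """
    def __init__(self, n_params=3):
        super().__init__()
        # Temporal convolutions shared across all pixels (kernels act on T only).
        self.temporal = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=(9, 1, 1), padding=(4, 0, 0)), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=(9, 1, 1), padding=(4, 0, 0)), nn.ReLU(),
            nn.AdaptiveAvgPool3d((1, None, None)),      # collapse the time axis
        )
        # Per-pixel regression of the lifetime parameters.
        self.head = nn.Conv2d(32, n_params, kernel_size=1)

    def forward(self, x):
        features = self.temporal(x).squeeze(2)          # (batch, 32, H, W)
        return self.head(features)                      # (batch, n_params, H, W)

# Example: a 64x64 image with 256 time bins per pixel.
decays = torch.rand(1, 1, 256, 64, 64)
maps = LifetimeNet()(decays)
print(maps.shape)   # torch.Size([1, 3, 64, 64])
```

Because the temporal kernels span only the time dimension, every pixel's decay is processed independently, which is what allows a single forward pass to produce a whole-image lifetime map.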


2013 ◽  
Vol 2013 ◽  
pp. 1-5 ◽  
Author(s):  
Chao Liu ◽  
Xinwei Wang ◽  
Yan Zhou ◽  
Yuliang Liu

Steady-state fluorescence imaging and time-resolved fluorescence imaging are two important areas of fluorescence imaging research. Fluorescence lifetime imaging is an absolute measurement that, unlike fluorescence intensity imaging, is independent of excitation laser intensity, fluorophore concentration, and photobleaching. Among mature FLIM methods, time-gated fluorescence lifetime imaging microscopy (FLIM) can provide both high resolution and high imaging frame rates. An abstract time-gated FLIM model is given, and the important temporal parameters are identified. Aiming at applications involving steady and transient fluorescence processes, two different operation modes with corresponding timing and lifetime-computation algorithms are designed. For steady imaging applications, high resolution and high frame rate are achieved with a one-excitation, one-sampling mode and a least-squares algorithm; for transient fluorescence, a one-excitation, two-sampling mode and a rapid lifetime determination algorithm are used, as sketched below.
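
In the simplest two-gate case, rapid lifetime determination (RLD) reduces to a closed-form ratio of gate counts, while the steady mode can use a least-squares fit over many gates. The sketch below assumes a mono-exponential decay and equal gate widths; the function names and toy numbers are illustrative, not taken from the paper.

```python
import numpy as np

def rld_two_gate(d1, d2, gate_sep_ns):
    """Two-gate rapid lifetime determination (RLD).

    For a mono-exponential decay sampled by two equal-width gates whose
    openings are separated by gate_sep_ns, the lifetime follows from the
    ratio of integrated counts: tau = gate_sep / ln(D1 / D2).
    d1, d2 may be scalars or per-pixel arrays (one image per gate).
    """
    d1 = np.asarray(d1, dtype=float)
    d2 = np.asarray(d2, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        return gate_sep_ns / np.log(d1 / d2)

def lsq_lifetime(t_ns, counts):
    """Least-squares lifetime estimate from many gates (steady mode).

    Fits ln(counts) = ln(A) - t/tau by linear regression, a common
    surrogate for an iterative non-linear fit when counts are high.
    """
    mask = counts > 0
    slope, _ = np.polyfit(t_ns[mask], np.log(counts[mask]), 1)
    return -1.0 / slope

# Example: a 2.0 ns decay probed with two gates opened 1.0 ns apart.
tau_true, gate_sep = 2.0, 1.0
d1, d2 = 1000.0, 1000.0 * np.exp(-gate_sep / tau_true)
print(rld_two_gate(d1, d2, gate_sep))   # ~2.0
```

The closed-form RLD needs only two samples per excitation, which is why it suits transient fluorescence, whereas the least-squares mode trades acquisition time for precision.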


2020 ◽  
Author(s):  
Nicholas L. Weilinger ◽  
Jeffrey M. LeDue ◽  
Kristopher T. Kahle ◽  
Brian A. MacVicar

Intracellular chloride ion ([Cl−]i) homeostasis is critical for synaptic neurotransmission, yet variations in subcellular domains are poorly understood owing to difficulties in obtaining quantitative, high-resolution measurements of dendritic [Cl−]i. We combined whole-cell patch clamp electrophysiology with simultaneous fluorescence lifetime imaging (FLIM) of the Cl− dye MQAE to quantitatively map dendritic Cl− levels under normal or pathological conditions. FLIM-based [Cl−]i estimates were corroborated by Rubi-GABA uncaging to measure EGABA. Low baseline [Cl−]i in dendrites required Cl− efflux via the K+-Cl− cotransporter KCC2 (SLC12A5). In contrast, pathological NMDA application generated spatially heterogeneous subdomains of high [Cl−]i that created dendritic blebs, a signature of ischemic stroke. These discrete regions of high [Cl−]i were caused by reversed KCC2 transport. Therefore, monitoring [Cl−]i microdomains with this new high-resolution FLIM-based technique identified novel roles for KCC2-dependent chloride transport in generating dendritic microdomains, with implications for disease.
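
A common way to turn MQAE lifetimes into chloride concentrations is the Stern-Volmer relation for collisional quenching. The sketch below assumes that relation; the unquenched lifetime and Ksv are placeholder values (in practice both come from an in-situ calibration), and the paper's actual calibration is not reproduced here.

```python
import numpy as np

def chloride_from_lifetime(tau_ns, tau0_ns=5.0, ksv_per_mM=0.02):
    """Convert MQAE fluorescence lifetime to [Cl-] via the Stern-Volmer relation.

    Collisional quenching of MQAE by chloride follows
        tau0 / tau = 1 + Ksv * [Cl-],
    so  [Cl-] = (tau0 / tau - 1) / Ksv.
    tau0_ns (unquenched lifetime) and ksv_per_mM are illustrative values only.
    """
    tau = np.asarray(tau_ns, dtype=float)
    return (tau0_ns / tau - 1.0) / ksv_per_mM

# Example: a toy 2x2 FLIM lifetime map (ns) -> [Cl-] map in mM.
lifetime_map = np.array([[4.5, 4.0], [3.5, 3.0]])
print(chloride_from_lifetime(lifetime_map))
```

Because the readout is a lifetime ratio rather than an intensity, the estimate is insensitive to dye concentration and photobleaching, which is what makes FLIM attractive for quantitative [Cl−]i mapping.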


2020 ◽  
Author(s):  
Yuan-I Chen ◽  
Yin-Jui Chang ◽  
Shih-Chu Liao ◽  
Trung Duc Nguyen ◽  
Jianchen Yang ◽  
...  

Fluorescence lifetime imaging microscopy (FLIM) is a powerful tool to quantify molecular compositions and study molecular states in the complex cellular environment, as the lifetime readings are not biased by fluorophore concentration or excitation power. However, current methods to generate FLIM images are either computationally intensive or unreliable when the number of photons acquired at each pixel is low. Here we introduce a new deep learning-based method termed flimGANE (fluorescence lifetime imaging based on Generative Adversarial Network Estimation) that can rapidly generate accurate and high-quality FLIM images even in photon-starved conditions. We demonstrate that our model is not only 258 times faster than the most popular time-domain least-squares estimation (TD_LSE) method but also provides more accurate analysis in barcode identification, cellular structure visualization, Förster resonance energy transfer characterization, and metabolic state analysis. With its advantages in speed and reliability, flimGANE is particularly useful in fundamental biological research and clinical applications, where ultrafast analysis is critical.
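
The sketch below is not the published flimGANE architecture; it only illustrates the adversarial idea in PyTorch, assuming a generator that maps a normalized photon-starved decay histogram to a well-sampled-looking one and a discriminator trained to tell the two apart. All layer sizes, names, and the training loop are placeholders.

```python
import torch
import torch.nn as nn

N_BINS = 256   # time bins per decay histogram

# Generator: maps a normalized photon-starved decay to a "high-quality" decay.
generator = nn.Sequential(
    nn.Linear(N_BINS, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, N_BINS), nn.Softmax(dim=-1),   # output is a normalized decay
)

# Discriminator: distinguishes well-sampled decays from generated ones.
discriminator = nn.Sequential(
    nn.Linear(N_BINS, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1),
)

bce = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

def train_step(low_count, high_count):
    """One adversarial update on a batch of (photon-starved, well-sampled) decays."""
    fake = generator(low_count)

    # Discriminator: real decays -> 1, generated decays -> 0.
    d_opt.zero_grad()
    d_loss = bce(discriminator(high_count), torch.ones(high_count.size(0), 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(fake.size(0), 1))
    d_loss.backward()
    d_opt.step()

    # Generator: try to fool the discriminator.
    g_opt.zero_grad()
    g_loss = bce(discriminator(fake), torch.ones(fake.size(0), 1))
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()

# Example with toy data: 32 decays of 256 bins each.
low = torch.rand(32, N_BINS); low = low / low.sum(-1, keepdim=True)
high = torch.rand(32, N_BINS); high = high / high.sum(-1, keepdim=True)
print(train_step(low, high))
```

Once trained, only the generator runs at inference time, which is why such an approach can be far faster than per-pixel iterative fitting while remaining robust at low photon counts.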

