Knowledge Framework for Deep Learning: Congenital Heart Disease

Author(s): Ritu Chauhan, Harleen Kaur
2020, Vol 5 (4), pp. 449
Author(s): Shuhei Toba, Yoshihide Mitani, Noriko Yodoya, Hiroyuki Ohashi, Hirofumi Sawada, ...
Heart, 2020, Vol 106 (13), pp. 960-961
Author(s): Rhodri Davies, Sonya V Babu-Narayan
2020, Vol 20 (1)
Author(s): Gerhard-Paul Diller, Julius Vahle, Robert Radke, Maria Luisa Benesch Vidal, ...

Abstract
Background: Deep learning algorithms are increasingly used for automatic medical imaging analysis and cardiac chamber segmentation. Especially in congenital heart disease, obtaining a sufficient number of training images and data anonymity issues remain of concern.
Methods: Progressive generative adversarial networks (PG-GAN) were trained on cardiac magnetic resonance imaging (MRI) frames from a nationwide prospective study to generate synthetic MRI frames. These synthetic frames were subsequently used to train segmentation networks (U-Net), and both the quality of the synthetic training images and the performance of the segmentation network were compared to U-Net-based solutions trained entirely on patient data.
Results: Cardiac MRI data from 303 patients with Tetralogy of Fallot were used for PG-GAN training. Using this model, we generated 100,000 synthetic images with a resolution of 256 × 256 pixels in 4-chamber and 2-chamber views. All synthetic samples were classified as anatomically plausible by human observers. The segmentation performance of the U-Net trained on data from 42 separate patients was statistically significantly better than that of the PG-GAN-based training in an external dataset of 50 patients; however, the actual difference in segmentation quality was negligible (< 1% in absolute terms for all models).
Conclusion: We demonstrate the utility of PG-GANs for generating large numbers of realistic-looking cardiac MRI images, even in rare cardiac conditions. The generated images are not subject to data anonymity and privacy concerns and can be shared freely between institutions. Training supervised deep learning segmentation networks on these synthetic data yielded results similar to direct training on original patient data.
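The two-stage pipeline in this abstract (a trained generative model supplying synthetic training data for a downstream segmentation network) can be sketched in a few lines of PyTorch. The sketch below is illustrative only: the study's actual PG-GAN and U-Net architectures are not reproduced, the tiny stand-in modules here are placeholders, and the assumption that the generator emits paired image/label channels (so the segmentation step has supervision) is ours, not stated in the abstract.

```python
# Hedged sketch of the pipeline: pretrained generator -> synthetic frames
# -> supervised training of a segmentation network. Stand-in modules only.
import torch
import torch.nn as nn

LATENT_DIM = 128

# Stand-in generator: latent vector -> 5 channels at 256x256.
# Channel 0 plays the role of the MRI frame; channels 1-4 a 4-class label map
# (assumed here so the segmentation network can be trained; the paper does
# not specify how labels for synthetic frames were obtained).
generator = nn.Sequential(
    nn.Unflatten(1, (LATENT_DIM, 1, 1)),
    nn.ConvTranspose2d(LATENT_DIM, 32, kernel_size=4),   # 1x1 -> 4x4
    nn.ReLU(),
    nn.Upsample(size=(256, 256), mode="bilinear"),       # -> 256x256
    nn.Conv2d(32, 5, kernel_size=3, padding=1),
)

# Stand-in "U-Net": any image -> per-pixel class-logits model fits this slot.
unet = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 4, 3, padding=1),                      # 4-class logits
)

optimizer = torch.optim.Adam(unet.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

generator.eval()
for step in range(200):                                  # study: 100,000 frames
    with torch.no_grad():                                # generator is frozen
        z = torch.randn(8, LATENT_DIM)
        out = generator(z)
        images = out[:, :1]                              # synthetic MRI frames
        labels = out[:, 1:].argmax(dim=1)                # synthetic masks
    logits = unet(images)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because only the synthetic frames (not the original patient scans) are needed for the segmentation step, this arrangement is what lets the training data be shared freely between institutions, as the conclusion notes.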


Author(s): Rima Arnaout, Lara Curran, Yili Zhao, Jami C. Levine, Erin Chinn, ...

Abstract
Congenital heart disease (CHD) is the most common birth defect. Fetal survey ultrasound is recommended worldwide, including five views of the heart that together could detect 90% of complex CHD. In practice, however, sensitivity is as low as 30%. We hypothesized that poor detection results from challenges in acquiring and interpreting diagnostic-quality cardiac views, and that deep learning could improve complex CHD detection. Using 107,823 images from 1,326 retrospective echocardiograms and surveys from 18-24 week fetuses, we trained an ensemble of neural networks to (i) identify recommended cardiac views and (ii) distinguish between normal hearts and complex CHD. Finally, (iii) we used segmentation models to calculate standard fetal cardiothoracic measurements. In a test set of 4,108 fetal surveys (0.9% CHD, >4.4 million images, about 40 times the size of the training dataset), the model achieved an AUC of 0.99, 95% sensitivity (95% CI, 84-99), 96% specificity (95% CI, 95-97), and 100% NPV in distinguishing normal from abnormal hearts. Sensitivity was comparable to clinicians' task-for-task and remained robust on external and lower-quality images. The model's decisions were based on clinically relevant features. Cardiac measurements correlated with reported measures for normal and abnormal hearts. Applied to guidelines-recommended imaging, ensemble learning models could significantly improve detection of fetal CHD and expand telehealth options for prenatal care at a time when the COVID-19 pandemic has further limited patient access to trained providers. This is the first use of deep learning to ∼double standard clinical performance on a critical and global diagnostic challenge.
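Step (iii) of this pipeline, deriving a standard cardiothoracic measurement from segmentation output, is the most mechanical part and easy to illustrate. The paper does not publish its exact formulas, so the sketch below is an assumption-laden example: it computes a cardiothoracic area ratio from hypothetical binary heart and thorax masks of the kind a segmentation model would produce, and the function name and toy masks are ours.

```python
# Hedged sketch: a cardiothoracic measurement from segmentation masks.
import numpy as np

def cardiothoracic_area_ratio(heart_mask: np.ndarray,
                              thorax_mask: np.ndarray) -> float:
    """Ratio of heart area to thorax area in a 4-chamber-view frame.

    Both masks are boolean arrays of the same shape. Pixel spacing cancels
    out of the ratio, so no physical calibration is needed.
    """
    heart_px = heart_mask.sum()
    thorax_px = thorax_mask.sum()
    if thorax_px == 0:
        raise ValueError("empty thorax mask")
    return float(heart_px / thorax_px)

# Toy example: a small cardiac disc inside a larger thoracic disc.
yy, xx = np.mgrid[:256, :256]
thorax = (yy - 128) ** 2 + (xx - 128) ** 2 < 100 ** 2
heart = (yy - 140) ** 2 + (xx - 120) ** 2 < 45 ** 2
print(f"CT area ratio: {cardiothoracic_area_ratio(heart, thorax):.2f}")  # ~0.20
```

Because the measurement is a pure function of the masks, it can be audited independently of the neural networks that produced them, which is one reason such derived measurements correlate well with clinician-reported values.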

