Patch-based 3D U-Net and transfer learning for longitudinal piglet brain segmentation on MRI

Author(s): P. Coupeau, J.-B. Fasquel, E. Mazerand, P. Menei, C.N. Montero-Menei, ...
Author(s): Camilo Bermudez, Justin Blaber, Samuel W. Remedios, Jess E. Reynolds, Catherine Lebel, ...

2020, Vol 7 (06)
Author(s): Camilo Bermudez, Samuel W. Remedios, Karthik Ramadass, Maureen McHugo, Stephan Heckers, ...

2020
Author(s): Yun Wang, Fateme Sadat Haghpanah, Natalie Aw, Andrew Laine, Jonathan Posner

Abstract: The months between birth and age 2 are increasingly recognized as a critical period for neurodevelopment, with potentially lifelong implications for cognitive functioning. However, little is known about the growth trajectories of brain structure and function across this period, in large part because of insufficient approaches for analyzing infant MRI scans acquired at different ages, especially for brain segmentation. Addressing technical gaps in infant brain segmentation would significantly improve our capacity to efficiently measure and identify relevant infant brain structures and connectivity, and their role in long-term development. In this paper, we propose a transfer-learning approach based on the convolutional neural network (CNN) image segmentation architecture QuickNAT to segment brain structures for newborns and 6-month-old infants separately. We pre-trained QuickNAT on auxiliary labels from a large-scale dataset, fine-tuned it on manual labels, and then cross-validated the model's performance on two separate datasets. Compared to other commonly used methods, our transfer-learning approach showed superior segmentation performance on both newborns and 6-month-old infants. Moreover, our approach improved hippocampus segmentation in preterm infants.
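Segmentation performance in studies like this is commonly summarized with the Dice similarity coefficient between predicted and manually annotated label masks. A minimal sketch of that metric (the voxel sets below are illustrative placeholders, not data from the paper):

```python
def dice_coefficient(pred, truth):
    """Dice similarity: 2*|A ∩ B| / (|A| + |B|) over binary voxel masks."""
    pred, truth = set(pred), set(truth)
    intersection = len(pred & truth)
    total = len(pred) + len(truth)
    # Two empty masks are conventionally a perfect match.
    return 1.0 if total == 0 else 2.0 * intersection / total

# Toy example: voxel indices labeled as hippocampus by the model vs. a rater.
predicted = {(0, 0, 0), (0, 0, 1), (0, 1, 1), (1, 1, 1)}
manual = {(0, 0, 1), (0, 1, 1), (1, 1, 1), (1, 1, 0)}
print(dice_coefficient(predicted, manual))  # 3 shared voxels of 4+4 -> 0.75
```

A Dice score of 1.0 means perfect overlap with the manual labels; reporting it per structure (e.g. hippocampus) is what allows the per-region comparisons the abstract describes.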


Author(s): Rafał Kartaszyński, Paweł Mikołajczak

2019
Author(s): Qi Yuan, Alejandro Santana-Bonilla, Martijn Zwijnenburg, Kim Jelfs

The chemical space for novel electronic donor-acceptor oligomers with targeted properties was explored using deep generative models and transfer learning. A general Recurrent Neural Network model was trained on the ChEMBL database to generate chemically valid SMILES strings. The parameters of the general Recurrent Neural Network were then fine-tuned via transfer learning using the electronic donor-acceptor database from the Computational Material Repository to generate novel donor-acceptor oligomers. Six transfer-learning models were developed, each trained on a different subset of the donor-acceptor database. We conclude that electronic properties such as HOMO-LUMO gaps and dipole moments of the training sets can be learned from the SMILES representation with deep generative models, and that the chemical space of the training sets can be efficiently explored. This approach identified approximately 1700 new molecules with promising electronic properties (HOMO-LUMO gap &lt;2 eV and dipole moment &lt;2 Debye), six times more than in the original database. Among the molecular transformations, the deep generative model learned to produce novel molecules by trading off selected atomic substitutions (such as halogenation or methylation) against molecular features such as the spatial extension of the oligomer. The method can be extended as a plausible source of new chemical combinations for effectively exploring chemical space for targeted properties.
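The screening criterion described above (HOMO-LUMO gap &lt;2 eV and dipole moment &lt;2 Debye) amounts to a simple property filter over generated candidates. A minimal sketch, where the SMILES strings and property values are hypothetical placeholders rather than molecules from the study:

```python
def is_promising(props, max_gap_ev=2.0, max_dipole_debye=2.0):
    """Keep candidates whose computed HOMO-LUMO gap and dipole moment
    both fall strictly below the target thresholds."""
    return (props["gap_ev"] < max_gap_ev
            and props["dipole_debye"] < max_dipole_debye)

# Hypothetical generated candidates: SMILES string -> computed properties.
candidates = {
    "c1ccsc1-c1ccc(C#N)cc1": {"gap_ev": 1.6, "dipole_debye": 1.2},
    "c1ccoc1-c1ccncc1": {"gap_ev": 2.4, "dipole_debye": 0.9},
    "c1ccsc1-c1ccncc1": {"gap_ev": 1.9, "dipole_debye": 2.5},
}
hits = [smiles for smiles, p in candidates.items() if is_promising(p)]
print(hits)  # only the first candidate passes both thresholds
```

In practice the property values would come from an electronic-structure calculation on each generated SMILES, and the filter is what turns a large generated pool into the ~1700 promising molecules the abstract reports.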


2014
Author(s): Hiroshi Kanayama, Youngja Park, Yuta Tsuboi, Dongmook Yi
