Holographic Image Reconstruction with Phase Recovery and Autofocusing Using Recurrent Neural Networks

ACS Photonics ◽  
2021 ◽  
Author(s):  
Luzhe Huang ◽  
Tairan Liu ◽  
Xilin Yang ◽  
Yi Luo ◽  
Yair Rivenson ◽  
...  
2017 ◽  
Vol 7 (2) ◽  
pp. 17141-17141 ◽  
Author(s):  
Yair Rivenson ◽  
Yibo Zhang ◽  
Harun Günaydın ◽  
Da Teng ◽  
Aydogan Ozcan

Author(s):  
Sven Behnke

Successful image reconstruction requires the recognition of a scene and the generation of a clean image of that scene. We propose to use recurrent neural networks for both analysis and synthesis. The networks have a hierarchical architecture that represents images in multiple scales with different degrees of abstraction. The mapping between these representations is mediated by a local connection structure. We supply the networks with degraded images and train them to reconstruct the originals iteratively. This iterative reconstruction makes it possible to use partial results as context information to resolve ambiguities. We demonstrate the power of the approach using three examples: superresolution, fill-in of occluded parts, and noise removal/contrast enhancement. We also reconstruct images from sequences of degraded images.
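To make the iterative idea concrete, the following minimal PyTorch sketch (an illustrative assumption, not the hierarchical multi-scale architecture described above) applies a small shared convolutional update recurrently, so each partial reconstruction is fed back as context for the next refinement step. Layer sizes and the number of iterations are arbitrary choices for illustration.

# Minimal sketch of iterative reconstruction with a recurrent update.
# Not the paper's architecture; layer sizes and step count are illustrative.
import torch
import torch.nn as nn

class IterativeReconstructor(nn.Module):
    def __init__(self, channels=1, hidden=16, steps=5):
        super().__init__()
        self.steps = steps
        # Shared (recurrent) convolutional update applied at every iteration;
        # its input is the degraded image concatenated with the current estimate.
        self.update = nn.Sequential(
            nn.Conv2d(2 * channels, hidden, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, channels, 3, padding=1),
        )

    def forward(self, degraded):
        estimate = degraded  # initialize the reconstruction with the degraded input
        for _ in range(self.steps):
            # The current estimate provides context that helps resolve ambiguities.
            residual = self.update(torch.cat([degraded, estimate], dim=1))
            estimate = estimate + residual  # refine the estimate iteratively
        return estimate

# Usage: refine a batch of degraded 32x32 grayscale images.
model = IterativeReconstructor()
noisy = torch.rand(4, 1, 32, 32)
clean_estimate = model(noisy)
print(clean_estimate.shape)  # torch.Size([4, 1, 32, 32])

Training such a model against clean target images (e.g., with a pixel-wise loss) would correspond to the "train them to reconstruct the originals iteratively" setup described in the abstract.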


2019 ◽  
Vol 38 (1) ◽  
pp. 280-290 ◽  
Author(s):  
Chen Qin ◽  
Jo Schlemper ◽  
Jose Caballero ◽  
Anthony N. Price ◽  
Joseph V. Hajnal ◽  
...  

2020 ◽  
Author(s):  
Dean Sumner ◽  
Jiazhen He ◽  
Amol Thakkar ◽  
Ola Engkvist ◽  
Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models compared to non-augmented baselines. Here, we propose a novel data augmentation method we call "Levenshtein augmentation", which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: a transformer and a sequence-to-sequence recurrent neural network with attention. Levenshtein augmentation yielded increased performance over both non-augmented data and conventional SMILES-randomization augmentation when used to train the baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as "attentional gain", an enhancement in the underlying network's ability to recognize molecular motifs.
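As a hedged illustration of the core ingredient (a sketch under assumptions, not the authors' exact pairing procedure), the Python snippet below computes a normalized Levenshtein similarity between a reactant SMILES and its product SMILES; such a score could be used to select or weight augmented training pairs. The example SMILES pair is hypothetical.

# Illustrative only: edit-distance similarity between reactant and product SMILES.
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(
                prev[j] + 1,               # deletion
                curr[j - 1] + 1,           # insertion
                prev[j - 1] + (ca != cb),  # substitution
            ))
        prev = curr
    return prev[-1]

def similarity(reactant: str, product: str) -> float:
    """Normalized similarity in [0, 1]; 1.0 means identical strings."""
    dist = levenshtein(reactant, product)
    return 1.0 - dist / max(len(reactant), len(product), 1)

# Usage with a hypothetical esterification-like SMILES pair.
print(similarity("CC(=O)O.OCC", "CC(=O)OCC"))  # ~0.82

A higher score indicates a larger shared sub-sequence between reactant and product strings, which is the kind of local similarity the augmentation strategy is built around.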

