Stochastic EM
Recently Published Documents


TOTAL DOCUMENTS: 65 (last five years: 6)
H-INDEX: 9 (last five years: 0)



2021 · Author(s): Zeljko Kereta, Robert Twyman, Simon R. Arridge, Kris Thielemans, Bangti Jin


2021 · Author(s): Belhal Karimi, Ping Li


2020 · Vol. 64 (1), pp. 33-62 · Author(s): Dina M. Sabry, Ahmed M. Gad, Ramadan H. Mohamed


2020 · Vol. 99, pp. 102671 · Author(s): Shaho Zarei, Adel Mohammadpour


2020 · Author(s): Kevin Yang, Wengong Jin, Kyle Swanson, Regina Barzilay, Tommi S. Jaakkola

Generative models in molecular design tend to be richly parameterized, data-hungry neural models, since they must produce complex structured objects as outputs. Estimating such models can therefore be difficult when sufficient training data are unavailable. In this paper, we propose a surprisingly effective self-training approach that iteratively creates additional molecular targets. We first pre-train the generative model together with a simple property predictor. The property predictor is then used as a likelihood model for filtering candidate structures sampled from the generative model. Additional targets are produced over the course of stochastic EM iterations and used to maximize the log-likelihood that the candidate structures are accepted. Because the generative model is already reasonable after pre-training, a simple rejection (re-weighting) sampler suffices to draw posterior samples. We demonstrate significant gains over strong baselines for both unconditional and conditional molecular design. In particular, our approach outperforms the previous state of the art in conditional molecular design by more than 10% absolute.
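To make the training loop described in this abstract concrete, the sketch below shows a minimal stochastic EM self-training cycle with rejection filtering. The GenerativeModel and PropertyPredictor classes and their methods are hypothetical stand-ins, not the authors' implementation; a real system would sample molecular graphs or SMILES strings and fit a neural likelihood rather than the toy placeholders used here.

    # Minimal sketch of stochastic EM self-training with rejection filtering.
    # All model interfaces below are hypothetical placeholders.
    import random

    class GenerativeModel:
        """Stand-in for a pre-trained molecular generator."""
        def sample(self, n):
            # A real model would decode n candidate molecules.
            return [f"candidate_{random.randrange(10**6)}" for _ in range(n)]
        def fit(self, targets):
            # M-step: maximize the log-likelihood of the accepted targets.
            pass

    class PropertyPredictor:
        """Stand-in likelihood model p(property satisfied | structure)."""
        def prob_accept(self, structure):
            # A real predictor would score the candidate structure.
            return random.random()

    def stochastic_em(generator, predictor, rounds=5, n_samples=100):
        for _ in range(rounds):
            # E-step: draw candidates and keep each one with probability
            # given by the predictor (simple rejection / re-weighting sampler).
            candidates = generator.sample(n_samples)
            accepted = [s for s in candidates
                        if random.random() < predictor.prob_accept(s)]
            # M-step: refit the generator on the augmented target set.
            generator.fit(accepted)
        return generator

    model = stochastic_em(GenerativeModel(), PropertyPredictor())

The design point the abstract makes is visible in the E-step: because the pre-trained generator is already close to the target distribution, plain rejection sampling against the property predictor is enough to approximate posterior samples, with no need for a more elaborate sampler.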


