Representation Similarity Analysis for Efficient Task Taxonomy & Transfer Learning

Author(s):  
Kshitij Dwivedi ◽  
Gemma Roig
2019 ◽  
Author(s):  
Qi Yuan ◽  
Alejandro Santana-Bonilla ◽  
Martijn Zwijnenburg ◽  
Kim Jelfs

The chemical space of novel electronic donor-acceptor oligomers with targeted properties was explored using deep generative models and transfer learning. A general recurrent neural network (RNN) model was trained on the ChEMBL database to generate chemically valid SMILES strings. The parameters of this general RNN were then fine-tuned via transfer learning, using the electronic donor-acceptor database from the Computational Materials Repository, to generate novel donor-acceptor oligomers. Six transfer-learning models were developed, each trained on a different subset of the donor-acceptor database. We conclude that electronic properties such as HOMO-LUMO gaps and dipole moments of the training sets can be learned from the SMILES representation with deep generative models, and that the chemical space of the training sets can be explored efficiently. The approach identified approximately 1700 new molecules with promising electronic properties (HOMO-LUMO gap <2 eV and dipole moment <2 Debye), six times more than in the original database. Among the molecular transformations, the deep generative model learned to produce novel molecules by trading off selected atomic substitutions (such as halogenation or methylation) against molecular features such as the spatial extension of the oligomer. The method can serve as a source of new chemical combinations for effectively exploring the chemical space for targeted properties.
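The abstract describes pretraining a character-level RNN on general SMILES and then fine-tuning it on a small donor-acceptor set. The PyTorch sketch below illustrates that transfer-learning step in miniature; the toy vocabulary, architecture, hyperparameters, and the commented-out checkpoint path are illustrative assumptions, not the paper's actual setup.

```python
# Minimal sketch of SMILES-generator transfer learning (assumed setup).
import torch
import torch.nn as nn

VOCAB = sorted(set("CNOSFclnos()=#123456789[]+-%$"))  # toy SMILES alphabet

class SmilesRNN(nn.Module):
    """Character-level LSTM that predicts the next SMILES character."""
    def __init__(self, vocab_size, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 64)
        self.lstm = nn.LSTM(64, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state

def train_step(model, batch, optimizer, criterion):
    """One next-character prediction step on a batch of encoded SMILES."""
    inputs, targets = batch[:, :-1], batch[:, 1:]
    logits, _ = model(inputs)
    loss = criterion(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

model = SmilesRNN(len(VOCAB))
# Transfer learning: in practice, reload weights pretrained on ChEMBL, e.g.
# model.load_state_dict(torch.load("chembl_pretrained.pt"))  # hypothetical path
# then continue training on the small donor-acceptor set with a reduced LR.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

toy_batch = torch.randint(0, len(VOCAB), (8, 40))  # stand-in for encoded SMILES
print(train_step(model, toy_batch, optimizer, criterion))
```

The key design choice is that fine-tuning reuses all pretrained weights and only lowers the learning rate, so the model keeps its grasp of SMILES grammar while shifting toward donor-acceptor chemistry.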


2014 ◽  
Author(s):  
Hiroshi Kanayama ◽  
Youngja Park ◽  
Yuta Tsuboi ◽  
Dongmook Yi

2020 ◽  
Author(s):  
Susan L. Benear ◽  
Elizabeth A. Horwath ◽  
Emily Cowan ◽  
M. Catalina Camacho ◽  
Chi Ngo ◽  
...  

The medial temporal lobe (MTL) undergoes critical developmental change throughout childhood, which aligns with developmental changes in episodic memory. We used representational similarity analysis to compare neural pattern similarity for children and adults in hippocampus and parahippocampal cortex during naturalistic viewing of clips from the same movie or different movies. Some movies were more familiar to participants than others. Neural pattern similarity was generally lower for clips from the same movie, indicating that related content taxes pattern separation-like processes. However, children showed this effect only for movies with which they were familiar, whereas adults showed the effect consistently. These data suggest that children need more exposures to stimuli in order to show mature pattern separation processes.
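The core computation here is representational similarity analysis: correlate voxel patterns evoked by clip pairs and contrast same-movie with different-movie pairs. The numpy sketch below shows that logic on simulated data; array sizes and labels are assumptions, not values from the study.

```python
# RSA sketch: same-movie vs. different-movie pattern similarity (simulated).
import numpy as np

rng = np.random.default_rng(0)
n_clips, n_voxels = 12, 500
patterns = rng.standard_normal((n_clips, n_voxels))  # clip x voxel activity
movie_id = np.repeat([0, 1, 2], 4)                   # 3 movies, 4 clips each

sim = np.corrcoef(patterns)                          # clip-by-clip similarity
iu = np.triu_indices(n_clips, k=1)                   # unique clip pairs
same = movie_id[iu[0]] == movie_id[iu[1]]

same_movie_sim = sim[iu][same].mean()
diff_movie_sim = sim[iu][~same].mean()
# Lower same-movie than different-movie similarity would be the
# pattern-separation-like signature reported in the study.
print(same_movie_sim, diff_movie_sim)
```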


2020 ◽  
Author(s):  
Miriam E. Weaverdyck ◽  
Mark Allen Thornton ◽  
Diana Tamir

Each individual experiences mental states in their own idiosyncratic way, yet perceivers are able to accurately understand a huge variety of states across unique individuals. How do they accomplish this feat? Do people think about their own anger in the same ways as another person’s? Is reading about someone’s anxiety the same as seeing it? Here, we test the hypothesis that a common conceptual core unites mental state representations across contexts. Across three studies, participants judged the mental states of multiple targets, including a generic other, the self, a socially close other, and a socially distant other. Participants viewed mental state stimuli in multiple modalities, including written scenarios and images. Using representational similarity analysis, we found that brain regions associated with social cognition expressed stable neural representations of mental states across both targets and modalities. This suggests that people use stable models of mental states across different people and contexts.
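Testing whether representations are stable across targets and modalities is typically done at the second order: build one representational dissimilarity matrix (RDM) over mental states per context, then correlate the RDMs. The sketch below illustrates that comparison on simulated data; the contexts, noise levels, and sizes are assumptions, not the study's parameters.

```python
# Second-order RSA sketch: correlate per-context RDMs (simulated data).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_states, n_voxels = 20, 300
shared = rng.standard_normal((n_states, n_voxels))  # common conceptual core

def context_rdm(noise_scale):
    """RDM over mental states for one context (e.g. self vs. other, text vs. image)."""
    patterns = shared + noise_scale * rng.standard_normal((n_states, n_voxels))
    return pdist(patterns, metric="correlation")  # condensed RDM

rdm_self_text = context_rdm(0.5)
rdm_other_image = context_rdm(0.5)
rho, _ = spearmanr(rdm_self_text, rdm_other_image)
# A high correlation indicates the same state geometry across contexts.
print(f"cross-context RDM correlation: {rho:.2f}")
```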


2020 ◽  
Author(s):  
Pathikkumar Patel ◽  
Bhargav Lad ◽  
Jinan Fiaidhi

Over the last few years, RNN models have been used extensively and have proven well suited to sequence and text data. RNNs have achieved state-of-the-art performance in several applications such as text classification, sequence-to-sequence modelling, and time-series forecasting. In this article we review different machine learning and deep learning approaches for text data and examine the results obtained from these methods. This work also explores the use of transfer learning in NLP and how it affects model performance on a specific application: sentiment analysis.
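One common transfer-learning recipe for sentiment analysis is to reuse an encoder whose weights come from a pretrained language model and train only a small classification head. The PyTorch sketch below illustrates that pattern; the vocabulary size, architecture, checkpoint path, and data are illustrative assumptions, not a setup taken from the article.

```python
# Transfer-learning sketch for sentiment classification (assumed setup).
import torch
import torch.nn as nn

class SentimentClassifier(nn.Module):
    def __init__(self, vocab_size=5000, hidden=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 100)
        self.encoder = nn.LSTM(100, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, tokens):
        _, (h, _) = self.encoder(self.embed(tokens))
        return self.head(h[-1])  # classify from the final hidden state

model = SentimentClassifier()
# In practice, load embedding/encoder weights from a language model
# pretrained on a large unlabeled corpus, e.g.:
# model.embed.load_state_dict(torch.load("pretrained_embed.pt"))  # hypothetical
for p in model.embed.parameters():
    p.requires_grad = False   # freeze transferred layers
for p in model.encoder.parameters():
    p.requires_grad = False
optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)

# Toy training step on random token ids and labels.
tokens = torch.randint(0, 5000, (4, 25))
labels = torch.tensor([0, 1, 1, 0])
loss = nn.functional.cross_entropy(model(tokens), labels)
loss.backward()
optimizer.step()
print(loss.item())
```

Freezing the transferred layers keeps the small labeled sentiment set from overwriting what the pretrained model learned, which is the main point of transfer learning in low-data NLP settings.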


1995 ◽  
Author(s):  
Charles E. Lance ◽  
David L. Mayfield ◽  
Michael J. Kavanagh ◽  
R. B. Gould
