Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM

2002 ◽  
Vol 14 (9) ◽  
pp. 2039-2041 ◽  
Author(s):  
J. Schmidhuber ◽  
F. Gers ◽  
D. Eck

In response to Rodriguez's recent article (2001), we compare the performance of simple recurrent nets and long short-term memory recurrent nets on context-free and context-sensitive languages.
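The benchmark languages in question (e.g., a^n b^n, and the context-sensitive a^n b^n c^n) cannot be recognized by any finite automaton, because membership requires unbounded counting; Gers and Schmidhuber report that trained LSTM cell states behave like such counters. A minimal sketch of the counting strategy for a^n b^n c^n, using a hypothetical checker written for illustration (not code from the paper):

```python
def is_anbncn(s):
    """Membership test for the context-sensitive language a^n b^n c^n (n >= 1),
    using explicit per-symbol counters -- an idealization of the counter-like
    cell-state dynamics reported for LSTM on this task."""
    counts = {"a": 0, "b": 0, "c": 0}
    order = "abc"
    phase = 0  # which block (a, b, or c) we are currently scanning
    for ch in s:
        if ch not in counts:
            return False        # symbol outside the alphabet
        idx = order.index(ch)
        if idx < phase:
            return False        # blocks out of order, e.g. "abab"
        phase = idx
        counts[ch] += 1
    # accept only if all three counters agree and the string is nonempty
    return counts["a"] == counts["b"] == counts["c"] > 0
```

Any fixed-memory (regular) recognizer fails here for large enough n, which is why these languages probe whether a recurrent net has learned genuinely nonregular structure.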

2005 ◽  
Vol 14 (01n02) ◽  
pp. 329-342 ◽  
Author(s):  
Judy A. Franklin ◽  
Krystal K. Locke

We present results from experiments using several pitch representations for jazz-oriented musical tasks performed by a recurrent neural network. We have run experiments with several kinds of recurrent networks for this purpose, and have found that Long Short-Term Memory (LSTM) networks provide the best results. We show that a new pitch representation called Circles of Thirds works as well as two other published representations for these tasks, yet it is more succinct and enables faster learning. We then discuss limited results using other types of networks on the same tasks.
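The abstract does not spell out the encoding, but one construction consistent with the name and the succinctness claim partitions the 12 pitch classes by the four circles of major thirds and the three circles of minor thirds, yielding a 7-bit code (two bits set) instead of a 12-bit one-hot vector. A hypothetical sketch under that assumption:

```python
def circles_of_thirds(pitch_class):
    """Encode a pitch class (0 = C .. 11 = B) as a 7-bit vector:
    bits 0-3: which circle of major thirds it lies on (pitch_class mod 4,
              e.g. {C, E, G#} all map to circle 0);
    bits 4-6: which circle of minor thirds it lies on (pitch_class mod 3,
              e.g. {C, Eb, F#, A} all map to circle 0).
    Since gcd(3, 4) = 1, the pair of circles identifies the pitch class
    uniquely (Chinese remainder theorem)."""
    vec = [0] * 7
    vec[pitch_class % 4] = 1
    vec[4 + pitch_class % 3] = 1
    return vec
```

Such a code is shorter than one-hot while keeping harmonically related pitches (those sharing a circle) partially overlapping, which plausibly accounts for the faster learning the authors report.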


2020 ◽  
Author(s):  
Abdolreza Nazemi ◽  
Johannes Jakubik ◽  
Andreas Geyer-Schulz ◽  
Frank J. Fabozzi
