Cascade convolutional neural network‐long short‐term memory recurrent neural networks for automatic tonal and nontonal preclassification‐based Indian language identification

2020 ◽ Vol 37 (5) ◽ Author(s): Chuya China Bhanja ◽ Mohammad A. Laskar ◽ Rabul H. Laskar

2020 ◽ Vol 34 (04) ◽ pp. 4989-4996 ◽ Author(s): Ekaterina Lobacheva ◽ Nadezhda Chirkova ◽ Alexander Markovich ◽ Dmitry Vetrov

One of the most popular approaches to neural network compression is sparsification: learning sparse weight matrices. In structured sparsification, weights are set to zero in groups corresponding to structural units, e.g., neurons. We further develop the structured sparsification approach for gated recurrent neural networks such as the Long Short-Term Memory (LSTM). Specifically, in addition to sparsifying individual weights and neurons, we propose sparsifying the preactivations of gates. This makes some gates constant and simplifies the LSTM structure. We test our approach on text classification and language modeling tasks. Our method improves the neuron-wise compression of the model in most tasks. We also observe that the resulting structure of gate sparsity depends on the task, and we connect the learned structures to the specifics of the particular tasks.
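A minimal sketch of the idea in PyTorch follows. It is not the authors' implementation: in place of their regularizer it uses a simple group-lasso penalty with one group per (gate, hidden unit) pair, so that driving a group's weights to zero leaves that gate's preactivation constant (equal to its bias). The names GroupSparseLSTM and gate_group_penalty and the penalty weight are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GroupSparseLSTM(nn.Module):
    """LSTM whose gate preactivations can be pushed toward constants via a
    group-lasso penalty (an illustrative stand-in for the paper's method)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.hidden_size = hidden_size

    def forward(self, x):
        return self.lstm(x)

    def gate_group_penalty(self):
        # weight_ih_l0 / weight_hh_l0 have shape (4 * hidden, input or hidden);
        # rows are ordered [input gate | forget gate | cell | output gate].
        w_ih = self.lstm.weight_ih_l0
        w_hh = self.lstm.weight_hh_l0
        w = torch.cat([w_ih, w_hh], dim=1)         # (4 * hidden, input + hidden)
        groups = w.view(4, self.hidden_size, -1)   # one group per (gate, unit)
        return groups.norm(dim=2).sum()            # sum of L2 norms over groups

# Training-loop usage with a dummy loss (hypothetical sizes and penalty weight).
model = GroupSparseLSTM(input_size=128, hidden_size=256)
x = torch.randn(8, 20, 128)                        # (batch, time, features)
out, _ = model(x)
loss = out.pow(2).mean() + 1e-3 * model.gate_group_penalty()
loss.backward()
```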


Author(s): Javier Gonzalez-Dominguez ◽ Ignacio Lopez-Moreno ◽ Haşim Sak ◽ Joaquin Gonzalez-Rodriguez ◽ Pedro J. Moreno

2021 ◽ Vol 7 (2) ◽ pp. 113-121 ◽ Author(s): Firman Pradana Rachman

Everyone has an opinion about a product, a public figure, or a government policy, and these opinions spread across social media. Processing such opinion data is called sentiment analysis. Processing these large volumes of opinion data requires more than machine learning alone; deep learning combined with NLP (Natural Language Processing) techniques can also be used. This study compares several deep learning models, such as CNN (Convolutional Neural Network), RNN (Recurrent Neural Networks), LSTM (Long Short-Term Memory), and several of their variants, for sentiment analysis of Amazon and Yelp product reviews.
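As an illustration of one of the compared model families, a minimal LSTM sentiment classifier in PyTorch might look as follows; the vocabulary size, layer widths, and binary output are assumptions, not the study's configuration.

```python
import torch
import torch.nn as nn

class LSTMSentimentClassifier(nn.Module):
    """Minimal LSTM baseline of the kind compared in the study
    (hypothetical layer sizes; not the authors' exact setup)."""

    def __init__(self, vocab_size=20000, embed_dim=100, hidden_size=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, 1)    # positive vs. negative

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)           # (batch, time, embed_dim)
        _, (h_n, _) = self.lstm(embedded)              # final hidden state
        return torch.sigmoid(self.classifier(h_n[-1])).squeeze(-1)

# Example forward pass on a batch of padded token-id sequences.
model = LSTMSentimentClassifier()
batch = torch.randint(1, 20000, (4, 50))               # 4 reviews, 50 tokens each
probabilities = model(batch)                            # P(positive) per review
```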


2021 ◽ Vol 2021 ◽ pp. 1-11 ◽ Author(s): Hangxia Zhou ◽ Qian Liu ◽ Ke Yan ◽ Yang Du

Short-term photovoltaic (PV) energy generation forecasting models are important for stabilizing power integration between PV systems and the smart grid in artificial intelligence (AI)-driven Internet of Things (IoT) modeling of smart cities. With the recent development of AI and IoT technologies, deep learning techniques can achieve more accurate energy generation forecasts for PV systems. Traditional PV energy generation forecasting methods have difficulty accounting for external feature variables such as seasonality. In this study, we propose a hybrid deep learning method that combines clustering techniques, a convolutional neural network (CNN), long short-term memory (LSTM), and an attention mechanism with a wireless sensor network to overcome these difficulties. The proposed method is divided into three stages: clustering, training, and forecasting. In the clustering stage, correlation analysis and self-organizing maps are employed to select the most relevant factors in the historical data. In the training stage, a convolutional neural network, a long short-term memory network, and an attention mechanism are combined into a hybrid deep learning model that performs the forecasting task. In the forecasting stage, the most appropriate trained model is selected based on the month of the test data. The experimental results show significantly higher prediction accuracy for all time intervals compared with existing methods, including traditional artificial neural networks, long short-term memory networks, and an algorithm combining a long short-term memory network with an attention mechanism.
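A rough sketch of the training-stage network (CNN, LSTM, and attention combined) is given below in PyTorch. The layer sizes, the additive-attention variant, and the input features are assumptions; the clustering stage (correlation analysis and self-organizing maps) and the month-based model selection are not shown.

```python
import torch
import torch.nn as nn

class CNNLSTMAttention(nn.Module):
    """Sketch of a hybrid forecasting network: 1-D convolution over the input
    window, an LSTM, and a simple attention over the LSTM outputs. Layer sizes
    and the attention variant are assumptions, not the paper's configuration."""

    def __init__(self, n_features=5, conv_channels=32, hidden_size=64):
        super().__init__()
        self.conv = nn.Conv1d(n_features, conv_channels, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(conv_channels, hidden_size, batch_first=True)
        self.attn_score = nn.Linear(hidden_size, 1)
        self.head = nn.Linear(hidden_size, 1)            # next-interval PV output

    def forward(self, x):                                # x: (batch, time, features)
        z = torch.relu(self.conv(x.transpose(1, 2)))     # (batch, channels, time)
        h, _ = self.lstm(z.transpose(1, 2))              # (batch, time, hidden)
        weights = torch.softmax(self.attn_score(h), dim=1)  # attention over time
        context = (weights * h).sum(dim=1)               # weighted sum of states
        return self.head(context).squeeze(-1)

# One forecast per sample from a 24-step window of 5 weather/PV features.
model = CNNLSTMAttention()
window = torch.randn(16, 24, 5)
forecast = model(window)
```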


2005 ◽ Vol 14 (01n02) ◽ pp. 329-342 ◽ Author(s): Judy A. Franklin ◽ Krystal K. Locke

We present results from experiments in using several pitch representations for jazz-oriented musical tasks performed by a recurrent neural network. We have run experiments with several kinds of recurrent networks for this purpose, and have found that Long Short-term Memory networks provide the best results. We show that a new pitch representation called Circles of Thirds works as well as two other published representations for these tasks, yet it is more succinct and enables faster learning. We then discuss limited results using other types of networks on the same tasks.
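For context, one plausible reading of a circles-of-thirds style pitch encoding is sketched below: a pitch class is represented by which of the four circles of major thirds and which of the three circles of minor thirds it belongs to, giving a 7-bit vector. This layout is an assumption for illustration; the exact representation used in the paper may differ.

```python
# Hedged sketch: encode a pitch class by its circle-of-major-thirds and
# circle-of-minor-thirds membership (an assumed 7-bit layout, not necessarily
# the paper's exact "Circles of Thirds" representation).

def circles_of_thirds_encoding(midi_note):
    pitch_class = midi_note % 12
    major_third_circle = pitch_class % 4   # 4 circles, 3 pitch classes each
    minor_third_circle = pitch_class % 3   # 3 circles, 4 pitch classes each
    vector = [0] * 7
    vector[major_third_circle] = 1         # first 4 slots: major-third circles
    vector[4 + minor_third_circle] = 1     # last 3 slots: minor-third circles
    return vector

# Example: middle C (MIDI 60) and E (MIDI 64) share a circle of major thirds.
print(circles_of_thirds_encoding(60), circles_of_thirds_encoding(64))
```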


PLoS ONE ◽ 2016 ◽ Vol 11 (1) ◽ pp. e0146917 ◽ Author(s): Ruben Zazo ◽ Alicia Lozano-Diez ◽ Javier Gonzalez-Dominguez ◽ Doroteo T. Toledano ◽ Joaquin Gonzalez-Rodriguez
