Text Generation Using Neural Models

Automatically generated summaries of long and short texts are widely used in digital services. In this paper, we study various neural models for text generation, including generative adversarial networks (GANs) as one previously successful approach. Our main focus is on generating text with recurrent neural networks (RNNs) and their variants and analyzing the results. We generated and translated text while varying the number of epochs, the sampling temperature, and the size of the input file in order to improve the model's confidence. The long short-term memory (LSTM) model responded markedly to these varying parameters; its performance was better when a dataset of appropriate size was provided for training. The resulting model was tested on datasets of varying sizes. The evaluations show that the output generated by the model does not correlate with the corresponding datasets, meaning that the generated text differs from the training data.
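The temperature parameter the abstract mentions controls how sharply the model's output distribution is peaked before sampling. The sketch below, a minimal illustration using NumPy rather than any model from the paper, shows the standard temperature-scaled softmax: lower temperatures make the model more confident (favoring the highest-scoring token), higher temperatures make it more diverse.

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from raw logits scaled by temperature.

    Lower temperature sharpens the distribution (more confident,
    repetitive text); higher temperature flattens it (more varied,
    noisier text).
    """
    rng = rng or np.random.default_rng(0)
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()                       # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return rng.choice(len(probs), p=probs), probs

# The same peaked logits become sharper at low temperature,
# flatter at high temperature.
_, cold = sample_with_temperature([2.0, 1.0, 0.1], temperature=0.5)
_, hot = sample_with_temperature([2.0, 1.0, 0.1], temperature=2.0)
```

Here `cold[0] > hot[0]`: the top token's probability grows as the temperature drops, which is why low temperatures yield more repetitive but more "confident" generations.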

Sensors ◽  
2020 ◽  
Vol 20 (13) ◽  
pp. 3738
Author(s):  
Zijian Niu ◽  
Ke Yu ◽  
Xiaofei Wu

Time series anomaly detection is widely used to monitor equipment states through data collected in the form of time series. Recently, deep learning methods based on generative adversarial networks (GANs) have emerged for time series anomaly detection. However, these methods must find the best mapping from real-time space to the latent space at the anomaly detection stage, which introduces new errors and takes a long time. In this paper, we propose a long short-term memory-based variational autoencoder generative adversarial network (LSTM-based VAE-GAN) for time series anomaly detection, which effectively solves the above problems. Our method jointly trains the encoder, the generator, and the discriminator to exploit the mapping ability of the encoder and the discrimination ability of the discriminator simultaneously. Long short-term memory (LSTM) networks are used as the encoder, the generator, and the discriminator. At the anomaly detection stage, anomalies are detected based on the reconstruction difference and the discrimination results. Experimental results show that the proposed method can quickly and accurately detect anomalies.
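The detection stage described above scores a window by combining two signals: the reconstruction difference and the discriminator's verdict. The snippet below is an illustrative sketch of that combination, not the paper's exact formula; the L1 distance, the `disc_prob` input, and the `alpha` weighting are all assumptions made for the example.

```python
import numpy as np

def anomaly_score(window, reconstruction, disc_prob, alpha=0.7):
    """Combine the two detection signals the method uses:
    reconstruction difference and discriminator output.

    disc_prob is the discriminator's probability that the window is
    real; (1 - disc_prob) is high for suspected anomalies. alpha
    weights the reconstruction term (an illustrative choice).
    """
    recon_err = np.abs(np.asarray(window) - np.asarray(reconstruction)).mean()
    return alpha * recon_err + (1 - alpha) * (1 - disc_prob)

# A well-reconstructed window the discriminator trusts scores low;
# a spike the model cannot reconstruct scores high.
normal = anomaly_score([1.0, 1.1, 0.9], [1.0, 1.05, 0.95], disc_prob=0.9)
spike = anomaly_score([1.0, 5.0, 0.9], [1.0, 1.05, 0.95], disc_prob=0.2)
```

Thresholding such a score is the usual final step; windows whose score exceeds a threshold chosen on validation data are flagged as anomalous.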


2020 ◽  
Vol 9 (1) ◽  
pp. 2049-2052

Generating handwriting of different styles is a challenging task in which little work has been done so far, even though substantial research exists on the opposite problem, text recognition. Handwriting generation can be extremely useful for children in schools for the blind, whose speech can be converted into text and then used to generate handwriting in different styles; it also plays an important role in CAPTCHA generation. Our study shows how recurrent neural networks (RNNs) of the long short-term memory (LSTM) type can be used to create composite sequences with long-range structure. We propose that the generative adversarial network (GAN) algorithm can generate more realistic handwriting styles with better accuracy than other algorithms. We predict one data point at a time. Our approach is shown for text, where the data are discrete; it can also be applied to online handwriting, which is real-valued, and is then further extended to handwriting generation. The network conditions its predictions on a sequence of text, and we use the resulting system to generate highly realistic cursive handwriting in a wide variety of styles. Experiments carried out on public online handwriting databases indicate that the proposed method achieves satisfactory performance: the generated writing samples attain a high level of similarity with the original handwriting samples.
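The core generation loop the abstract describes, predicting one point at a time and feeding each prediction back in, can be illustrated without a trained network. The sketch below substitutes a character bigram model for the LSTM's learned conditional distribution; this is a deliberately simplified stand-in (the `fit_bigram` and `generate` helpers are hypothetical), meant only to show the one-point-at-a-time sampling pattern.

```python
import random
from collections import Counter, defaultdict

def fit_bigram(text):
    """Count next-character frequencies: a toy stand-in for the
    conditional distribution an LSTM would learn over long histories."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, seed, n, rng=None):
    """Emit one character at a time, sampling each prediction from
    the distribution conditioned on the previous output -- the same
    feedback loop an RNN uses at generation time."""
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(n):
        nxt = counts.get(out[-1])
        if not nxt:          # no observed successor: stop early
            break
        chars, weights = zip(*nxt.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

sample = generate(fit_bigram("hello world, hello there"), "he", 10)
```

An LSTM replaces the single-character condition with its hidden state, which is what lets it maintain the long-range structure (word spacing, stroke continuity) that a bigram model cannot.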


2020 ◽  
Vol 12 (2) ◽  
pp. 84-99
Author(s):  
Li-Pang Chen

In this paper, we investigate the analysis and prediction of time-dependent data. We focus our attention on four different stocks selected from the Yahoo Finance historical database. To build models and predict future stock prices, we consider three machine learning techniques: long short-term memory (LSTM) networks, convolutional neural networks (CNNs), and support vector regression (SVR). By treating the close price, open price, daily low, daily high, adjusted close price, and volume of trades as predictors in the machine learning methods, we show that prediction accuracy is improved.
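The feature setup described above, six daily predictors mapped to a price target, can be sketched independently of the specific learner. The example below uses ordinary least squares on synthetic data as a simple baseline stand-in for LSTM/CNN/SVR; the feature values and weights are fabricated for illustration, not taken from the paper.

```python
import numpy as np

# Columns mirror the six predictors the paper lists:
# open, high, low, close, adjusted close, volume (toy values).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
true_w = np.array([0.5, -0.2, 0.1, 0.8, 0.3, 0.05])   # synthetic ground truth
y = X @ true_w + rng.normal(scale=0.01, size=100)     # next-day target proxy

# Ordinary least squares: fit weights, then predict.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ w
```

Swapping the linear fit for an LSTM, CNN, or SVR changes only the model line; the predictor matrix and evaluation loop stay the same, which is what makes the paper's three-way comparison straightforward.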


2020 ◽  
Author(s):  
Abdolreza Nazemi ◽  
Johannes Jakubik ◽  
Andreas Geyer-Schulz ◽  
Frank J. Fabozzi


