A Dense Long Short-Term Memory Model for Enhancing the Imagery-Based Brain-Computer Interface

2021, Vol 2021, pp. 1-10
Author(s): Xiaofei Zhang, Tao Wang, Qi Xiong, Yina Guo

Imagery-based brain-computer interfaces (BCIs) aim to decode neural activity into control signals by identifying and classifying natural commands from electroencephalogram (EEG) patterns and then controlling the corresponding equipment. However, many traditional BCI recognition algorithms suffer from the “one person, one model” issue, and the training process of such recognition models converges with difficulty. In this study, a new BCI model based on a Dense long short-term memory (Dense-LSTM) algorithm is proposed, which combines the event-related desynchronization (ERD) and event-related synchronization (ERS) of the imagery-based BCI; the model was trained and tested on the authors’ own data set. Furthermore, a new experimental platform was built to decode the neural activity of different subjects in a static state. Experimental evaluation of the proposed recognition algorithm yields an accuracy of 91.56%, addressing the “one person, one model” issue along with the difficulty of convergence during training.
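The abstract gives no implementation details; the following is a minimal sketch of an LSTM-plus-dense classifier for imagery EEG windows in PyTorch, assuming inputs of shape (trials, time steps, electrodes) and two motor-imagery classes. It illustrates the general approach, not the authors' Dense-LSTM.

```python
# Minimal sketch (not the authors' code): an LSTM classifier for imagery EEG,
# assuming windows of shape (batch, time_steps, n_channels) and n_classes labels.
import torch
import torch.nn as nn

class DenseLSTMClassifier(nn.Module):
    def __init__(self, n_channels=32, hidden=64, n_layers=2, n_classes=2):
        super().__init__()
        # Stacked LSTM over the EEG time dimension.
        self.lstm = nn.LSTM(n_channels, hidden, num_layers=n_layers, batch_first=True)
        # Dense head mapping the last hidden state to class logits.
        self.head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):              # x: (batch, time, channels)
        out, _ = self.lstm(x)          # out: (batch, time, hidden)
        return self.head(out[:, -1])   # logits from the final time step

# Example: one training step on random data standing in for ERD/ERS features.
model = DenseLSTMClassifier()
x = torch.randn(8, 250, 32)            # 8 trials, 250 samples, 32 electrodes (assumed)
loss = nn.CrossEntropyLoss()(model(x), torch.randint(0, 2, (8,)))
loss.backward()
```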

2021, Vol 3
Author(s): Yueling Ma, Carsten Montzka, Bagher Bayat, Stefan Kollet

The lack of high-quality continental-scale groundwater table depth observations necessitates an indirect method for producing reliable estimates of water table depth anomalies (wtda) over Europe, to support European groundwater management under drought conditions. Long Short-Term Memory (LSTM) networks are a deep learning technique that exploits long- and short-term dependencies in the input–output relationship, which have been observed in the response of groundwater dynamics to atmospheric and land surface processes. Here, we introduced different input variables, including precipitation anomalies (pra), the most common proxy of wtda, and trained networks to arrive at improved wtda estimates at individual pixels over Europe in various experiments. All input and target data involved in this study were obtained from the simulated TSMP-G2A data set. We performed wavelet coherence analysis to gain a comprehensive understanding of the contributions of different input variable combinations to the wtda estimates. Based on these experiments, we derived an indirect method utilizing LSTM networks with pra and soil moisture anomalies (θa) as input, which achieved the best network performance. The regional medians of test R2 scores and RMSEs obtained by this method in areas with wtd ≤ 3.0 m were 76–95% and 0.17–0.30, respectively, constituting a 20–66% increase in median R2 and a 0.19–0.30 decrease in median RMSE compared with LSTM networks using pra as the only input. Our results show that introducing θa significantly improved the performance of the trained networks in predicting wtda, indicating the substantial contribution of θa to explaining groundwater anomalies. The European wtda map reproduced by the method also agreed well with that derived from the TSMP-G2A data set with respect to drought severity, successfully detecting ~41% of strong drought events (wtda ≥ 1.5) and ~29% of extreme drought events (wtda ≥ 2) in August 2015. The study emphasizes the importance of combining soil moisture information with precipitation information when quantifying or predicting groundwater anomalies. In the future, the indirect method derived in this study can be transferred to real-time monitoring of groundwater drought at the continental scale using remotely sensed soil moisture and precipitation observations or corresponding information from weather prediction models.
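For illustration only, a per-pixel LSTM of the kind the study describes could look as follows in PyTorch, taking monthly precipitation anomalies (pra) and soil moisture anomalies (θa) as inputs and producing wtda estimates; the sequence length, layer sizes, and tensor shapes are assumptions, not the study's configuration.

```python
# Illustrative sketch (not the study's code): a per-pixel LSTM that maps monthly
# precipitation anomalies (pra) and soil moisture anomalies (theta_a) to water
# table depth anomalies (wtda). Sizes and sequence length are assumptions.
import torch
import torch.nn as nn

class WtdaLSTM(nn.Module):
    def __init__(self, n_inputs=2, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, months, [pra, theta_a])
        h, _ = self.lstm(x)
        return self.out(h).squeeze(-1)    # wtda estimate per month

model = WtdaLSTM()
x = torch.randn(16, 12, 2)                # 16 pixels, 12 months, 2 anomaly inputs
wtda_hat = model(x)                        # (16, 12)
```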


2021, Vol 17 (12), pp. 155014772110612
Author(s): Zhengqiang Ge, Xinyu Liu, Qiang Li, Yu Li, Dong Guo

To significantly protect user privacy and prevent the disclosure of user preferences from leading to malicious entrapment, we present a combination of a recommendation algorithm and a privacy protection mechanism. In this article, we present a privacy-preserving recommendation algorithm, PrivItem2Vec, and the concept of the recommended-internet of things, which consists of user information, devices, and items. The recommended-internet of things uses bidirectional long short-term memory based on item2vec, which improves the time-series modelling of the algorithm and the recommendation accuracy. In addition, we reconstructed the data set in conjunction with the Paillier algorithm. The data on the server are encrypted and embedded, which reduces the readability of the data and ensures its security to a certain extent. Experiments show that our algorithm is superior to other works in terms of recommendation accuracy and efficiency.
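As a rough sketch of the sequence-modelling part only (PrivItem2Vec itself is not reproduced here), a bidirectional LSTM over item2vec-style embeddings that scores the next item in a user's interaction sequence could be written as follows in PyTorch; the vocabulary size and dimensions are placeholders, and the Paillier encryption step is omitted.

```python
# Sketch only (not PrivItem2Vec): a bidirectional LSTM over item2vec-style
# embeddings that scores the next item in a user's interaction sequence.
# Vocabulary size and dimensions are placeholder assumptions.
import torch
import torch.nn as nn

class BiLSTMRecommender(nn.Module):
    def __init__(self, n_items=10_000, dim=64, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(n_items, dim)                # item2vec-like table
        self.bilstm = nn.LSTM(dim, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, n_items)            # next-item logits

    def forward(self, item_ids):                               # (batch, seq_len)
        h, _ = self.bilstm(self.embed(item_ids))
        return self.score(h[:, -1])                            # (batch, n_items)

model = BiLSTMRecommender()
logits = model(torch.randint(0, 10_000, (4, 20)))              # 4 users, 20 items each
```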


Author(s): Dejiang Kong, Fei Wu

The wide use of positioning technology has made mining people's movements feasible, and plenty of trajectory data have been accumulated. How to efficiently leverage these data for location prediction has become an increasingly popular research topic, as it is fundamental to location-based services (LBS). Existing methods often focus either on long-term (days or months) visit prediction (i.e., point-of-interest recommendation) or on real-time location prediction (i.e., trajectory prediction). In this paper, we are interested in the location prediction problem under a weak real-time condition and aim to predict users' movements in the next minutes or hours. We propose a Spatial-Temporal Long-Short Term Memory (ST-LSTM) model that naturally combines spatial-temporal influence into LSTM to mitigate the problem of data sparsity. Further, we employ a hierarchical extension of the proposed ST-LSTM (HST-LSTM) in an encoder-decoder manner, which models the contextual historical visit information in order to boost prediction performance. The proposed HST-LSTM is evaluated on a real-world trajectory data set, and the experimental results demonstrate the effectiveness of the proposed model.
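A simplified sketch of the spatial-temporal idea, not the authors' exact gated formulation: fold the time gap and distance gap between consecutive visits into the LSTM input so the model sees location, elapsed time, and travelled distance together. Dimensions and units below are assumptions.

```python
# Simplified sketch (not the authors' exact gating): concatenate the spatial and
# temporal gaps between consecutive check-ins with the location embedding, so the
# sequence model sees location, time-since-last-visit, and distance together.
import torch
import torch.nn as nn

class STLSTMSketch(nn.Module):
    def __init__(self, n_locations=5_000, dim=64, hidden=128):
        super().__init__()
        self.loc_embed = nn.Embedding(n_locations, dim)
        # +2 for the scalar time-gap and distance-gap features per step.
        self.lstm = nn.LSTM(dim + 2, hidden, batch_first=True)
        self.next_loc = nn.Linear(hidden, n_locations)

    def forward(self, locs, time_gaps, dist_gaps):
        # locs: (batch, seq); gaps: (batch, seq) in hours / kilometres (assumed units)
        x = torch.cat([self.loc_embed(locs),
                       time_gaps.unsqueeze(-1),
                       dist_gaps.unsqueeze(-1)], dim=-1)
        h, _ = self.lstm(x)
        return self.next_loc(h[:, -1])     # logits over candidate next locations
```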


2021
Author(s): Jianrong Dai

Purpose: Machine Performance Check (MPC) is a daily quality assurance (QA) tool for Varian machines. The daily QA data based on MPC tests show machine performance patterns and can potentially provide warning messages for preventive actions. This study developed a neural network model that predicts the trend of data variations quantitatively. Methods and materials: MPC data were collected daily for 3 years. A stacked long short-term memory (LSTM) model was used to develop the neural network model. For comparison with the stacked LSTM, an autoregressive integrated moving average (ARIMA) model was developed on the same data set. Cubic interpolation was used to double the amount of data to enhance prediction accuracy. The data were then divided into three groups: 70% for training, 15% for validation, and 15% for testing. The training and validation sets were used to train the stacked LSTM with different hyperparameters, and a greedy coordinate descent method was employed to search over different hyperparameter combinations. The testing set was used to assess the performance of the model with the optimal hyperparameter combination. The accuracy of the model was quantified by the mean absolute error (MAE), root-mean-square error (RMSE), and coefficient of determination (R2). Results: A total of 867 data points were collected to predict the data for the next 5 days. The mean MAE, RMSE, and R2 across all MPC tests were 0.013, 0.020, and 0.853 for the LSTM, and 0.021, 0.030, and 0.618 for the ARIMA, respectively. The results show that the LSTM outperforms the ARIMA. Conclusions: The stacked LSTM model can accurately predict the daily QA data based on MPC tests. Predicting future performance data based on MPC tests can foresee possible machine failure, allowing early machine maintenance and reducing unscheduled machine downtime.
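As an illustration of the stacked-LSTM setup (not the study's code), a two-layer LSTM that takes a window of past daily MPC values for one metric and predicts the next five days could look like this; the 30-day window and layer sizes are assumptions.

```python
# Hedged sketch (not the study's model): a two-layer ("stacked") LSTM that takes a
# window of past daily MPC values and predicts the next five days for one metric.
import torch
import torch.nn as nn

class StackedLSTMForecaster(nn.Module):
    def __init__(self, n_features=1, hidden=50, horizon=5):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x):                  # x: (batch, window_days, n_features)
        h, _ = self.lstm(x)
        return self.head(h[:, -1])         # (batch, horizon) = next 5 daily values

model = StackedLSTMForecaster()
window = torch.randn(32, 30, 1)            # 30-day input window (assumed length)
next5 = model(window)                      # forecast for days t+1 .. t+5
```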


2019, Vol 15 (10), pp. 155014771988313
Author(s): Chi Hua, Erxi Zhu, Liang Kuang, Dechang Pi

Accurate prediction of the generation capacity of photovoltaic systems is fundamental to ensuring grid stability and making correct scheduling arrangements. In view of the temporal limitations and the local-minimum problem of the back-propagation neural network, a power generation forecasting method based on long short-term memory-back-propagation is proposed. On this basis, the traditional prediction data set is improved. Building on the three traditional methods listed in this article, we propose a fourth method to improve traditional short-term power generation prediction for photovoltaic power stations. Compared with the traditional method, the long short-term memory-back-propagation neural network based on the improved data set has a lower prediction error. At the same time, a horizontal comparison with multiple linear regression and the support vector machine shows that the long short-term memory-back-propagation method has several advantages. Based on the long short-term memory-back-propagation neural network, the short-term forecasting method proposed in this article for the generation capacity of photovoltaic power stations provides a basis for dispatch planning and optimized operation of the power grid.
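The abstract does not specify how the LSTM and back-propagation components are combined; one plausible arrangement, shown as a sketch only, is an LSTM that encodes the recent generation sequence followed by a back-propagation (fully connected) head that folds in weather features for the target hour. All sizes are assumptions.

```python
# Illustrative only: one way to pair an LSTM with a back-propagation (fully
# connected) network for PV output forecasting - the LSTM encodes the recent
# generation sequence, the dense layers fold in same-hour weather features.
import torch
import torch.nn as nn

class LSTMBPForecaster(nn.Module):
    def __init__(self, n_weather=4, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)       # past PV output only
        self.bp = nn.Sequential(                                # BP-style dense head
            nn.Linear(hidden + n_weather, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, past_power, weather):
        # past_power: (batch, steps, 1); weather: (batch, n_weather) for target hour
        h, _ = self.lstm(past_power)
        return self.bp(torch.cat([h[:, -1], weather], dim=-1))  # next-step power
```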


Author(s): Mingqiang Lin, Denggao Wu, Gengfeng Zheng, Ji Wu

Lithium-ion batteries are widely used as the power source in electric vehicles. State of health (SOH) diagnosis is very important for the safety and storage capacity of lithium-ion batteries. In order to estimate lithium-ion battery SOH accurately and robustly, a novel long short-term memory (LSTM) network based on the charging curve is proposed for SOH estimation in this work. First, aging features that reflect the battery degradation phenomenon are extracted from the charging curves. Then, to capture the long-term tendency of battery degradation, some improvements are made to the proposed LSTM model. A connection between the input gate and the output gate is added to better control the output information of the memory cell. Meanwhile, the forget gate and input gate are coupled into a single update gate for selective forgetting before the accumulation of information. To achieve more reliable and robust SOH estimation, the improved LSTM network is adaptively trained online using a particle filter. Furthermore, to verify the effectiveness of the proposed method, we compare it with two data-driven methods on a public battery data set and a commercial battery data set. Experimental results demonstrate that the proposed method achieves the highest SOH estimation accuracy.
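The two gate modifications described above can be sketched as a custom recurrent cell; the formulation below is an interpretation (the authors' exact equations may differ): the forget gate is coupled to the input gate as f = 1 − i, and the output gate additionally receives the input gate activation.

```python
# Rough sketch of the two described modifications as a custom cell (the authors'
# exact formulation may differ): the forget gate is coupled to the input gate
# (f = 1 - i), and the output gate also sees the input gate activation.
import torch
import torch.nn as nn

class CoupledLSTMCell(nn.Module):
    def __init__(self, n_in, n_hidden):
        super().__init__()
        self.W_i = nn.Linear(n_in + n_hidden, n_hidden)              # input/update gate
        self.W_g = nn.Linear(n_in + n_hidden, n_hidden)              # candidate state
        self.W_o = nn.Linear(n_in + 2 * n_hidden, n_hidden)          # output gate, fed the input gate too

    def forward(self, x, state):
        h, c = state
        z = torch.cat([x, h], dim=-1)
        i = torch.sigmoid(self.W_i(z))                               # input gate
        g = torch.tanh(self.W_g(z))                                  # candidate memory
        c = (1 - i) * c + i * g                                      # coupled forget gate: f = 1 - i
        o = torch.sigmoid(self.W_o(torch.cat([z, i], dim=-1)))       # output gate conditioned on i
        h = o * torch.tanh(c)
        return h, (h, c)
```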


2021, pp. 016555152110239
Author(s): Wei Du, Guanran Jiang, Wei Xu, Jian Ma

With the rapid development of the patent marketplace, patent trading recommendation is needed to mitigate the technology searching cost of patent buyers. Current research focuses on recommendation based on a company's existing patents; only a few studies take into account the sequential pattern of patent acquisition activities and the possible diversity of a company's business interests. Moreover, profiling patents based solely on patent documents fails to capture the high-order information of patents. To bridge the gap, we propose a knowledge-aware attentional bidirectional long short-term memory network (KBiLSTM) method for patent trading recommendation. KBiLSTM uses knowledge graph embeddings to profile patents with rich patent information. It introduces a bidirectional long short-term memory network (BiLSTM) to capture the sequential pattern in a company's historical records. In addition, to address a company's diverse technology interests, we design an attention mechanism to aggregate the company's historical patents given a candidate patent. Experimental results on the United States Patent and Trademark Office (USPTO) data set show that KBiLSTM outperforms state-of-the-art baselines for patent trading recommendation in terms of F1 and normalised discounted cumulative gain (nDCG). The attention visualisation for a randomly selected company intuitively demonstrates the recommendation effectiveness.
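A minimal sketch of the attention step, with the dimensions and dot-product scoring as assumptions: weight a company's historical patent representations by their relevance to a candidate patent, then pool them into a single company profile vector.

```python
# Minimal sketch of the attention idea (dimensions and scoring are assumptions):
# weight a company's historical patent representations by relevance to a candidate
# patent, then pool them into one company profile vector.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CandidateAttention(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, history, candidate):
        # history: (batch, n_patents, dim) embeddings of past patents
        # candidate: (batch, dim) embedding of the candidate patent
        scores = torch.bmm(self.proj(history), candidate.unsqueeze(-1)).squeeze(-1)
        weights = F.softmax(scores, dim=-1)                    # attention over history
        profile = torch.bmm(weights.unsqueeze(1), history).squeeze(1)
        return profile, weights                                # pooled profile + weights
```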


2021, Vol 8 (1)
Author(s): Mahdi Yousefzadeh Aghdam, Seyed Reza Kamel Tabbakh, Seyed Javad Mahdavi Chabok, Maryam Kheyrabadi

Nowadays, air traffic management (ATM) has been widely assessed due to its complexity and its sensitivity for the beneficiaries, including passengers, airlines, regulatory agencies, and other organizations. To date, various methods (e.g., statistical and fuzzy techniques) and data mining algorithms (e.g., neural networks) have been used to address ATM and delay minimization problems. However, each of these techniques has disadvantages, such as overlooking the data, computational complexity, and uncertainty. In this paper, to increase the accuracy and legitimacy of air traffic management, we used bidirectional long short-term memory (Bi-LSTM) networks and extreme learning machines (ELM) to design the structure of a deep learning network. The Kaggle data set and different performance parameters and statistical criteria were used in MATLAB to validate the proposed method. Using the proposed method improved the criteria factors of this study. The proposed method achieved a % increase in air traffic management compared with other papers. Therefore, it can be said that the proposed method has a much higher air traffic management capacity than previous methods.
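For illustration only (the paper's Bi-LSTM + ELM hybrid and its MATLAB implementation are not reproduced here), a bidirectional LSTM regressor over per-flight feature sequences predicting delay could be sketched in PyTorch as follows; the feature count and sequence length are placeholders.

```python
# Illustrative only (not the paper's Bi-LSTM + ELM hybrid): a bidirectional LSTM
# regressor over a sequence of per-flight features predicting delay in minutes.
import torch
import torch.nn as nn

class DelayBiLSTM(nn.Module):
    def __init__(self, n_features=10, hidden=64):
        super().__init__()
        self.bilstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, x):                       # x: (batch, seq_len, n_features)
        h, _ = self.bilstm(x)
        return self.out(h[:, -1]).squeeze(-1)   # predicted delay (minutes)

model = DelayBiLSTM()
delays = model(torch.randn(16, 24, 10))         # 16 flights, 24 time steps (assumed)
```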


Author(s): Mr. V. Manoj Kumar

Prediction is most important in the stock market, not only for traders but also for the computer engineers who analyse stock data. This prediction can be performed in two ways: one uses historical stock data, and the other analyses information gathered from social media. Prediction depends on the model or pattern applied to the stock data set, and many models are available for predicting stocks; here, a model is simply an algorithm from machine learning or deep learning. In the data set, the two main parameters, the open and close values, are mostly used for stock prediction, but prediction can also use volume, so the data are preprocessed before being used for prediction. In this paper we use various algorithms, such as linear regression, support vector regression, and long short-term memory, to obtain better accuracy, to compare how each differs from the other algorithms, and to predict future stock prices.
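As a sketch of the lagged-window setup the abstract implies, the following compares linear regression and support vector regression on synthetic closing prices; an LSTM would consume the same windows reshaped to (samples, window, 1). All data and parameters here are illustrative, not the paper's experiment.

```python
# Sketch only: lagged-window prediction of the next closing price, comparing
# linear regression and support vector regression on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

close = np.cumsum(np.random.randn(500)) + 100     # stand-in for a close-price series
window = 10
X = np.array([close[i:i + window] for i in range(len(close) - window)])
y = close[window:]                                 # next-day close to predict

split = int(0.8 * len(X))
for name, model in [("linear regression", LinearRegression()),
                    ("support vector regression", SVR(kernel="rbf"))]:
    model.fit(X[:split], y[:split])
    score = model.score(X[split:], y[split:])      # R^2 on the held-out tail
    print(f"{name}: R^2 = {score:.3f}")
```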

