Big Data Analytics Using Swarm-Based Long Short-Term Memory for Temperature Forecasting

2022 ◽  
Vol 71 (2) ◽  
pp. 2347-2361
Author(s):  
Malini M. Patil ◽  
P. M. Rekha ◽  
Arun Solanki ◽  
Anand Nayyar ◽  
Basit Qureshi
Symmetry ◽  
2018 ◽  
Vol 10 (10) ◽  
pp. 485 ◽  
Author(s):  
Muhammad Ashfaq Khan ◽  
Md. Rezaul Karim ◽  
Yangwoo Kim

Every day we experience unprecedented data growth from numerous sources, which contributes to big data in terms of volume, velocity, and variability. These datasets impose great challenges on analytics frameworks and computational resources, making it difficult to extract meaningful information in a timely manner. Developing an efficient big data analytics framework to harness these challenges is therefore an important research topic. To exploit non-linear relationships in very large, high-dimensional datasets, machine learning (ML) and deep learning (DL) algorithms are being used within such frameworks. Apache Spark is among the fastest big data processing engines and helps solve iterative ML tasks through its distributed ML library, Spark MLlib. For real-world research problems, DL architectures such as Long Short-Term Memory (LSTM) are effective at overcoming practical issues such as reduced accuracy, long-term sequence dependency, and vanishing and exploding gradients in conventional deep architectures. In this paper, we propose an efficient analytics framework that merges a progressive machine learning technique with Spark-based linear models, a Multilayer Perceptron (MLP), and an LSTM in a two-stage cascade structure to enhance predictive accuracy. The proposed architecture enables us to organize big data analytics in a scalable and efficient way. To show the effectiveness of the framework, we applied the cascading structure to two different real-life datasets to solve a multiclass and a binary classification problem, respectively. Experimental results show that our analytical framework outperforms state-of-the-art approaches with a high level of classification accuracy.
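A minimal sketch of such a two-stage cascade is given below, with Spark MLlib's Multilayer Perceptron as the first stage and a Keras LSTM refining its class probabilities over sliding windows as the second stage. The column names, layer sizes, window length, and hand-off scheme are illustrative assumptions rather than the authors' exact configuration.

```python
# Sketch of a two-stage cascade: Spark MLlib MLP -> Keras LSTM (assumed configuration).
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import MultilayerPerceptronClassifier
import numpy as np
import tensorflow as tf

spark = SparkSession.builder.appName("cascade-sketch").getOrCreate()
df = spark.read.csv("data.csv", header=True, inferSchema=True)  # assumed input with a 0/1 "label" column

feature_cols = [c for c in df.columns if c != "label"]
assembled = VectorAssembler(inputCols=feature_cols, outputCol="features").transform(df)

# Stage 1: Spark MLP classifier (layer sizes are placeholders).
mlp = MultilayerPerceptronClassifier(
    layers=[len(feature_cols), 32, 2], labelCol="label", featuresCol="features")
stage1 = mlp.fit(assembled)
scored = stage1.transform(assembled).select("probability", "label").collect()

# Stage 2: LSTM over sliding windows of the stage-1 class probabilities.
probs = np.array([row["probability"].toArray() for row in scored])
labels = np.array([row["label"] for row in scored])
win = 10
X = np.stack([probs[i:i + win] for i in range(len(probs) - win)])
y = labels[win:]

lstm = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(win, probs.shape[1])),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
lstm.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
lstm.fit(X, y, epochs=5, batch_size=32)
```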


With the growing use of information technology in all areas of life, hacking has become more damaging than ever before. Moreover, as technologies develop, the number of attacks grows exponentially every few years and attacks become increasingly sophisticated, so conventional intrusion detection systems (IDS) become inefficient at detecting them. We achieve our results by using a networking chatbot, a deep recurrent neural network, Long Short-Term Memory (LSTM) [2], over the Apache Spark framework, whose input is flow traffic and traffic aggregation and whose output is a language of two words: normal or abnormal. The newly proposed blended concepts are language training, contextual analysis, distributed deep learning, big data, and anomaly detection of flow analysis. We propose a model that characterizes the network's normal dynamic behavior from a sequence of millions of packets within their context and analyzes them in near real time to detect point, collective, and contextual anomalies. The analysis shows a lower false-positive rate, a higher detection rate, and better detection of point anomalies. Regarding the detection of contextual and collective anomalies, we discuss our case and the reasoning behind our hypothesis. However, the analysis was performed on random small subsets of the dataset because of hardware limitations, so we offer our analysis and our thoughts on future work, in the hope that the full model will be evaluated by other interested researchers with better hardware infrastructure than ours.
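As a rough illustration of mapping flow traffic to a two-word output language (normal or abnormal), the sketch below builds a small Keras LSTM classifier over windows of flow features. The feature count, window length, layer sizes, and synthetic data are assumptions; the system described above runs over the Apache Spark framework in near real time.

```python
# Sketch of an LSTM flow-anomaly detector emitting "normal" / "abnormal" (assumed shapes).
import numpy as np
import tensorflow as tf

N_FEATURES, WINDOW = 20, 50   # assumed flow-feature dimension and sequence length

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # P(abnormal)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic stand-in for labelled flow windows (real input would come from the Spark stream).
X = np.random.rand(1000, WINDOW, N_FEATURES).astype("float32")
y = np.random.randint(0, 2, size=1000)
model.fit(X, y, epochs=3, batch_size=64, validation_split=0.2)

def classify(window):
    """Map a single flow window to the two-word output language."""
    p = float(model.predict(window[np.newaxis], verbose=0)[0, 0])
    return "abnormal" if p >= 0.5 else "normal"
```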


Entropy ◽  
2019 ◽  
Vol 22 (1) ◽  
pp. 10 ◽  
Author(s):  
Rabiya Khalid ◽  
Nadeem Javaid ◽  
Fahad A. Al-zahrani ◽  
Khursheed Aurangzeb ◽  
Emad-ul-Haq Qazi ◽  
...  

In the smart grid (SG) environment, consumers can alter their electricity consumption patterns in response to electricity prices and incentives, which results in prices that may differ from the initial price pattern. Electricity price and demand forecasting play a vital role in the reliability and sustainability of the SG. Forecasting using big data has become a hot research topic, as a massive amount of data is being generated and stored in the SG environment. Electricity users with advance knowledge of electricity prices and demand can manage their load efficiently. In this paper, a recurrent neural network (RNN), long short-term memory (LSTM), is used for electricity price and demand forecasting using big data. Researchers are actively working to propose new forecasting models, which use either a single input variable or multiple variables. From the literature, we observed that the use of multiple variables enhances forecasting accuracy. Hence, our proposed model takes multiple variables as input and forecasts future values of electricity demand and price. The hyperparameters of the algorithm are tuned using the Jaya optimization algorithm to improve the forecasting ability and strengthen the training of the model. Parameter tuning is necessary because the performance of a forecasting model depends on the values of these parameters, and selecting inappropriate values results in inaccurate forecasting. Integrating an optimization method therefore improves forecasting accuracy with minimal user effort. For efficient forecasting, the data is preprocessed: missing values and outliers are cleaned using the z-score method, and the data is normalized before forecasting. The forecasting accuracy of the proposed model is evaluated using the root mean square error (RMSE) and the mean absolute error (MAE). For a fair comparison, the proposed forecasting model is compared with a univariate LSTM and a support vector machine (SVM). The values of the performance metrics show that the proposed model achieves higher accuracy than the SVM and the univariate LSTM.
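The Jaya update rule used for hyperparameter tuning moves each candidate toward the current best solution and away from the worst, with no algorithm-specific control parameters. The sketch below implements a generic Jaya loop; the objective is a stub standing in for the validation RMSE of an LSTM trained with the candidate hyperparameters, and the bounds, population size, and iteration count are assumptions.

```python
# Sketch of Jaya-based hyperparameter search (stub objective, assumed bounds).
import numpy as np

def jaya(objective, bounds, pop_size=10, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    low, high = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(low, high, size=(pop_size, len(bounds)))
    fitness = np.array([objective(x) for x in pop])
    for _ in range(iters):
        best, worst = pop[fitness.argmin()], pop[fitness.argmax()]
        r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
        # Jaya update: approach the best solution, move away from the worst.
        cand = np.clip(pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop)), low, high)
        cand_fit = np.array([objective(x) for x in cand])
        improved = cand_fit < fitness
        pop[improved], fitness[improved] = cand[improved], cand_fit[improved]
    return pop[fitness.argmin()], fitness.min()

def validation_rmse(x):
    """Stub: in the real pipeline this would train an LSTM with (hidden units, learning
    rate) = x on the preprocessed data and return its validation RMSE."""
    units, lr = int(x[0]), x[1]
    return abs(units - 64) / 64 + abs(lr - 1e-3)   # toy surrogate, for illustration only

bounds = np.array([[16, 256], [1e-4, 1e-2]])       # assumed search ranges
best_params, best_score = jaya(validation_rmse, bounds)
print(best_params, best_score)
```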


Circulation ◽  
2020 ◽  
Vol 141 (Suppl_1) ◽  
Author(s):  
Qingxue Zhang

Smart health technologies are bringing exciting possibilities to the cardiac healthcare area. Wearable electrocardiogram (ECG) monitoring is expected to establish cardiac big data towards precision cardiac health. However, there are two key obstacles. Firstly, how to conveniently measure the standard 12-lead ECG in daily life is an open question, since the traditional 12-lead ECG is mainly used in clinics or hospitals. The Holter ECG monitor is not convenient or comfortable enough for daily, long-term use, and the Apple Watch only provides finger-touch-based single-lead ECG measurement, supporting neither 12-lead ECG nor continuous tracking. In this study, a long short-term memory neural network-based ECG monitoring system is proposed, which can generate the remaining 9 leads from only a 3-lead ECG, offering very high wearability, usability, and convenience. Secondly, maintaining high ECG quality while the user performs different physical activities is another critical challenge. The ECG morphology may be contaminated by diverse motion artifacts induced by sensor-to-skin contact variations, which must be addressed to guarantee that the obtained ECG is usable and interpretable. We introduce a bidirectional long short-term memory network to deal with these noisy fluctuations by learning the temporally consistent dynamics among the 3-lead ECG. The system has been evaluated on ten human subjects to demonstrate its effectiveness. Compared with the ground truth, the reconstructed 12-lead ECG has a correlation as high as 0.88 and a root mean square error of 0.059 mV, far superior to the traditional linear regression method. The proposed novel monitor is expected to greatly advance precision cardiac health.
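A minimal sketch of the lead-reconstruction idea follows: a bidirectional LSTM maps a 3-lead ECG segment to the remaining 9 leads sample by sample, trained with a mean-squared-error loss. The segment length, layer width, and synthetic training data are illustrative assumptions, not the authors' configuration.

```python
# Sketch of 3-lead -> 9-lead ECG reconstruction with a bidirectional LSTM (assumed shapes).
import numpy as np
import tensorflow as tf

SEG_LEN = 500   # assumed samples per segment (e.g. ~1 s at 500 Hz)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEG_LEN, 3)),                           # 3 measured leads
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(9)),           # 9 reconstructed leads
])
model.compile(optimizer="adam", loss="mse")

# Synthetic stand-in for paired (3-lead input, 9-lead target) training segments.
x = np.random.randn(256, SEG_LEN, 3).astype("float32")
y = np.random.randn(256, SEG_LEN, 9).astype("float32")
model.fit(x, y, epochs=2, batch_size=16)

# Reconstruction quality would then be reported as per-lead correlation and RMSE (mV),
# as in the evaluation summarized above.
recon = model.predict(x[:1], verbose=0)
```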

