Dynamic access control and security performance prediction for IoT networking using a novel deep learning technique

Author(s):  
Sriramya P. ◽  
A.K. Reshmy ◽  
R. Subhashini ◽  
Korakod Tongkachok ◽  
Ajay Prakash Pasupulla ◽  
...  

Abstract The Internet of Things (IoT) has become an area of growing interest across many devices. Applications such as home sensors, medical devices, wireless sensors, and other devices are connected through IoT networks. The transmission of big data is subject to possible attacks that can cause network interruptions and security problems. Security performance prediction is therefore important for IoT networks, which must address complicated security issues in real time because a single attack can threaten the network's global performance. We first investigate the performance of intelligent security prediction techniques that link deep learning algorithms to IoT security risks. This contribution provides a CNN model that improves the performance of IoT security risk assessment (SRA). In addition, access control techniques must change for dynamic systems such as the IoT, where large numbers of devices are spread across many locations; dynamic access control models are therefore necessary. These models do not rely on individual access strategies alone but incorporate environmental and real-time data to predict the access decision. The risk-based access control approach is one such dynamic model: to make the access decision, it assesses the security risk value associated with the access request. The evaluation of the proposed model demonstrates improvements in the performance and accuracy of IoT networks.
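A minimal sketch of the risk-based access decision described above, assuming a small 1-D CNN risk estimator trained elsewhere; the feature layout, threshold value, and model architecture are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: CNN-based security risk assessment (SRA) feeding a
# risk-based access control decision. Feature layout and threshold are assumed.
import numpy as np
import tensorflow as tf

N_FEATURES = 16          # assumed per-request context features (device, time, location, ...)


def build_risk_model() -> tf.keras.Model:
    """Small 1-D CNN that maps a request feature vector to a risk score in [0, 1]."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(N_FEATURES, 1)),
        tf.keras.layers.Conv1D(16, kernel_size=3, activation="relu"),
        tf.keras.layers.GlobalMaxPooling1D(),
        tf.keras.layers.Dense(8, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),   # predicted security risk value
    ])


def access_decision(model: tf.keras.Model, request_features: np.ndarray,
                    risk_threshold: float = 0.5) -> bool:
    """Grant access only if the predicted risk is below the threshold."""
    x = request_features.reshape(1, N_FEATURES, 1).astype("float32")
    risk = float(model.predict(x, verbose=0)[0, 0])
    return risk < risk_threshold


if __name__ == "__main__":
    model = build_risk_model()   # in practice, load trained weights instead
    request = np.random.rand(N_FEATURES)
    print("access granted" if access_decision(model, request) else "access denied")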

Processes ◽  
2020 ◽  
Vol 8 (6) ◽  
pp. 649
Author(s):  
Yifeng Liu ◽  
Wei Zhang ◽  
Wenhao Du

Deep learning based on large amounts of high-quality data plays an important role in many industries. However, deep learning is hard to embed directly in real-time systems, because the data such systems accumulate depends on real-time acquisition, while their analysis tasks must be carried out in real time; it is therefore impossible to complete the analysis by accumulating data over a long period. To address the problems of high-quality data accumulation, the high timeliness required of data analysis, and the difficulty of embedding deep-learning algorithms directly in real-time systems, this paper proposes a new progressive deep-learning framework and conducts experiments on image recognition. The experimental results show that the proposed framework is effective, performs well, and can reach conclusions similar to a deep-learning framework based on large-scale data.
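A minimal sketch of the progressive idea, assuming a simple image classifier that is updated batch-by-batch as real-time data arrives; the model, the batch source, and the image shape are illustrative assumptions rather than the paper's framework.

# Hypothetical sketch: progressively updating a classifier as small batches of
# real-time data arrive, instead of waiting to accumulate a large dataset.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 10
IMG_SHAPE = (28, 28, 1)    # assumed small grayscale images

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=IMG_SHAPE),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])


def incoming_batches(n_batches: int = 20, batch_size: int = 32):
    """Stand-in for a real-time acquisition source (random data here)."""
    for _ in range(n_batches):
        x = np.random.rand(batch_size, *IMG_SHAPE).astype("float32")
        y = np.random.randint(0, NUM_CLASSES, size=batch_size)
        yield x, y


# Progressive training loop: the model is usable for inference at every step,
# and each new batch refines it further.
for step, (x_batch, y_batch) in enumerate(incoming_batches()):
    loss, acc = model.train_on_batch(x_batch, y_batch)
    if step % 5 == 0:
        print(f"step {step}: loss={loss:.3f} acc={acc:.3f}")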


Author(s):  
Giovanni Capobianco ◽  
Umberto Di Giacomo ◽  
Tommaso Di Tusa ◽  
Francesco Mercaldo ◽  
Antonella Santone

2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Shihong Dang ◽  
Wei Tang

Traditional real-time data scheduling methods ignore the optimization of job data, which leads to delayed delivery, high inventory costs, and low equipment utilization. This paper proposes a novel real-time data scheduling method based on deep learning and an improved fuzzy algorithm for flexible operations in the papermaking workshop. The algorithm is divided into three parts: the first part describes the flexible job shop scheduling problem; the second part constructs the fuzzy scheduling model of flexible job data in the papermaking workshop; and the third part uses a genetic algorithm to obtain the optimal solution of the fuzzy scheduling of flexible job data. The results show that the proposed method obtains the optimal solution in 48 seconds at the 23rd iteration, which is much better than the three traditional scheduling methods against which we compared our results. Hence, this paper improves the work efficiency and quality of the papermaking workshop and reduces the operating cost of the papermaking enterprise.
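A minimal sketch of the genetic-algorithm stage, assuming jobs are assigned to identical machines and the objective is makespan; the fuzzy processing-time model from the paper is simplified to crisp times, and the job data and GA parameters are illustrative assumptions.

# Hypothetical sketch: a genetic algorithm over job orderings, minimizing a
# simple makespan on identical machines (the paper's fuzzy model is simplified).
import random

JOBS = [5, 3, 8, 2, 7, 4, 6, 1]     # assumed processing times
N_MACHINES = 2


def makespan(order):
    """Assign jobs in the given order to the earliest-free machine."""
    loads = [0] * N_MACHINES
    for j in order:
        i = loads.index(min(loads))
        loads[i] += JOBS[j]
    return max(loads)


def crossover(a, b):
    """Order crossover: keep a prefix of a, fill the rest in b's order."""
    cut = random.randint(1, len(a) - 1)
    head = a[:cut]
    return head + [j for j in b if j not in head]


def mutate(order, rate=0.2):
    order = order[:]
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order


def genetic_schedule(pop_size=30, generations=50):
    pop = [random.sample(range(len(JOBS)), len(JOBS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    best = min(pop, key=makespan)
    return best, makespan(best)


if __name__ == "__main__":
    order, span = genetic_schedule()
    print("best order:", order, "makespan:", span)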


2021 ◽  
Vol 2114 (1) ◽  
pp. 012067
Author(s):  
Ruba R. Nori ◽  
Rabah N. Farhan ◽  
Safaa Hussein Abed

Abstract A novel algorithm for fire detection is introduced, and a CNN-based system for fire localization in real-time applications is proposed. Deep learning algorithms show excellent results, reaching very high accuracy on the fire image dataset. YOLO is a superior deep learning algorithm capable of detecting and localizing fires in real time. The lack of a larger image dataset forced us to limit the system to a binary classification test, and the proposed model was tested on a dataset gathered from the internet. In this article, we build an automated alert system that integrates multiple sensors with state-of-the-art deep learning algorithms, produces few false positives, and provides our prototype robot with reasonable accuracy on real-time data, so that fire events can be tracked and recorded as soon as possible.
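A minimal sketch of real-time fire detection with a YOLO-style detector, assuming the ultralytics package and a hypothetical weights file (fire_yolo.pt) trained on fire images; neither the package choice nor the weights are part of the paper's artifact.

# Hypothetical sketch: running a YOLO detector on camera frames and raising an
# alert when a "fire" detection exceeds a confidence threshold.
import cv2
from ultralytics import YOLO

model = YOLO("fire_yolo.pt")          # hypothetical fire-trained weights
CONF_THRESHOLD = 0.5

cap = cv2.VideoCapture(0)             # camera on the prototype robot
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)[0]
    for box in results.boxes:
        conf = float(box.conf[0])
        label = results.names[int(box.cls[0])]
        if label == "fire" and conf >= CONF_THRESHOLD:
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
            print(f"fire alert: confidence {conf:.2f} at ({x1},{y1})-({x2},{y2})")
    cv2.imshow("fire detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()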


2021 ◽  
Author(s):  
Lei Sun ◽  
Tianyuan Liu ◽  
Yonghui Xie ◽  
Xinlei Xia

Abstract Accurate, real-time parameter forecasting is of great importance to turbine control and predictive maintenance, which can help improve the power system. In this study, deep-learning models for multi-parameter prediction, including a recurrent neural network (RNN) and a convolutional neural network (CNN), are proposed and applied to predict real-time parameters of a steam turbine based on data from a power plant. First, the prediction results of the RNN and CNN models are compared in terms of overall performance; both models forecast six state parameters well, with the RNN performing better. Moreover, detailed results for a single day show that the relative error of both models is less than 2%. Finally, the influence of model design choices, including the loss function, training size, and number of input time steps, on the performance of the RNN model is explored, together with their effects on the training and prediction time of the models. The results can provide a reference for model deployment in the power plant, and the above research indicates that the proposed method has high potential for dynamic process prediction in actual industrial scenarios.
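A minimal sketch of multi-parameter time-series forecasting with an RNN, assuming sliding windows over six state parameters; the window length, layer sizes, and the synthetic stand-in for plant data are illustrative assumptions, not the study's configuration.

# Hypothetical sketch: an LSTM that maps a window of past steam-turbine state
# parameters to the next time step's values (six parameters, as in the study).
import numpy as np
import tensorflow as tf

N_PARAMS = 6        # six state parameters
WINDOW = 30         # assumed number of input time steps

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_PARAMS)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(N_PARAMS),    # next-step prediction for all parameters
])
model.compile(optimizer="adam", loss="mse")


def make_windows(series: np.ndarray):
    """Slice a (time, params) series into (window, next-step) training pairs."""
    x, y = [], []
    for t in range(len(series) - WINDOW):
        x.append(series[t:t + WINDOW])
        y.append(series[t + WINDOW])
    return np.array(x), np.array(y)


# Stand-in for plant data: a smooth random walk per parameter.
series = np.cumsum(np.random.randn(2000, N_PARAMS) * 0.01, axis=0).astype("float32")
x_train, y_train = make_windows(series)
model.fit(x_train, y_train, epochs=2, batch_size=64, verbose=0)

pred = model.predict(x_train[-1:], verbose=0)
print("next-step prediction:", pred[0])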


Symmetry ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 1
Author(s):  
Jie Zhu ◽  
Weixiang Xu

In order to enhance the real-time and retrieval performance of road traffic data filling, a real-time data filling and automatic retrieval algorithm based on deep learning is proposed. In image detection, a depth representation is extracted from the detection target area of a general object. A local invariant feature is then extracted to describe local attributes in the region and fused with the depth representation to complete the real-time filling of road traffic data. Based on the results of the database enhancement, the retrieval results at the deep representation level are reordered. In the index stage, unsupervised feature updating is realized through neighborhood information to improve feature retrieval performance. The experimental results show that the proposed method has high recall and precision, a short retrieval time, and a low running cost.
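A minimal sketch of the fusion-and-retrieval idea, assuming precomputed deep and local-invariant features fused by normalized concatenation, nearest-neighbor retrieval, and a simple re-ranking on the deep representation; the feature dimensions and random data are assumptions, and the paper's exact fusion and re-ranking steps differ.

# Hypothetical sketch: fusing a deep representation with a local-feature
# descriptor, retrieving nearest neighbours, then reordering the candidates
# by the deep representation alone (coarse index, finer re-ranking).
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
N_ITEMS, DEEP_DIM, LOCAL_DIM = 500, 128, 64

deep = rng.standard_normal((N_ITEMS, DEEP_DIM))     # assumed deep representations
local = rng.standard_normal((N_ITEMS, LOCAL_DIM))   # assumed local invariant features


def l2norm(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)


fused = np.hstack([l2norm(deep), l2norm(local)])     # fused index features

index = NearestNeighbors(n_neighbors=20, metric="euclidean").fit(fused)

query_id = 7
_, candidates = index.kneighbors(fused[query_id:query_id + 1])
candidates = candidates[0]

# Re-rank candidates using only the deep representation.
deep_q = l2norm(deep[query_id])
scores = l2norm(deep[candidates]) @ deep_q
reranked = candidates[np.argsort(-scores)]
print("top-5 after re-ranking:", reranked[:5])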


2021 ◽  
Author(s):  
Menaa Nawaz ◽  
Jameel Ahmed

Abstract Physiological signals carry information retrieved from sensors implanted in or attached to the human body. These signals are vital data sources that can assist in predicting disease well ahead of time so that proper treatment can be made possible. With the addition of the Internet of Things to healthcare, real-time data collection and pre-processing for signal analysis have reduced the burden of in-person appointments and of decision making on healthcare providers. Recently, deep learning-based algorithms have been implemented by researchers for the recognition, realization, and prediction of diseases by extracting and analyzing important features. In this research, real-time 1-D time-series data from on-body, non-invasive biomedical sensors are acquired, pre-processed, and analyzed for anomaly detection. Feature-engineered parameters of a large and diverse dataset are used for training to make the anomaly detection system more reliable. For comprehensive real-time monitoring, the implemented system uses wavelet time scattering features for classification and a deep learning-based autoencoder for anomaly detection in time-series signals, assisting the clinical diagnosis of cardiovascular and muscular activity. An implementation of an IoT-based healthcare system using biomedical sensors is presented, and the cloud data acquired through these sensors are analyzed with signal analysis techniques for anomaly detection and time-series classification to support disease prognosis in real time. A wavelet time scattering-based signal classification accuracy of 99.88% is achieved, and 98% accuracy is achieved for real-time anomaly detection. The average mean absolute error loss is 0.0072 for normal signals and 0.078 for anomalous signals.
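A minimal sketch of the autoencoder-based anomaly detector, assuming fixed-length signal windows, a small dense autoencoder, and a mean-absolute-error threshold derived from the training error; the window length, architecture, threshold rule, and synthetic signals are assumptions, not the authors' pipeline.

# Hypothetical sketch: training an autoencoder on normal signal windows and
# flagging windows whose reconstruction MAE exceeds a threshold as anomalies.
import numpy as np
import tensorflow as tf

WINDOW = 128    # assumed samples per bio-signal window

autoencoder = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),     # bottleneck
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(WINDOW),
])
autoencoder.compile(optimizer="adam", loss="mae")

# Stand-in training data: "normal" windows are smooth sine-like signals.
t = np.linspace(0, 2 * np.pi, WINDOW)
normal = np.array([np.sin(t * f) for f in np.random.uniform(1, 3, 512)], dtype="float32")
autoencoder.fit(normal, normal, epochs=20, batch_size=32, verbose=0)

# Threshold from the training reconstruction-error distribution.
recon = autoencoder.predict(normal, verbose=0)
train_mae = np.mean(np.abs(recon - normal), axis=1)
threshold = train_mae.mean() + 3 * train_mae.std()


def is_anomalous(window: np.ndarray) -> bool:
    rec = autoencoder.predict(window.reshape(1, WINDOW).astype("float32"), verbose=0)
    return float(np.mean(np.abs(rec - window))) > threshold


noisy = np.sin(t * 2) + np.random.randn(WINDOW) * 0.8    # anomalous-looking window
print("anomaly:", is_anomalous(noisy))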


We have real-time data everywhere, every day. Most of it comes from IoT sensors, GPS positions, web transactions, and social media updates. Real-time data is typically generated in a continuous fashion; such data are called data streams. Data streams are transient, and there is very little time to process each item in the stream, so performing analytics on rapidly flowing, high-velocity data is a great challenge. Another issue is the percentage of incoming data that is considered for analytics: the higher the percentage, the greater the accuracy. Considering these two issues, the proposed work aims to find a better solution by gaining insight into real-time streaming data with minimum response time and greater accuracy. This paper combines two technology giants, TensorFlow and Apache Kafka: Kafka is used to handle the real-time streaming data, while TensorFlow provides analytics support with deep learning algorithms. Training and testing are done on the Uber connected-vehicle public dataset RideAustin. The experimental results on RideAustin show the predicted failure under each type of vehicle parameter, and the comparative analysis showed a 16% improvement over the traditional machine learning algorithm.
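A minimal sketch of wiring a Kafka stream into a TensorFlow model, assuming the kafka-python client, a hypothetical topic name and message schema, and a hypothetical trained model file; none of these names come from the paper.

# Hypothetical sketch: consuming vehicle records from a Kafka topic and scoring
# them with a trained TensorFlow model for failure prediction.
import json

import numpy as np
import tensorflow as tf
from kafka import KafkaConsumer

model = tf.keras.models.load_model("ride_failure_model.h5")   # hypothetical trained model

consumer = KafkaConsumer(
    "rideaustin-telemetry",                       # assumed topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

FEATURES = ["speed", "engine_temp", "battery_voltage", "mileage"]   # assumed schema

for message in consumer:
    record = message.value
    x = np.array([[record[f] for f in FEATURES]], dtype="float32")
    failure_prob = float(model.predict(x, verbose=0)[0, 0])
    if failure_prob > 0.8:
        print(f"predicted failure risk {failure_prob:.2f} for vehicle {record.get('vehicle_id')}")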

