On designing the adaptive computation framework of distributed deep learning models for Internet-of-Things applications

Author(s):  
Chia-Heng Tu ◽  
QiHui Sun ◽  
Mu-Hsuan Cheng
Author(s):  
S. Arokiaraj ◽  
Dr. N. Viswanathan

With the advent of the Internet of Things (IoT), human activity (HA) recognition has found growing application in health care, both in diagnosis and in clinical processes. IoT devices must be aware of human movements to provide better aid in clinical applications as well as in monitoring users' daily activities. With machine and deep learning algorithms, HA recognition systems have improved significantly in recognition accuracy. However, most existing models still need improvement in terms of accuracy and computational overhead. In this research paper, we propose a BAT-optimized Long Short-Term Memory network (BAT-LSTM) for effective recognition of human activities using real-time IoT systems. The data are collected from invasively implanted IoT devices. The proposed BAT-LSTM then extracts temporal features, which are used for the classification of HAs. Nearly 100,000 data samples were collected and used to evaluate the proposed model. Accuracy, precision, recall, specificity, and F1-score were chosen as validation metrics, and the framework was compared with other state-of-the-art deep learning models. The findings show that the proposed model outperforms the other learning models, confirming its suitability for HA recognition.
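
As a hedged illustration only (no code accompanies the abstract), the sketch below shows the general shape of such a pipeline in Keras: an LSTM extracts temporal features from windowed sensor streams and a softmax layer classifies the activity, while a plain random search stands in for the authors' BAT metaheuristic. The window size, channel count, class count, and hyperparameter ranges are all assumptions.

```python
# Hypothetical sketch of an LSTM classifier for human-activity (HA)
# recognition from windowed IoT sensor streams. Shapes and labels are
# illustrative assumptions, not the authors' configuration.
import numpy as np
import tensorflow as tf

WINDOW = 128      # assumed time steps per sample
CHANNELS = 6      # assumed sensor channels (e.g., 3-axis accel + gyro)
NUM_CLASSES = 5   # assumed number of activity classes

def build_lstm(units: int, lr: float) -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW, CHANNELS)),
        tf.keras.layers.LSTM(units),  # temporal feature extraction
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Stand-in for BAT optimization: a simple random search over the same
# hyperparameters a BAT-style metaheuristic would tune.
rng = np.random.default_rng(0)
X = rng.normal(size=(512, WINDOW, CHANNELS)).astype("float32")  # dummy data
y = rng.integers(0, NUM_CLASSES, size=512)

best_acc, best_cfg = 0.0, None
for _ in range(3):
    units = int(rng.choice([32, 64, 128]))
    lr = float(10 ** rng.uniform(-4, -2))
    model = build_lstm(units, lr)
    hist = model.fit(X, y, epochs=1, batch_size=64,
                     validation_split=0.2, verbose=0)
    acc = hist.history["val_accuracy"][-1]
    if acc > best_acc:
        best_acc, best_cfg = acc, (units, lr)
print("best config (units, lr):", best_cfg, "val acc:", best_acc)
```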


Author(s):  
Rishabh Verma ◽  
Latika Kharb

Smart farming through IoT technology can empower farmers to improve productivity, from the amount of manure to be used to the quantity of water needed for irrigating their fields, and can also help them reduce waste. Through IoT, sensors can assist farmers in the field by monitoring light, humidity, temperature, soil moisture, etc., and by automating the irrigation system. Moreover, farmers can monitor field conditions from anywhere, sparing them the burden and fatigue of visiting the farm in person to deal with problems in the fields. For example, farmers struggle to apply the right quantity of manure and pesticide at the right time for a given crop type. In this chapter, the authors introduce a model in which farmers can distinguish damaged crops from healthy crops with the help of different sensors and deep learning models, presenting the idea of applying IoT concepts for the benefit of farmers and moving the world towards smart agriculture.
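
Purely as an illustrative sketch of the kind of model the chapter describes (none of its code is reproduced here), a damaged-versus-healthy crop classifier might look like the small Keras CNN below; the image size, layer sizes, and label convention are assumptions.

```python
# Hypothetical sketch: a small CNN that labels crop images as
# "healthy" vs "damaged". Input size and layers are assumptions.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),      # assumed RGB crop photos
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # 1 = damaged (assumed)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```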


Author(s):  
Diana Gaifilina ◽  
Igor Kotenko

Introduction: The article discusses the problem of choosing deep learning models for detecting anomalies in Internet of Things (IoT) network traffic. The problem stems from the need to analyze a large number of security events in order to identify the abnormal behavior of smart devices. A powerful technology for analyzing such data is machine learning and, in particular, deep learning. Purpose: Development of recommendations for the selection of deep learning models for anomaly detection in IoT network traffic. Results: The main results of the research are a comparative analysis of deep learning models and recommendations on the use of deep learning models for anomaly detection in IoT network traffic. The multilayer perceptron, convolutional neural network, recurrent neural network, long short-term memory, gated recurrent units, and a combined convolutional-recurrent neural network were considered as the basic deep learning models. Additionally, the authors analyzed the following traditional machine learning models: naive Bayesian classifier, support vector machines, logistic regression, k-nearest neighbors, boosting, and random forest. The following metrics were used as indicators of anomaly detection efficiency: accuracy, precision, recall, and F-measure, as well as the time spent on training the model. The constructed models demonstrated a higher accuracy rate for anomaly detection in the large, heterogeneous traffic typical of IoT, as compared to conventional machine learning methods. The authors found that as the number of neural network layers increases, the completeness (recall) of detecting anomalous connections rises. This has a positive effect on the recognition of unknown anomalies, but increases the number of false positives. In some cases, training traditional machine learning models takes less time, because deep learning methods require more resources and computing power. Practical relevance: The results obtained can be used to build systems for network anomaly detection in Internet of Things traffic.
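
For concreteness, the snippet below sketches a minimal comparison harness in the spirit of the study: one traditional model and one neural model trained on the same features and scored on accuracy, precision, recall, F-measure, and training time. The synthetic data, model choices, and hyperparameters are assumptions standing in for the real IoT traffic and the full model set.

```python
# Hypothetical comparison harness: train one traditional model and one
# neural model on the same traffic features and report accuracy,
# precision, recall, F1, and training time. Synthetic data stands in
# for real IoT traffic features.
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)  # 10% "anomalies" (assumed ratio)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("random forest", RandomForestClassifier(random_state=0)),
                  ("MLP", MLPClassifier(hidden_layer_sizes=(64, 64),
                                        max_iter=300, random_state=0))]:
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)
    train_s = time.perf_counter() - t0
    pred = clf.predict(X_te)
    p, r, f1, _ = precision_recall_fscore_support(y_te, pred, average="binary")
    print(f"{name}: acc={accuracy_score(y_te, pred):.3f} "
          f"precision={p:.3f} recall={r:.3f} F1={f1:.3f} train={train_s:.2f}s")
```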


2021 ◽  
Author(s):  
Kanimozhi V ◽  
T. Prem Jacob

Abstract Although there exist various strategies for IoT intrusion detection, this research article sheds light on how the top 10 artificial intelligence deep learning models can be applied to both supervised and unsupervised learning on IoT network traffic data. It presents a detailed comparative analysis of IoT anomaly detection on smart IoT devices using the latest dataset, IoT-23. Many strategies are being developed for securing IoT networks, but further development is still needed, and IoT security can be improved through various deep learning methods. This work examines the top 10 deep learning techniques on the realistic IoT-23 dataset for improving the security performance of IoT network traffic. We built various neural network models for identifying five kinds of IoT traffic classes: Mirai, Denial of Service (DoS), Scan, Man-in-the-Middle attack (MITM-ARP), and Normal records. These attacks can be detected using a softmax output layer for multiclass classification in deep learning neural network models. This research was implemented in the Anaconda3 environment with packages such as Pandas, NumPy, SciPy, Scikit-learn, TensorFlow 2.2, Matplotlib, and Seaborn. AI deep learning models have been adopted across domains such as healthcare, banking and finance, scientific research, and business, alongside concepts like the Internet of Things. We found that the top 10 deep learning models are capable of increasing accuracy while minimizing the loss function and the execution time needed to build each model, contributing significantly to IoT anomaly detection through the emerging technologies of artificial intelligence and deep learning neural networks. Hence, mitigation of attacks on an IoT network can be made effective. Among the top 10 neural networks, convolutional neural networks, the multilayer perceptron, and generative adversarial networks (GANs) yield the highest accuracy scores of 0.996317, 0.996157, and 0.995829, respectively, with minimized loss functions and shorter execution times. This article helps the reader fully grasp the particulars of identifying IoT anomalies: it describes implementations of the top 10 AI deep learning models, serving as a useful guide to different neural network models and to IoT anomaly detection.
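
As a hedged sketch of the multiclass setup the abstract describes (not the authors' actual architecture), the snippet below builds a small dense network in TensorFlow whose softmax output layer spans the five IoT-23 classes; the feature width and layer sizes are assumptions.

```python
# Hypothetical sketch of a multiclass model with a softmax head over the
# five IoT-23 classes named in the abstract. Feature width and layer
# sizes are assumptions.
import tensorflow as tf

CLASSES = ["Mirai", "DoS", "Scan", "MITM-ARP", "Normal"]  # from the abstract
N_FEATURES = 30  # assumed number of flow features after preprocessing

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_FEATURES,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(len(CLASSES), activation="softmax"),  # multiclass head
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```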


Author(s):  
Dr. Subarna Shakya ◽  
Dr. Smys S.

The fog paradigm provides a novel platform for distributed streaming in applications associated with the Internet of Things. The sensed information of the IoT platform is collected from edge devices close to the user at the lower plane, moved to the fog layer between the cloud and the edge, and then pushed further to the cloud at the topmost plane. The information gathered at the lower plane often contains unanticipated values that are of no use to the application; such unexpected data are termed anomalies. They can arise from malfunctioning edge devices (typically mobile devices, sensors, or actuators), from coincidence, from purposeful attacks, or from environmental changes. Anomalies must be removed to retain the efficiency of the network and the application. The deep learning framework developed in this paper incorporates hardware techniques to detect anomalies in the fog paradigm. The experimental analysis shows that the deep learning models are clearly superior to the other basic detection structures in terms of detection accuracy, false-alarm rate, and flexibility.
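
As an illustration of the idea rather than the paper's actual framework, the sketch below shows one common way a fog node could filter anomalies before forwarding data to the cloud: an autoencoder trained on normal readings flags samples with high reconstruction error. The feature width, threshold, and training regime are assumptions.

```python
# Hypothetical sketch of an anomaly filter a fog node might run before
# forwarding sensor readings to the cloud: an autoencoder flags samples
# whose reconstruction error exceeds a threshold. Sizes are assumptions.
import numpy as np
import tensorflow as tf

N_FEATURES = 8  # assumed width of a sensor reading

ae = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_FEATURES,)),
    tf.keras.layers.Dense(4, activation="relu"),  # bottleneck encoder
    tf.keras.layers.Dense(N_FEATURES),            # decoder
])
ae.compile(optimizer="adam", loss="mse")

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(2000, N_FEATURES)).astype("float32")
ae.fit(normal, normal, epochs=5, batch_size=64, verbose=0)

def is_anomaly(batch, threshold=2.0):
    # Flag readings the autoencoder reconstructs poorly.
    err = np.mean((ae.predict(batch, verbose=0) - batch) ** 2, axis=1)
    return err > threshold

mixed = np.vstack([normal[:5],
                   rng.normal(6, 1, size=(5, N_FEATURES))]).astype("float32")
print(is_anomaly(mixed))  # the shifted rows should score higher error
```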


2020 ◽  
Author(s):  
Dean Sumner ◽  
Jiazhen He ◽  
Amol Thakkar ◽  
Ola Engkvist ◽  
Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models compared to non-augmented baselines. Here, we propose a novel data augmentation method we call "Levenshtein augmentation", which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: transformer and sequence-to-sequence based recurrent neural networks with attention. When used for training baseline models, Levenshtein augmentation demonstrated increased performance over both non-augmented data and data augmented by conventional SMILES randomization. Furthermore, Levenshtein augmentation seemingly results in what we define as attentional gain: an enhancement in the pattern recognition capabilities of the underlying network for molecular motifs.
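
To make the core idea concrete, here is a minimal sketch, under our own assumptions, of a Levenshtein-style pairing rule: among several randomized SMILES of a reactant, keep the variant with the smallest edit distance to the product string, so that training pairs share local sub-sequences. This illustrates the principle, not the authors' exact augmentation procedure.

```python
# Hypothetical sketch of the idea behind "Levenshtein augmentation":
# pick the randomized reactant SMILES whose edit distance to the product
# SMILES is smallest. The strings and selection rule are illustrative.
def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def best_pair(reactant_variants, product_smiles):
    # Choose the randomized reactant SMILES closest to the product.
    return min(reactant_variants,
               key=lambda s: levenshtein(s, product_smiles))

# Toy example: randomized variants of the same (hypothetical) reactant.
variants = ["CC(=O)Oc1ccccc1", "c1ccccc1OC(C)=O", "O=C(C)Oc1ccccc1"]
product = "CC(=O)Oc1ccccc1C(=O)O"
print(best_pair(variants, product))
```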


2019 ◽  
Author(s):  
Mohammad Rezaei ◽  
Yanjun Li ◽  
Xiaolin Li ◽  
Chenglong Li

Introduction: The ability to discriminate among ligands binding to the same protein target in terms of their relative binding affinity lies at the heart of structure-based drug design. Any improvement in the accuracy and reliability of binding affinity prediction methods decreases the discrepancy between experimental and computational results.
Objectives: The primary objectives were to find the most relevant features affecting binding affinity prediction, to minimize manual feature engineering, and to improve the reliability of binding affinity prediction using efficient deep learning models by tuning the model hyperparameters.
Methods: The binding site of each target protein was represented as a grid box around its bound ligand. Both binary and distance-dependent occupancies were examined for how an atom affects its neighbor voxels in this grid. A combination of different features, including ANOLEA, ligand elements, and Arpeggio atom types, was used to represent the input. An efficient convolutional neural network (CNN) architecture, DeepAtom, was developed, trained, and tested on the PDBbind v2016 dataset. Additionally, an extended benchmark dataset was compiled to train and evaluate the models.
Results: The best DeepAtom model showed improved accuracy in binding affinity prediction on the PDBbind core subset (Pearson's R = 0.83) and outperforms the recent state-of-the-art models in this field. In addition, when the DeepAtom model was trained on our proposed benchmark dataset, it yielded a higher correlation than the baseline, which confirms the value of our model.
Conclusions: The promising results for the predicted binding affinities are expected to pave the way for embedding deep learning models in virtual screening and rational drug design.
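
As a hedged sketch of the voxelization step outlined in the Methods (not DeepAtom's actual code), the snippet below rasterizes atoms around a binding-site center into a fixed grid with either binary or distance-dependent occupancy; the grid size, resolution, channel scheme, and decay function are assumptions.

```python
# Hypothetical sketch of binding-site voxelization: atoms near the bound
# ligand are rasterized into a fixed grid, with binary or
# distance-weighted occupancy per voxel. All settings are assumptions.
import numpy as np

GRID = 20        # assumed voxels per side
RES = 1.0        # assumed angstroms per voxel
N_CHANNELS = 4   # assumed number of atom-type channels

def voxelize(coords, channels, center, binary=True):
    """Rasterize atoms (N x 3 coords, per-atom channel ids) into a grid."""
    grid = np.zeros((GRID, GRID, GRID, N_CHANNELS), dtype=np.float32)
    origin = np.asarray(center) - (GRID * RES) / 2.0
    for xyz, ch in zip(coords, channels):
        idx = np.floor((np.asarray(xyz) - origin) / RES).astype(int)
        if np.all((idx >= 0) & (idx < GRID)):
            if binary:
                grid[idx[0], idx[1], idx[2], ch] = 1.0
            else:
                # Distance-dependent occupancy: decay from the voxel center.
                voxel_center = origin + (idx + 0.5) * RES
                d = np.linalg.norm(np.asarray(xyz) - voxel_center)
                grid[idx[0], idx[1], idx[2], ch] += np.exp(-d)
    return grid

atoms = np.array([[0.2, 0.3, -0.1], [1.5, -2.0, 0.7]])  # toy coordinates
types = [0, 2]                                           # toy channel ids
g = voxelize(atoms, types, center=(0.0, 0.0, 0.0), binary=False)
print(g.shape, g.sum())
```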

