IoT Network Security: Threats, Risks, and a Data-Driven Defense Framework

IoT ◽  
2020 ◽  
Vol 1 (2) ◽  
pp. 259-285 ◽  
Author(s):  
Charles Wheelus ◽  
Xingquan Zhu

The recent surge in Internet of Things (IoT) deployment has increased the pace of integration and extended the reach of the Internet from computers, tablets and phones to a myriad of devices in our physical world. Driven by the IoT, with each passing day, the Internet becomes more integrated with everyday life. While IoT devices provide endless new capabilities and make life more convenient, they also vastly increase the opportunity for nefarious individuals, criminal organizations and even state actors to spy on, and interfere with, unsuspecting users of IoT systems. As this looming crisis continues to grow, calls for data science approaches to address these problems have increased, and current research shows that predictive models trained with machine learning algorithms hold great potential to mitigate some of these issues. In this paper, we first carry out an analytics-driven review of the security risks associated with IoT systems, and then propose a machine learning-based solution to characterize and detect IoT attacks. We use a real-world IoT system with secured gate access as a platform, and introduce the IoT system in detail, including features to capture security threats/attacks on the system. Using data collected over a nine-month period as our testbed, we evaluate the efficacy of predictive models trained by means of machine learning, and propose design principles and a loose framework for implementing secure IoT systems.

Author(s):  
R. Suganya ◽  
Rajaram S. ◽  
Kameswari M.

Currently, thyroid disorders are increasingly common and widespread among women worldwide. In India, seven out of ten women suffer from thyroid problems. Various research studies estimate that about 35% of Indian women present with prevalent goiter. It is very necessary to take preventive measures at the early stages; otherwise, thyroid disorder can cause infertility problems among women. This review discusses various analytics models used to handle different types of thyroid problems in women. The chapter is planned to analyze and compare different classification models, both machine learning algorithms and deep learning algorithms, to classify different thyroid problems. Literature from both machine learning and deep learning algorithms is considered. This literature review on thyroid problems will help to analyze the causes and characteristics of thyroid disorders. The dataset used to build and validate the algorithms was provided by the UCI Machine Learning Repository.


2017 ◽  
Vol 135 (3) ◽  
pp. 234-246 ◽  
Author(s):  
André Rodrigues Olivera ◽  
Valter Roesler ◽  
Cirano Iochpe ◽  
Maria Inês Schmidt ◽  
Álvaro Vigo ◽  
...  

ABSTRACT CONTEXT AND OBJECTIVE: Type 2 diabetes is a chronic disease associated with a wide range of serious health complications that have a major impact on overall health. The aims here were to develop and validate predictive models for detecting undiagnosed diabetes using data from the Longitudinal Study of Adult Health (ELSA-Brasil) and to compare the performance of different machine-learning algorithms in this task. DESIGN AND SETTING: Comparison of machine-learning algorithms to develop predictive models using data from ELSA-Brasil. METHODS: After selecting a subset of 27 candidate variables from the literature, models were built and validated in four sequential steps: (i) parameter tuning with tenfold cross-validation, repeated three times; (ii) automatic variable selection using forward selection, a wrapper strategy with four different machine-learning algorithms and tenfold cross-validation (repeated three times), to evaluate each subset of variables; (iii) error estimation of model parameters with tenfold cross-validation, repeated ten times; and (iv) generalization testing on an independent dataset. The models were created with the following machine-learning algorithms: logistic regression, artificial neural network, naïve Bayes, K-nearest neighbor and random forest. RESULTS: The best models were created using artificial neural networks and logistic regression. These achieved mean areas under the curve of, respectively, 75.24% and 74.98% in the error estimation step and 74.17% and 74.41% in the generalization testing step. CONCLUSION: Most of the predictive models produced similar results, and demonstrated the feasibility of identifying individuals with the highest probability of having undiagnosed diabetes, through easily obtained clinical data.
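The repeated k-fold cross-validation used in steps (i)–(iii) above can be sketched as follows. This is an illustrative pure-Python split generator under assumed defaults (k = 10, ten repeats, as in the error-estimation step), not the authors' actual ELSA-Brasil pipeline:

```python
import random

def repeated_kfold_indices(n_samples, k=10, repeats=10, seed=0):
    """Yield (train_idx, test_idx) pairs for repeated k-fold cross-validation.

    Each repeat reshuffles the data and partitions it into k disjoint folds,
    so every sample is tested exactly once per repeat.
    """
    rng = random.Random(seed)
    indices = list(range(n_samples))
    for _ in range(repeats):
        rng.shuffle(indices)
        folds = [indices[i::k] for i in range(k)]  # k disjoint folds
        for i in range(k):
            test_idx = folds[i]
            train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
            yield train_idx, test_idx

# k folds per repeat, times `repeats`, gives k * repeats splits in total.
splits = list(repeated_kfold_indices(100, k=10, repeats=10))
print(len(splits))  # 100 splits
```

A model's score would be averaged over all 100 splits, which is what produces the mean areas under the curve reported in the results.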


Author(s):  
Anjum Nazir Qureshi Sheikh ◽  
Asha Ambhaikar ◽  
Sunil Kumar

The internet of things is a versatile technology that helps to connect devices with other devices or humans in any part of the world at any time. Some researchers claim that the number of IoT devices around the world will surpass the total population of the earth within a few years. The technology has made life easier, but these comforts come with a host of security threats. The wireless communication medium, the large volumes of data, and the resource constraints of IoT devices are some of the factors that increase their vulnerability to security threats. This chapter provides information about the attacks at different layers of the IoT architecture. It also discusses how technologies like blockchain and machine learning can help to solve the security issues of IoT.


Electronics ◽  
2021 ◽  
Vol 10 (23) ◽  
pp. 2919
Author(s):  
Rami J. Alzahrani ◽  
Ahmed Alzahrani

The recent advance in information technology has created a new era named the Internet of Things (IoT). This new technology allows objects (things), such as smart TVs, printers, cameras, smartphones, and smartwatches, to be connected to the Internet. This trend provides new services and applications for many users and enhances their lifestyle. The rapid growth of the IoT makes the incorporation and connection of several devices a predominant procedure. Although IoT devices offer many advantages, different challenges arise in the form of network anomalies. In this research, current studies on the use of deep learning (DL) for DDoS intrusion detection are presented. This research aims to implement different machine learning (ML) algorithms in the WEKA tool to analyze detection performance for DDoS attacks using the most recent CICDDoS2019 dataset, which was found to produce the best results. Six different ML algorithms are used: K-Nearest Neighbors (K-NN), support vector machine (SVM), naïve Bayes (NB), decision tree (DT), random forest (RF), and logistic regression (LR). The best accuracy in the presented evaluation was achieved with the decision tree (DT) and random forest (RF) algorithms, at 99% each. However, DT is preferable to RF because of its shorter computation time: 4.53 s versus 84.2 s. Finally, open issues for further research in future work are presented.
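The accuracy-versus-runtime trade-off the study weighs between DT and RF can be illustrated with a small evaluation harness. The classifiers and data below are toy stand-ins (a 1-nearest-neighbour model and a majority-class baseline on made-up flow features), not the WEKA setup or the CICDDoS2019 dataset:

```python
import time

def evaluate(clf_fit, clf_predict, X_tr, y_tr, X_te, y_te):
    """Fit a classifier, time the fit+predict cycle, and report accuracy."""
    t0 = time.perf_counter()
    model = clf_fit(X_tr, y_tr)
    preds = [clf_predict(model, x) for x in X_te]
    elapsed = time.perf_counter() - t0
    acc = sum(p == y for p, y in zip(preds, y_te)) / len(y_te)
    return acc, elapsed

# Toy classifiers: 1-nearest-neighbour and a majority-class baseline.
def knn_fit(X, y):
    return list(zip(X, y))

def knn_predict(model, x):
    # Predict the label of the closest training point (squared distance).
    return min(model, key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], x)))[1]

def maj_fit(X, y):
    return max(set(y), key=y.count)

def maj_predict(model, x):
    return model

# Hypothetical 2-feature flows labelled benign vs. ddos.
X_tr = [(0, 0), (0, 1), (5, 5), (5, 6)]
y_tr = ["benign", "benign", "ddos", "ddos"]
X_te = [(0, 0.5), (5, 5.5)]
y_te = ["benign", "ddos"]

for name, fit, pred in [("1-NN", knn_fit, knn_predict),
                        ("majority", maj_fit, maj_predict)]:
    acc, secs = evaluate(fit, pred, X_tr, y_tr, X_te, y_te)
    print(f"{name}: accuracy={acc:.2f}, time={secs:.4f}s")
```

The same pattern, swapping in real DT and RF implementations and real traffic features, is what yields the paper's 99% / 4.53 s versus 99% / 84.2 s comparison.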


2022 ◽  
pp. 218-237
Author(s):  
Virginia M. Miori ◽  
John Yi ◽  
Rashmi Malhotra ◽  
Ronald K. Klimberg

The use of information technology and decision support concepts at the operational business level was slow to take hold in the 20th century. In 2010, the authors documented the evolution and then-current state of the field of business intelligence and analytics (BIA). In the last decade, however, through the resurgence and mainstream use of artificial intelligence and machine learning algorithms, the development of inexpensive cloud-based mass storage, and the internet-of-things, business intelligence has evolved into data science. In this chapter, the authors trace this evolution across the diverse areas of data science and identify extremely useful advancements and best practices in the field.


Author(s):  
Phidahunlang Chyne ◽  
Parag Chatterjee ◽  
Sugata Sanyal ◽  
Debdatta Kandar

Rapid advancements in hardware, programming, and communication technologies have encouraged the development of internet-connected sensory devices that provide observations and measurements from the physical world. According to internet of things (IoT) analytics, more than 100 IoT devices across the world connect to the internet every second, which in the coming years will sharply increase the number of IoT devices by billions. This count reflects new connections and does not account for devices purchased earlier but no longer in use. As an increasing number of IoT devices make their way into the world, deployed in uncontrolled, complex, and frequently hostile environments, securing IoT systems presents numerous challenges. According to the Eclipse IoT Working Group's 2017 IoT developer survey, security is the top concern for IoT developers. To address the challenges of securing IoT devices, the authors propose using an unsupervised machine learning model at the network/transport level for anomaly detection.
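A minimal sketch of network-level unsupervised anomaly detection, assuming a simple statistical baseline over observed packet sizes; the authors' actual model is not specified in the abstract, so the threshold rule and the sample values below are purely illustrative:

```python
import statistics

def fit_baseline(packet_sizes):
    """Learn normal-traffic statistics (mean, stdev) without any labels --
    the defining property of an unsupervised approach."""
    return statistics.mean(packet_sizes), statistics.stdev(packet_sizes)

def is_anomalous(size, baseline, threshold=3.0):
    """Flag a packet whose size deviates more than `threshold` standard
    deviations from the learned baseline."""
    mean, stdev = baseline
    return abs(size - mean) / stdev > threshold

# Hypothetical packet sizes (bytes) captured during normal operation.
normal_traffic = [100, 102, 98, 101, 99, 103, 97, 100]
baseline = fit_baseline(normal_traffic)

print(is_anomalous(100, baseline))   # typical packet -> False
print(is_anomalous(1500, baseline))  # oversized packet -> True
```

In practice the feature vector would cover flow-level attributes (inter-arrival times, destination fan-out, protocol mix) rather than a single size field, but the train-on-normal, flag-deviations structure is the same.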


Sensors ◽  
2020 ◽  
Vol 20 (9) ◽  
pp. 2533 ◽  
Author(s):  
Massimo Merenda ◽  
Carlo Porcaro ◽  
Demetrio Iero

In a few years, the world will be populated by billions of connected devices placed in our homes, cities, vehicles, and industries. Devices with limited resources will interact with the surrounding environment and users. Many of these devices will be based on machine learning models to decode meaning and behavior behind sensors’ data, to implement accurate predictions and make decisions. The bottleneck will be the sheer number of connected things that could congest the network; hence the need to incorporate intelligence on end devices using machine learning algorithms. Deploying machine learning on such edge devices reduces network congestion by allowing computations to be performed close to the data sources. The aim of this work is to provide a review of the main techniques that guarantee the execution of machine learning models on resource-constrained hardware in the Internet of Things paradigm, paving the way to the Internet of Conscious Things. A detailed review of models, architectures, and requirements for solutions that implement edge machine learning on Internet of Things devices is presented, with the main goal of defining the state of the art and envisioning development requirements. Furthermore, an example of an edge machine learning implementation on a microcontroller is provided, commonly regarded as the machine learning “Hello World”.
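One of the core techniques for fitting models onto constrained edge hardware is weight quantization. A minimal sketch of symmetric int8 quantization follows; this is an illustration of the general technique, not the specific method surveyed in the paper:

```python
def quantize_int8(weights):
    """Symmetric linear quantization of float weights to int8.

    Stores each weight in 1 byte instead of 4, the kind of 4x size
    reduction needed to fit a model into microcontroller flash.
    """
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights for inference."""
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.99, -0.07]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))

print(q)        # int8 values, 1 byte each instead of 4
print(max_err)  # rounding error, bounded by scale / 2
```

Frameworks aimed at microcontrollers apply this idea per-tensor or per-channel and pair it with integer-only inference kernels, so the device never needs floating-point hardware at all.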


Telecom IT ◽  
2019 ◽  
Vol 7 (3) ◽  
pp. 50-55
Author(s):  
D. Saharov ◽  
D. Kozlov

The article deals with the CoAP protocol that regulates the transmission and reception of information traffic by terminal devices in IoT networks. The article describes a model for detecting abnormal traffic in 5G/IoT networks using machine learning algorithms, as well as the main methods for solving this problem. The relevance of the article is due to the wide spread of the internet of things and the upcoming upgrade of mobile networks to the 5G generation.


2021 ◽  
Vol 13 (13) ◽  
pp. 2433
Author(s):  
Shu Yang ◽  
Fengchao Peng ◽  
Sibylle von Löwis ◽  
Guðrún Nína Petersen ◽  
David Christian Finger

Doppler lidars are used worldwide for wind monitoring and, recently, also for the detection of aerosols. Automated algorithms that classify signals retrieved from lidar measurements are very useful for users. In this study, we explore the value of machine learning for classifying backscattered signals from Doppler lidars using data from Iceland. We combined supervised and unsupervised machine learning algorithms with conventional lidar data processing methods and trained two models to filter noise and classify Doppler lidar observations into different classes, including clouds, aerosols, and rain. The results reveal high accuracy for noise identification and for aerosol and cloud classification; precipitation detection, however, is underestimated. The method was tested on datasets from two instruments under different weather conditions, including three dust storms during the summer of 2019. Our results reveal that this method can provide efficient, accurate, real-time classification of lidar measurements. Accordingly, we conclude that machine learning can open new opportunities for lidar data end users, such as aviation safety operators, to monitor dust in the vicinity of airports.
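The two-stage filter-noise-then-classify pipeline described above can be sketched as follows. The rule-based classifier, field names, and all thresholds below are hypothetical stand-ins for the trained machine learning models in the study; they only show the shape of the pipeline:

```python
def filter_noise(signals, snr_threshold=1.0):
    """Stage 1: drop range gates whose signal-to-noise ratio is too low."""
    return [s for s in signals if s["snr"] >= snr_threshold]

def classify(signal):
    """Stage 2: assign a class from backscatter intensity and Doppler velocity.

    Toy rules: strong returns with high fall speed look like rain, strong
    slow-moving returns like cloud, weak valid returns like aerosol.
    """
    if signal["backscatter"] > 1e-4 and abs(signal["velocity"]) > 2.0:
        return "rain"
    if signal["backscatter"] > 1e-4:
        return "cloud"
    return "aerosol"

# Hypothetical range gates from a single lidar profile.
gates = [
    {"snr": 0.2, "backscatter": 1e-7, "velocity": 0.1},   # noise, filtered out
    {"snr": 5.0, "backscatter": 5e-4, "velocity": 0.3},   # cloud-like
    {"snr": 4.0, "backscatter": 3e-4, "velocity": -4.0},  # rain-like
    {"snr": 2.0, "backscatter": 2e-6, "velocity": 0.5},   # aerosol-like
]
valid = filter_noise(gates)
print([classify(g) for g in valid])  # ['cloud', 'rain', 'aerosol']
```

In the study itself, both stages are learned models rather than fixed thresholds, which is what allows the classification to adapt across instruments and weather conditions.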

