Improving the Security and Confidentiality in the Internet of Medical Things Based on Edge Computing Using Clustering

2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Anita Hatamian ◽  
Mohammad Bagher Tavakoli ◽  
Masoud Moradkhani

Families, physicians, and hospital environments use remote patient monitoring (RPM) technologies to remotely monitor a patient’s vital signs, reduce visit time, lower hospital costs, and improve the quality of care. The Internet of Medical Things (IoMT) comprises applications that provide remote access to patients’ physiological data; IoMT devices basically consist of a user interface, biosensors, and Internet connectivity. Accordingly, by integrating IoMT with the data communication infrastructure of edge computing, medical data can be recorded, transferred, stored, and processed in a short time. (Edge computing is a distributed computing paradigm that brings computation and data storage closer to the sources of data, which is expected to improve response times and save bandwidth; note that edge computing and IoT are not synonymous.) However, this approach faces problems with security and intrusion into users’ confidential medical data. Accordingly, this study presents a secure solution for IoT infrastructure in edge computing. In the proposed method, the clustering process is first performed using information about the characteristics and interests of users. The people in each cluster are then evaluated using edge computing, and people with higher scores are considered influential in their cluster. Since users with high user interaction can publish information on a large scale, increasing user interaction allows information to be disseminated more widely without intrusion, and thus safely, across the network. The proposed method uses the average of user interactions and user scores as the criterion for identifying influential people in each cluster.
If a given number of people is to start disseminating information, the people in each cluster with the highest degree of influence can be selected to do so. According to the research results, accuracy increases by 0.2 and more information is published with the proposed method than with previous methods.
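The abstract does not give the exact scoring formulas, but the selection step it describes can be sketched as follows. The field names and the simple averaging of interaction level and evaluation score are illustrative assumptions, not the paper's method:

```python
from collections import defaultdict

def influential_per_cluster(users, k=1):
    """Pick the k most influential users in each cluster.

    `users` is a list of dicts with hypothetical fields: a cluster id,
    an interaction count, and an evaluation score (the abstract combines
    the average of user interactions and user scores)."""
    clusters = defaultdict(list)
    for u in users:
        # Assumed influence criterion: the mean of the user's
        # interaction level and their evaluation score.
        u = dict(u, influence=(u["interactions"] + u["score"]) / 2)
        clusters[u["cluster"]].append(u)
    return {
        cid: sorted(members, key=lambda m: m["influence"], reverse=True)[:k]
        for cid, members in clusters.items()
    }

users = [
    {"name": "a", "cluster": 0, "interactions": 40, "score": 0.9},
    {"name": "b", "cluster": 0, "interactions": 10, "score": 0.5},
    {"name": "c", "cluster": 1, "interactions": 25, "score": 0.7},
]
seeds = influential_per_cluster(users)  # one seed user per cluster
```

The selected seeds would then start disseminating information within their clusters.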

Author(s):  
S. Gopikrishnan ◽  
P. Priakanth ◽  
Gautam Srivastava ◽  
Giancarlo Fortino

2019 ◽  
Vol 11 (4) ◽  
pp. 100 ◽  
Author(s):  
Maurizio Capra ◽  
Riccardo Peloso ◽  
Guido Masera ◽  
Massimo Ruo Roch ◽  
Maurizio Martina

In today’s world, ruled by a great amount of data and mobile devices, cloud-based systems are spreading everywhere. This phenomenon increases the number of connected devices, the broadcast bandwidth, and the information exchanged. These fine-grained interconnected systems, which enable Internet connectivity for an extremely large number of facilities (far beyond the current number of devices), go by the name of the Internet of Things (IoT). In this scenario, a mobile device’s operating time is proportional to its battery capacity, the number of operations performed per cycle, and the amount of data exchanged. Since transmitting data to a central cloud is a very energy-hungry operation, new computational paradigms have been implemented: the computation is no longer performed entirely in the cloud, distributing the power load among the nodes of the system, and data are compressed to reduce transmission power requirements. In the edge-computing paradigm, part of the computational power is moved toward the data collection sources, and collected data are sent to the central cloud server only after a first elaboration. Indeed, the “edge” refers to the extremities of the system, represented by the IoT devices. This survey paper presents the hardware architectures of typical IoT devices and sums up many of the low-power techniques that make them appealing for a wide range of applications. An overview of the newest research topics is given, together with a final example of a complete functioning system embedding all the introduced features.
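As a minimal illustration of the edge paradigm described above, in which a node elaborates data locally and sends only a reduced payload to the cloud, the sketch below pre-aggregates raw sensor samples; the window size and summary statistics are hypothetical choices, not from the survey:

```python
def edge_summarize(samples, window=8):
    """Pre-aggregate raw sensor samples at the edge node so that only a
    compact summary crosses the radio link; transmission typically
    dominates an IoT node's energy budget."""
    out = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        out.append({
            "n": len(chunk),          # samples represented by this record
            "min": min(chunk),
            "max": max(chunk),
            "mean": sum(chunk) / len(chunk),
        })
    return out  # e.g. 8x fewer payloads than forwarding raw samples
```

Each summary record replaces a whole window of raw readings, trading a small amount of edge computation for a large reduction in transmitted data.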


2020 ◽  
Vol 2020 ◽  
pp. 1-9 ◽  
Author(s):  
Maria-Dolores Cano ◽  
Antonio Cañavate-Sanchez

The disclosure of personal and private information is one of the main challenges of the Internet of Medical Things (IoMT). Most IoMT-based services, applications, and platforms follow a common architecture in which wearables or other medical devices capture data that are forwarded to the cloud. In this scenario, edge computing brings new opportunities to enhance the operation of IoMT. However, despite the benefits, the inherent characteristics of edge computing require countermeasures for the security and privacy issues that IoMT gives rise to. The restrictions of IoT devices in terms of battery, memory, hardware resources, and computing capabilities have led to a common agreement on the use of elliptic curve cryptography (ECC), in hardware or software implementations. As an example, the elliptic curve digital signature algorithm (ECDSA) is widely used by IoT devices to compute digital signatures. On the other hand, dual signature is a well-known and effective method of providing consumer privacy in classic e-commerce services. This article joins both approaches. It presents a novel solution to enhance security and preserve data privacy in communications between IoMT devices and the cloud via edge computing devices. While data source anonymity is achieved from the cloud’s perspective, integrity and origin authentication of the collected data are also provided. In addition, computational requirements and complexity are kept to a minimum.
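The dual-signature construction that the article builds on, known from the classic SET e-commerce protocol, can be sketched as follows. The digest linking is the standard construction; the HMAC call merely stands in for the ECDSA signature an IoMT device would actually compute, and all keys and payloads are placeholders:

```python
import hashlib
import hmac

def dual_digest(part_a: bytes, part_b: bytes) -> bytes:
    # Classic dual signature: hash each part separately, concatenate
    # the digests, and hash again; the signer signs this final value.
    h1 = hashlib.sha256(part_a).digest()
    h2 = hashlib.sha256(part_b).digest()
    return hashlib.sha256(h1 + h2).digest()

# HMAC stands in here for an ECDSA signature over the dual digest;
# a real IoMT device would sign with its elliptic-curve private key.
key = b"device-private-key-placeholder"
vitals = b'{"hr": 72, "spo2": 98}'        # data the cloud may read
identity = b'{"patient": "anon-42"}'      # data the cloud must not read
signature = hmac.new(key, dual_digest(vitals, identity),
                     hashlib.sha256).hexdigest()
```

Given only the vitals, `H(identity)`, and the signature, a verifier can check integrity and origin of the vitals without ever seeing the identity part, which is the privacy property the article exploits.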


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2410
Author(s):  
Muhammad Firdaus ◽  
Sandi Rahmadika ◽  
Kyung-Hyune Rhee

The emergence of the Internet of Vehicles (IoV) aims to facilitate the next generation of intelligent transportation system (ITS) applications by combining smart vehicles and the Internet to improve traffic safety and efficiency. Meanwhile, mobile edge computing (MEC) technology provides enormous storage resources and powerful computing on edge networks. Hence, the idea of IoV edge computing (IoVEC) networks has grown into a promising paradigm with various opportunities to advance massive data storage, data sharing, and computation close to vehicles. However, participating vehicles may be unwilling to share their data, since the data-sharing system still relies on a centralized server, with the attendant risks of data leakage and privacy violations. In addition, vehicles have difficulty evaluating the credibility of the messages they receive because of the untrusted environment. To address these challenges, we propose consortium blockchain and smart contracts to realize a decentralized, trusted data-sharing management system in IoVEC. This system allows vehicles to validate the credibility of messages from their neighbors by generating a reputation rating. Moreover, an incentive mechanism encourages vehicles to store and share their data honestly, as they obtain rewards from the system for doing so. Simulation results demonstrate efficient network performance and an appropriate incentive model for achieving decentralized, trusted data-sharing management in IoVEC networks.
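The abstract does not specify the reputation formula, so the sketch below uses an assumed exponentially decayed average of neighbor ratings plus a trust threshold, purely to illustrate how a vehicle might rate the credibility of a received message:

```python
def reputation(ratings, decay=0.9):
    """Aggregate a sender's past ratings (1 = message confirmed true,
    0 = false) into a reputation value, weighting recent interactions
    more heavily. The decayed-average scheme is a hypothetical stand-in
    for the paper's on-chain reputation rating."""
    score, weight = 0.0, 0.0
    for age, r in enumerate(reversed(ratings)):  # newest rating first
        w = decay ** age
        score += w * r
        weight += w
    return score / weight if weight else 0.0

def accept_message(sender_ratings, threshold=0.6):
    # A vehicle trusts a received message only if the sender's
    # reputation, as recorded on the shared ledger, clears a threshold.
    return reputation(sender_ratings) >= threshold
```

In the decentralized setting the rating history would live on the consortium blockchain, so every vehicle evaluates the same reputation value without trusting a central server.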


2020 ◽  
Author(s):  
Dhairya Patel ◽  
Sabah Mohammed

This work develops a Smart Factory model based on cloud and edge computing, used to build a Transportation Management System (TMS) with an iFogSim wrapper. Cloud computing provides on-demand computer system services, including data storage and processing power, from shared data centres without direct active user management. In a smart factory, many devices are connected together across the Internet, and vast volumes of data are collected throughout the production process; a smart factory based on cloud and edge computing is used to handle these data. The intelligent cloud-based factory offers facilities such as large-scale data analysis. Concepts such as fog and edge computing play a significant role in extending the cloud’s data storage and network capacities, addressing challenges such as saturated bandwidth and latency. The work also covers the implementation of the TMS in the iFogSim simulator, which reports the resources consumed during execution and thereby indicates how to use them optimally. All TMS-related data, such as object location, time taken, and energy consumption, are collected in the cloud through the smart factory. To implement the TMS, we created a topology that displays the various devices connected to the cloud and provides the necessary information about the ongoing transportation simulation.


Author(s):  
Aleksandar Tošić ◽  
Jernej Vičič ◽  
Michael David Burnard ◽  
Michael Mrissa

The Internet of Things (IoT) is experiencing widespread adoption across industry sectors ranging from supply chain management to smart cities, buildings, and health monitoring. However, most software architectures for IoT deployment rely on centralized cloud computing infrastructures to provide storage and computing power, as cloud providers have high economic incentives to organize their infrastructure into clusters. Despite these incentives, there has been a recent shift from centralized to decentralized architecture that harnesses the potential of edge devices, reduces network latency, and lowers infrastructure cost to support IoT applications. This shift has resulted in new edge computing architectures, but many still rely on centralized solutions for managing applications. A truly decentralized approach would offer interesting properties required for IoT use cases. To address these concerns, we introduce a decentralized architecture tailored for large scale deployments of peer-to-peer IoT sensor networks and capable of run-time application migration. The solution combines a blockchain consensus algorithm and verifiable random functions to ensure scalability, fault tolerance, transparency, and no single point of failure. We build on our previously presented theoretical simulations with many protocol improvements and an implementation tested in a use case related to monitoring a Slovenian cultural heritage building located in Bled, Slovenia.
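A common way to realize the coordinator-free host selection described above is to let every node derive a pseudo-random draw from a public round seed. The hash-based draw below only imitates a VRF (it omits the verifiable proof) and is an illustrative assumption, not the authors' protocol:

```python
import hashlib

def vrf_like_draw(secret: bytes, round_seed: bytes) -> float:
    """Hash-based stand-in for a verifiable random function: each node
    derives a pseudo-random draw from its secret and the public round
    seed. A real VRF additionally yields a proof that any peer can
    verify without knowing the secret."""
    h = hashlib.sha256(secret + round_seed).digest()
    return int.from_bytes(h[:8], "big") / 2**64  # value in [0, 1)

def elect_host(nodes, round_seed: bytes):
    # The node with the smallest draw hosts the migrated application
    # this round; every node can compute the winner independently, so
    # no coordinator (single point of failure) is needed.
    return min(nodes, key=lambda n: vrf_like_draw(n["secret"], round_seed))
```

Because the seed changes every round, the hosting role rotates unpredictably across the peer-to-peer network while remaining deterministic for all honest participants.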


Author(s):  
Dazhong Wu ◽  
Janis Terpenny ◽  
Li Zhang ◽  
Robert Gao ◽  
Thomas Kurfess

Over the past few decades, both small- and medium-sized manufacturers as well as large original equipment manufacturers (OEMs) have been faced with an increasing need for low cost and scalable intelligent manufacturing machines. Capabilities are needed for collecting and processing large volumes of real-time data generated from manufacturing machines and processes as well as for diagnosing the root cause of identified defects, predicting their progression, and forecasting maintenance actions proactively to minimize unexpected machine down times. Although cloud computing enables ubiquitous and instant remote access to scalable information and communication technology (ICT) infrastructures and high volume data storage, it has limitations in latency-sensitive applications such as high performance computing and real-time stream analytics. The emergence of fog computing, Internet of Things (IoT), and cyber-physical systems (CPS) represent radical changes in the way sensing systems, along with ICT infrastructures, collect and analyze large volumes of real-time data streams in geographically distributed environments. Ultimately, such technological approaches enable machines to function as an agent that is capable of intelligent behaviors such as automatic fault and failure detection, self-diagnosis, and preventative maintenance scheduling. The objective of this research is to introduce a fog-enabled architecture that consists of smart sensor networks, communication protocols, parallel machine learning software, and private and public clouds. The fog-enabled architecture will have the potential to enable large-scale, geographically distributed online machine and process monitoring, diagnosis, and prognosis that require low latency and high bandwidth in the context of data-driven cyber-manufacturing systems.


2014 ◽  
Vol 1051 ◽  
pp. 573-577
Author(s):  
Jin Dong Yu ◽  
He Huang ◽  
Guang Rong Chen

On-line water quality monitoring is a necessary means of controlling water environment pollution and provides effective data for research on aquatic ecological restoration. Using license-free ZigBee wireless communication modules, we transmit data from sensors on the river bank to the data storage center. A protocol converter circuit was designed around the serial ports of a micro-controller: in software, Modbus RTU protocol packets on the RS-485 bus are converted into the ZigBee communication protocol and then forwarded to the Internet, with GSM as a supplement to enhance data reliability. The resulting water quality on-line monitoring system, based on multiple data communication channels, makes real-time river water quality browsable over the Internet and fast and convenient to export for research.
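The Modbus RTU frames that such a converter parses carry a CRC-16 checksum, which should be verified before the payload is re-wrapped for the ZigBee link. The CRC-16/Modbus computation is standard; the example request below is illustrative:

```python
def modbus_crc16(frame: bytes) -> bytes:
    """CRC-16/Modbus over an RTU frame, returned little-endian as it is
    appended on the wire (init 0xFFFF, reflected polynomial 0xA001)."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc.to_bytes(2, "little")

# Read-holding-registers request to slave 1: function 0x03,
# start register 0x0000, count 0x0002 (e.g. two sensor channels).
request = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
full_frame = request + modbus_crc16(request)  # bytes sent on RS-485
```

On receive, the converter recomputes the CRC over all but the last two bytes and drops the frame on a mismatch, so only validated sensor data reach the ZigBee and GSM links.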




Author(s):  
Hadrian Peter

Over the past ten years or so, data warehousing has emerged as a new technology in the database environment. “A data warehouse is a global repository that stores pre-processed queries on data which resides in multiple, possibly heterogeneous, operational or legacy sources” (Samtani et al., 2004). Data warehousing as a specialized field continues to grow and mature. Despite phenomenal upgrades in data storage capability, a flood of new streams of data has been entering the warehouse: during the last decade, environments have grown from 1 terabyte to 100 terabytes, and will soon reach 1 petabyte. The ability to search, mine, and analyze data of such immense proportions therefore remains a significant issue, even as analytical capabilities increase. The data warehouse is an environment readily tuned to maximize the efficiency of making useful decisions. However, the advent of large-scale commercial use of the Internet has opened new possibilities for data capture and integration into the warehouse. While most of the data necessary for a data warehouse originates from the organization’s internal (operational) data sources, additional data available externally can add significant value to the warehouse. One of the major reasons organizations implement data warehousing is to make it easier to query and report data, on a regular basis, from multiple transaction processing systems and/or from external sources. One important source of such external data is the Internet. A few researchers (Walters, 1997; Strand & Olsson, 2004; Strand & Wangler, 2004) have investigated incorporating external data in data warehouses; however, there is little literature detailing research in which the Internet is the source of the external data. In (Peter & Greenidge, 2005) a high-level model, the Data Warehousing Search Engine (DWSE), was presented.
In this article we examine in some detail the issues in search engine technology that make the Internet a plausible and reliable source of external data. As John Ladley (Ladley, 2005) states, “There is a new generation of Data Warehousing on the horizon that reflects maturing technology and attitudes”. Our long-term goal is to design this new-generation Data Warehouse.

