Review on Data Securing Techniques for Internet of Medical Things

2021 ◽  
Vol 3 (3) ◽  
pp. 177-191
Author(s):  
R. Kanthavel

In recent years, the Internet of Things (IoT) has grown dramatically and found a wide range of applications, one of which is health care. IoT helps manage and optimize healthcare systems, but it also brings security problems: many privacy issues arise from IoT, and in some cases they put patients' lives at risk. To address this issue, an architecture named the Internet of Medical Things (IoMT) is needed. In this paper, we discuss the problems faced by healthcare systems and the authentication approaches used by the Internet of Medical Things. Machine learning approaches are used to improve system performance.

2021 ◽  
Vol 7 ◽  
pp. e414
Author(s):  
Shilan S. Hameed ◽  
Wan Haslina Hassan ◽  
Liza Abdul Latiff ◽  
Fahad Ghabban

Background: The Internet of Medical Things (IoMT) is gradually replacing the traditional healthcare system. However, little attention has been paid to security requirements in the development of IoMT devices and systems, partly because of the difficulty of tuning conventional security solutions to the IoMT setting. Machine learning (ML) has been successfully employed in attack detection and mitigation, and advanced ML techniques are a promising approach to address existing and anticipated IoMT security and privacy issues. However, given the existing challenges of IoMT systems, it is imperative to know how these techniques can be effectively utilized to meet security and privacy requirements without degrading the quality, services, and device lifespan of IoMT systems. Methodology: This article performs a Systematic Literature Review (SLR) on the security and privacy issues of IoMT and their ML-based solutions. Research papers disseminated between 2010 and 2020 were selected from multiple databases, and a standardized SLR method was conducted. A total of 153 papers were reviewed and critically analyzed. Furthermore, this review attempts to highlight the limitations of current methods and to find possible solutions to them; a detailed analysis was therefore carried out on the selected papers, focusing on their methods, advantages, limitations, tools, and data. Results: It was observed that ML techniques have been significantly deployed for device- and network-layer security. Most of the current studies improved traditional metrics but ignored performance-complexity metrics in their evaluations, and their study environments and data barely represent IoMT systems. Therefore, conventional ML techniques may fail if metrics such as resource complexity and power usage are not considered.


2021 ◽  
pp. 002073142110174
Author(s):  
Md Mijanur Rahman ◽  
Fatema Khatun ◽  
Ashik Uzzaman ◽  
Sadia Islam Sami ◽  
Md Al-Amin Bhuiyan ◽  
...  

The novel coronavirus disease (COVID-19) has spread across 219 countries as a pandemic, creating alarming impacts on health care, socioeconomic environments, and international relationships. The principal objective of this study is to survey the current technological aspects of artificial intelligence (AI) and other relevant technologies and their implications for confronting COVID-19 and preventing the pandemic's dreadful effects. This article presents AI approaches that have made significant contributions to health care, then highlights and categorizes their applications in confronting COVID-19: detection and diagnosis, data analysis and treatment procedures, research and drug development, social control and services, and the prediction of outbreaks. The study addresses the link between the technologies and epidemics, as well as the potential impacts of technology on health care through the introduction of machine learning and natural language processing tools. It is expected that this comprehensive study will support researchers in modeling healthcare systems and drive further studies in advanced technologies. Finally, we propose future research directions and conclude that persuasive AI strategies, probabilistic models, and supervised learning are required to tackle future pandemic challenges.


Sensors ◽  
2018 ◽  
Vol 18 (9) ◽  
pp. 2784 ◽  
Author(s):  
Chao Qu ◽  
Ming Tao ◽  
Ruifen Yuan

With the fast development and expansion of the Internet of Things (IoT), billions of smart devices are continuously being connected, and smart homes, as a typical IoT application, provide people with various conveniences but face security and privacy issues. Blockchain (BC) theory offers a potential solution to the IoT security problem: the emergence of blockchain technology has brought about a shift to decentralized management, providing an effective means of protecting network security and privacy. On the other hand, the smart devices in IoT are typically lightweight, with limited energy and memory, which makes applying blockchain difficult. Against this background, this paper proposes a blockchain model based on hypergraphs, aiming to reduce storage consumption and to solve additional security issues. In the model, we use the hyperedge as the organization of storage nodes and convert full-network data storage into partial-network storage. We discuss the design of the model and its security strategy in detail, introduce some use cases in a smart home network, and evaluate the storage performance of the model through simulation experiments and a network evaluation.
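The hyperedge-as-storage-group idea above can be sketched in a few lines. The following toy sketch (the device names, the two-hyperedge layout, and the round-robin shard assignment are all illustrative assumptions, not the authors' protocol) groups devices into hyperedges and lets each hyperedge hold only its shard of a hash-linked chain, instead of every node storing the full ledger:

```python
import hashlib

class Block:
    """Minimal hash-linked block: payload plus the previous block's hash."""
    def __init__(self, data, prev_hash):
        self.data, self.prev_hash = data, prev_hash
        self.hash = hashlib.sha256((data + prev_hash).encode()).hexdigest()

# Hypothetical smart-home devices grouped into two hyperedges.
hyperedges = {"e1": ["dev1", "dev2"], "e2": ["dev3", "dev4"]}
shards = {e: [] for e in hyperedges}   # each hyperedge stores only its shard

prev = "0" * 64
for i, data in enumerate(["lock:on", "temp:21", "door:open", "cam:off"]):
    b = Block(data, prev)
    prev = b.hash                      # the chain itself stays fully linked
    # Round-robin: assign each new block to one hyperedge's shard.
    shards[list(hyperedges)[i % 2]].append(b)

print({e: len(s) for e, s in shards.items()})
```

The chain remains verifiable end to end via the hashes, while storage per hyperedge is roughly the total divided by the number of hyperedges, which is the consumption reduction the model targets.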


2021 ◽  
Vol 12 (2) ◽  
pp. 1-12
Author(s):  
Nan Wang ◽  
Evangelos Katsamakas

Companies seek to leverage data and people analytics to maximize the business value of their talent. This article proposes a recommendation system for personalized workload assignment in the context of people analytics. The system follows a novel two-level hybrid architecture; we evaluate its performance in a series of computational experiments and discuss future extensions. Overall, the proposed system could create significant business value as a decision-support system that helps managers make better decisions, demonstrating how computational and machine learning approaches can complement humans in improving organizational performance.


Electronics ◽  
2021 ◽  
Vol 10 (16) ◽  
pp. 1955
Author(s):  
Ikram Sumaiya Thaseen ◽  
Vanitha Mohanraj ◽  
Sakthivel Ramachandran ◽  
Kishore Sanapala ◽  
Sang-Soo Yeo

In recent years, different variants of botnets have been targeting government and private organizations, and there is a crucial need to develop a robust framework for securing IoT (Internet of Things) networks. In this paper, a Hadoop-based framework is proposed to identify malicious IoT traffic using modified Tomek-link under-sampling integrated with automated hyper-parameter tuning of machine learning classifiers. The novelty of this paper is the use of a big-data platform for benchmark IoT datasets to minimize computational time. The IoT benchmark datasets are loaded into the Hadoop Distributed File System (HDFS) environment. Three machine learning approaches, namely naive Bayes (NB), k-nearest neighbor (KNN), and support vector machine (SVM), are used for categorizing IoT traffic. Artificial immune network optimization is deployed during cross-validation to obtain the best classifier parameters. Experimental analysis is performed on the Hadoop platform. Average accuracies of 99% and 90% are obtained for the BoT-IoT and ToN-IoT datasets, respectively. The accuracy difference on the ToN-IoT dataset is due to the huge number of data samples captured at the edge and fog layers, whereas for the BoT-IoT dataset only 5% of the training and test samples from the complete dataset, as released by the dataset developers, are considered for experimental analysis. The overall accuracy is improved by 19% in comparison with state-of-the-art techniques, and the computational time for the huge datasets is reduced by 3-4 hours through MapReduce in HDFS.
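The Tomek-link under-sampling step can be illustrated on a single machine. The sketch below is a simplified stand-in for the paper's Hadoop pipeline: the synthetic "traffic" data, the plain SVM, and the standard (unmodified) Tomek-link rule are assumptions for illustration. It finds mutual-nearest-neighbour pairs with different labels and drops the majority-class member of each pair before cross-validating a classifier:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic imbalanced data: 200 "benign" vs 40 "malicious" flows.
X = np.vstack([rng.normal(0.0, 1.0, (200, 4)), rng.normal(1.5, 1.0, (40, 4))])
y = np.array([0] * 200 + [1] * 40)

# Tomek links: pairs of mutual nearest neighbours with different labels.
nn = NearestNeighbors(n_neighbors=2).fit(X)
nearest = nn.kneighbors(X, return_distance=False)[:, 1]  # skip self at [:, 0]
is_link = (nearest[nearest] == np.arange(len(X))) & (y != y[nearest])

# Drop only the majority-class (benign) member of each link.
keep = ~(is_link & (y == 0))
X_res, y_res = X[keep], y[keep]

# Cross-validate a classifier on the cleaned set (hyper-parameters fixed
# here; the paper tunes them with artificial immune network optimization).
scores = cross_val_score(SVC(C=1.0, gamma="scale"), X_res, y_res, cv=5)
print(round(scores.mean(), 2))
```

Removing the majority-class side of each link cleans the class boundary without discarding minority samples, which is why the technique suits imbalanced intrusion data.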


2020 ◽  
Author(s):  
Mazin Mohammed ◽  
Karrar Hameed Abdulkareem ◽  
Mashael S. Maashi ◽  
Salama A. Mostafa ◽  
Abdullah Baz ◽  
...  

BACKGROUND: In recent times, global concern has been caused by the coronavirus (COVID-19), which is considered a global health threat due to its rapid spread across the globe. Machine learning (ML) is a computational method that can be used to automatically learn from experience and improve the accuracy of predictions. OBJECTIVE: In this study, machine learning was applied to a coronavirus dataset of 50 X-ray images to enable the development of detection modalities and directions with risk causes. The dataset contains a wide range of samples of COVID-19 cases alongside SARS, MERS, and ARDS. The experiment was carried out using a total of 50 X-ray images, of which 25 were positive COVID-19 cases and the other 25 were normal cases. METHODS: The Orange tool was used for data manipulation. To classify patients as coronavirus carriers or non-carriers, this tool was employed in developing and analysing seven types of predictive models, including artificial neural network (ANN), support vector machine (SVM) with linear kernel and radial basis function (RBF), k-nearest neighbour (k-NN), decision tree (DT), and CN2 rule inducer. Furthermore, the standard InceptionV3 model was used for feature extraction. RESULTS: The various machine learning techniques were trained on the coronavirus disease 2019 (COVID-19) dataset with improved ML parameters. The dataset was divided into two parts: the model was trained on 70% of the dataset, while the remaining 30% was used for testing. The results show that the improved SVM achieved an F1-score of 97% and an accuracy of 98%. CONCLUSIONS: In this study, seven models have been developed to aid the detection of coronavirus. In such cases, learning performance can be improved through knowledge transfer, whereby time-consuming data-labelling efforts are not required. The evaluations of all the models were done in terms of different parameters; all the models performed well, but the SVM demonstrated the best result on the accuracy metric. Future work will compare classical approaches with deep learning ones and try to obtain better results. CLINICALTRIAL: None
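The 70/30 evaluation protocol described above is straightforward to reproduce in outline. This sketch substitutes random vectors for the InceptionV3 features of the 50 X-ray images (an assumption purely for illustration; the class separation is synthetic) and scores an RBF-kernel SVM with accuracy and F1:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(1)
# Stand-in for extracted feature vectors of 50 images, 25 per class.
X = np.vstack([rng.normal(0.0, 1.0, (25, 64)), rng.normal(0.8, 1.0, (25, 64))])
y = np.array([0] * 25 + [1] * 25)

# 70% train / 30% test, stratified so both classes appear in each split.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
pred = clf.predict(X_te)

acc = accuracy_score(y_te, pred)
f1 = f1_score(y_te, pred)
print(round(acc, 2), round(f1, 2))
```

With only 50 images a single 70/30 split is high-variance, which is one reason cross-validation or knowledge transfer, as the conclusion notes, is attractive at this scale.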


2021 ◽  
Author(s):  
Zhen Chen ◽  
Pei Zhao ◽  
Chen Li ◽  
Fuyi Li ◽  
Dongxu Xiang ◽  
...  

Abstract Sequence-based analysis and prediction are fundamental bioinformatic tasks that facilitate understanding of the sequence(-structure)-function paradigm for DNAs, RNAs and proteins. Rapid accumulation of sequences requires equally pervasive development of new predictive models, which depends on the availability of effective tools that support these efforts. We introduce iLearnPlus, the first machine-learning platform with graphical- and web-based interfaces for the construction of machine-learning pipelines for analysis and predictions using nucleic acid and protein sequences. iLearnPlus provides a comprehensive set of algorithms and automates sequence-based feature extraction and analysis, construction and deployment of models, assessment of predictive performance, statistical analysis, and data visualization; all without programming. iLearnPlus includes a wide range of feature sets which encode information from the input sequences and over twenty machine-learning algorithms that cover several deep-learning approaches, outnumbering the current solutions by a wide margin. Our solution caters to experienced bioinformaticians, given the broad range of options, and biologists with no programming background, given the point-and-click interface and easy-to-follow design process. We showcase iLearnPlus with two case studies concerning prediction of long noncoding RNAs (lncRNAs) from RNA transcripts and prediction of crotonylation sites in protein chains. iLearnPlus is an open-source platform available at https://github.com/Superzchen/iLearnPlus/ with the webserver at http://ilearnplus.erc.monash.edu/.
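Although iLearnPlus itself is point-and-click, the kind of sequence-based feature extraction it automates can be illustrated directly. The function below is a generic example, not the iLearnPlus API: it computes a normalized k-mer composition vector, one of the classic encodings for nucleotide sequences such as the RNA transcripts in the lncRNA case study:

```python
from itertools import product
from collections import Counter

def kmer_composition(seq, k=2, alphabet="ACGU"):
    """Normalized k-mer frequency vector over a fixed alphabet."""
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(len(seq) - k + 1, 1)
    return [counts[m] / total for m in kmers]

vec = kmer_composition("ACGUACGU", k=2)
print(len(vec), round(sum(vec), 2))  # → 16 1.0
```

Each sequence maps to a fixed-length numeric vector (here 4^2 = 16 dimensions), which is what lets downstream ML algorithms consume variable-length sequences.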


Author(s):  
Rajasekaran Thangaraj ◽  
Sivaramakrishnan Rajendar ◽  
Vidhya Kandasamy

Healthcare monitoring has become a popular research area in recent years. The evolution of electronic devices has brought out numerous wearable devices that can be used in a variety of healthcare monitoring systems. These devices measure a patient's health parameters and send them for further processing, where the acquired data is analyzed; the analysis provides patients or their relatives with the required medical support or with predictions based on the acquired data. Cloud computing, deep learning, and machine learning technologies play prominent roles in processing and analyzing the data. This chapter aims to provide a detailed study of IoT-based healthcare systems, the variety of sensors used to measure health parameters, and the various deep learning and machine learning approaches introduced for the diagnosis of different diseases. The chapter also highlights the challenges, open issues, and performance considerations for future IoT-based healthcare research.

