ACM Transactions on Internet of Things
Latest Publications


TOTAL DOCUMENTS

64
(FIVE YEARS 64)

H-INDEX

2
(FIVE YEARS 2)

Published By Association For Computing Machinery (ACM)

2577-6207

2022 ◽  
Vol 3 (1) ◽  
pp. 1-23
Author(s):  
Mao V. Ngo ◽  
Tie Luo ◽  
Tony Q. S. Quek

The advances in deep neural networks (DNN) have significantly enhanced real-time detection of anomalous data in IoT applications. However, the complexity-accuracy-delay dilemma persists: complex DNN models offer higher accuracy, but typical IoT devices can barely afford the computation load, and offloading the load to the cloud incurs long delays. In this article, we address this challenge by proposing an adaptive anomaly detection scheme with hierarchical edge computing (HEC). Specifically, we first construct multiple anomaly detection DNN models of increasing complexity and associate each of them with a corresponding HEC layer. Then, we design an adaptive model selection scheme that is formulated as a contextual-bandit problem and solved using a reinforcement learning policy network. We also incorporate a parallel policy training method that accelerates training by taking advantage of the distributed models. We build an HEC testbed using real IoT devices and implement and evaluate our contextual-bandit approach on both univariate and multivariate IoT datasets. In comparison with baseline and state-of-the-art schemes, our adaptive approach strikes the best accuracy-delay tradeoff on the univariate dataset and achieves the best accuracy and F1-score on the multivariate dataset, with only a negligibly longer delay than the best (but inflexible) scheme.
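
A minimal sketch (not the authors' code) of the contextual-bandit model selection described above: a small policy network maps context features from an input window to a distribution over K HEC layers, i.e., K anomaly-detection models of rising complexity. The shapes, layer sizes, and reward definition below are illustrative assumptions.

```python
import torch
import torch.nn as nn

K_LAYERS = 3          # assumed: number of HEC layers / DNN models
CONTEXT_DIM = 16      # assumed: dimensionality of the context features

class SelectionPolicy(nn.Module):
    def __init__(self, context_dim: int = CONTEXT_DIM, k: int = K_LAYERS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(context_dim, 32), nn.ReLU(),
            nn.Linear(32, k),
        )

    def forward(self, context: torch.Tensor) -> torch.Tensor:
        # Probability of sending the input to each HEC layer
        return torch.softmax(self.net(context), dim=-1)

policy = SelectionPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

def bandit_step(context: torch.Tensor, reward_fn) -> None:
    """One contextual-bandit update: sample a layer, observe a scalar reward
    (e.g., detection accuracy minus a delay penalty), apply a REINFORCE update."""
    probs = policy(context)
    dist = torch.distributions.Categorical(probs)
    action = dist.sample()                 # which HEC layer runs the detector
    reward = reward_fn(int(action))        # assumed: reward measured externally
    loss = -dist.log_prob(action) * reward
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Example usage with a dummy reward that favours the cheapest adequate model.
bandit_step(torch.randn(CONTEXT_DIM), reward_fn=lambda a: 1.0 - 0.2 * a)
```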


2022 ◽  
Vol 3 (1) ◽  
pp. 1-24
Author(s):  
Sizhe An ◽  
Yigit Tuncel ◽  
Toygun Basaklar ◽  
Gokul K. Krishnakumar ◽  
Ganapati Bhat ◽  
...  

Movement disorders, such as Parkinson’s disease, affect more than 10 million people worldwide. Gait analysis is a critical step in the diagnosis and rehabilitation of these disorders. Specifically, step and stride lengths provide valuable insights into the gait quality and rehabilitation process. However, traditional approaches for estimating step length are not suitable for continuous daily monitoring since they rely on special mats and clinical environments. To address this limitation, this article presents a novel and practical step-length estimation technique using low-power wearable bend and inertial sensors. Experimental results show that the proposed model estimates step length with 5.49% mean absolute percentage error and provides accurate real-time feedback to the user.
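
A minimal sketch of the reported error metric: mean absolute percentage error (MAPE) between estimated step lengths and reference measurements (e.g., from an instrumented mat). The step-length values below are illustrative, not the paper's data.

```python
def mape(estimated, reference):
    """Mean absolute percentage error in percent."""
    return 100.0 * sum(abs(e - r) / r for e, r in zip(estimated, reference)) / len(reference)

# Three illustrative steps, lengths in metres
print(mape([0.62, 0.70, 0.66], [0.65, 0.68, 0.64]))
```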


2022 ◽  
Vol 3 (1) ◽  
pp. 1-19
Author(s):  
Feng Lu ◽  
Wei Li ◽  
Song Lin ◽  
Chengwangli Peng ◽  
Zhiyong Wang ◽  
...  

Wireless capsule endoscopy is a modern non-invasive Internet of Medical Imaging Things technology that has been increasingly used in gastrointestinal tract examination. With about one gigabyte of image data generated per patient in each examination, automatic lesion detection is highly desirable to improve the efficiency of the diagnosis process and mitigate human errors. Although many approaches for lesion detection have been proposed, they mainly focus on large lesions and are not directly applicable to tiny lesions because of the limitations of their feature representations. As bleeding lesions are a common symptom of most serious gastrointestinal diseases, detecting tiny bleeding lesions is extremely important for early diagnosis, which is highly relevant to the survival, treatment, and expenses of patients. In this article, a method is proposed to extract and fuse multi-scale deep features for detecting and locating both large and tiny lesions. A feature extraction network is first used as the backbone to extract basic features from wireless capsule endoscopy images; at each layer, multiple regions are identified as potential lesions. The feature maps of those potential lesions are then fused in a top-down manner into the fully connected layer to produce the final detection results. The proposed method has been evaluated on a clinical dataset of 20,000 wireless capsule endoscopy images with clinical annotations. Experimental results demonstrate that the method achieves 98.9% prediction accuracy and a 93.5% score, an improvement of up to 31.69% and 22.12% in recall rate and score, respectively, over state-of-the-art approaches for both large and tiny bleeding lesions. Moreover, the model also achieves the highest AP and the best medical diagnosis performance compared to state-of-the-art multi-scale models.
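
A minimal sketch (an assumed architecture, not the paper's exact network) of fusing multi-scale backbone feature maps in a top-down manner, so that tiny bleeding lesions visible only at fine scales still contribute to the final prediction. Channel counts and map sizes are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopDownFusion(nn.Module):
    def __init__(self, in_channels=(256, 512, 1024), fused_channels=256):
        super().__init__()
        # 1x1 convolutions project each backbone level to a common channel width
        self.lateral = nn.ModuleList(nn.Conv2d(c, fused_channels, 1) for c in in_channels)

    def forward(self, feats):
        # feats: feature maps from fine (large) to coarse (small) resolution
        laterals = [l(f) for l, f in zip(self.lateral, feats)]
        fused = laterals[-1]
        outputs = [fused]
        for lat in reversed(laterals[:-1]):
            # upsample the coarser map and add it to the finer lateral map
            fused = lat + F.interpolate(fused, size=lat.shape[-2:], mode="nearest")
            outputs.append(fused)
        return outputs[::-1]   # fused maps, fine to coarse, for region proposals

# Dummy feature maps standing in for backbone outputs on a capsule-endoscopy image
maps = [torch.randn(1, 256, 64, 64), torch.randn(1, 512, 32, 32), torch.randn(1, 1024, 16, 16)]
fused = TopDownFusion()(maps)
```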


2022 ◽  
Vol 3 (1) ◽  
pp. 1-26
Author(s):  
Omid Hajihassani ◽  
Omid Ardakanian ◽  
Hamzeh Khazaei

The abundance of data collected by sensors in Internet of Things devices and the success of deep neural networks in uncovering hidden patterns in time series data have led to mounting privacy concerns. This is because private and sensitive information can be potentially learned from sensor data by applications that have access to this data. In this article, we aim to examine the tradeoff between utility and privacy loss by learning low-dimensional representations that are useful for data obfuscation. We propose deterministic and probabilistic transformations in the latent space of a variational autoencoder to synthesize time series data such that intrusive inferences are prevented while desired inferences can still be made with sufficient accuracy. In the deterministic case, we use a linear transformation to move the representation of input data in the latent space such that the reconstructed data is likely to have the same public attribute but a different private attribute than the original input data. In the probabilistic case, we apply the linear transformation to the latent representation of input data with some probability. We compare our technique with autoencoder-based anonymization techniques and additionally show that it can anonymize data in real time on resource-constrained edge devices.
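
A minimal sketch of the obfuscation idea under stated assumptions: given a trained encoder/decoder pair (not shown), a fixed linear shift moves a sample's latent code toward a different private-attribute region; in the probabilistic variant the shift is applied only with probability p. The direction vector and scaling below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 8
# assumed: a direction separating private-attribute classes in latent space,
# e.g., the difference of per-class latent means computed on training data
direction = rng.normal(size=LATENT_DIM)
direction /= np.linalg.norm(direction)

def obfuscate(z: np.ndarray, alpha: float = 1.5, p: float = 1.0) -> np.ndarray:
    """Shift latent code z along `direction`; p=1.0 gives the deterministic variant."""
    if rng.random() < p:
        return z + alpha * direction
    return z

z = rng.normal(size=LATENT_DIM)     # latent code of one time-series window
z_private = obfuscate(z, p=0.7)     # probabilistic variant
# decoder(z_private) would then reconstruct an obfuscated time series
```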


2022 ◽  
Vol 3 (1) ◽  
pp. 1-30
Author(s):  
Nisha Panwar ◽  
Shantanu Sharma ◽  
Guoxi Wang ◽  
Sharad Mehrotra ◽  
Nalini Venkatasubramanian ◽  
...  

Contemporary IoT environments, such as smart buildings, require end-users to trust data-capturing rules published by the systems. There are several reasons why such trust may be misplaced: IoT systems may violate the rules deliberately, or IoT devices may transfer user data to a malicious third party due to cyberattacks, leading to a loss of individual privacy or service integrity. To address such concerns, we propose IoT Notary, a framework to ensure trust in IoT systems and applications. IoT Notary provides secure log sealing on live sensor data to produce a verifiable "proof-of-integrity," based on which a verifier can attest that captured sensor data adhere to the published data-capturing rules. IoT Notary is an integral part of TIPPERS, a smart space system deployed at the University of California, Irvine to provide various real-time location-based services on campus. We present extensive experiments over real-time WiFi connectivity data to evaluate IoT Notary, and the results show that it imposes nominal overheads: the secure logs take only 21% more storage, and users can verify one day's worth of data in less than 2 s even on a resource-limited device.
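
A minimal sketch (an assumption for illustration, not IoT Notary's actual protocol) of sealing a sensor-event log with a hash chain so that a verifier can detect any tampering with or reordering of captured events.

```python
import hashlib
import json

def seal(events, prev_digest: str = "0" * 64):
    """Chain each event to the digest of everything before it; return sealed entries."""
    sealed = []
    for event in events:
        payload = prev_digest + json.dumps(event, sort_keys=True)
        digest = hashlib.sha256(payload.encode()).hexdigest()
        sealed.append({"event": event, "digest": digest})
        prev_digest = digest
    return sealed

def verify(sealed, prev_digest: str = "0" * 64) -> bool:
    """Recompute the chain and compare digests; any modified entry breaks the chain."""
    for entry in sealed:
        payload = prev_digest + json.dumps(entry["event"], sort_keys=True)
        prev_digest = hashlib.sha256(payload.encode()).hexdigest()
        if prev_digest != entry["digest"]:
            return False
    return True

# Illustrative WiFi-connectivity events (field names are assumptions)
log = seal([{"device": "ap-12", "user": "u1", "ts": 1}, {"device": "ap-12", "user": "u2", "ts": 2}])
assert verify(log)
```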


2022 ◽  
Vol 3 (1) ◽  
pp. 1-18
Author(s):  
Anna Lito Michala ◽  
Ioannis Vourganas ◽  
Andrea Coraddu

IoT and the Cloud are among the most disruptive changes in the way we use data today. However, these changes have not significantly influenced condition-monitoring practices in shipping, partly because of the cost of continuous data transmission. Several vessels are already equipped with a network of sensors, yet continuous monitoring is often not utilised and onshore visibility is obscured. Edge computing is a promising solution, but sustaining the accuracy required for predictive maintenance remains a challenge. We investigate the use of IoT systems and Edge computing, evaluating the impact of the proposed solution on the decision-making process. Data from a sensor and the NASA-IMS open repository were used to show the effectiveness of the proposed system and to evaluate it in a realistic maritime application. The results demonstrate a real-time, dynamic, and intelligent reduction of the transmitted data volume without sacrificing specificity or sensitivity in decision making. The output of the Decision Support System fully corresponds both to the monitored system's actual operating condition and to the output obtained when the raw data are used instead. The results demonstrate that the proposed, more efficient approach is just as effective for the decision-making process.
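
A minimal sketch of the general edge-reduction idea under stated assumptions (the threshold, heartbeat interval, and single-indicator decision rule are illustrative, not the paper's method): the edge node evaluates the condition locally and transmits only state changes plus a periodic heartbeat, rather than streaming raw sensor samples ashore.

```python
def edge_filter(samples, threshold=0.8, heartbeat_every=1000):
    """Yield only readings that change the monitored condition or keep the link alive."""
    last_state, since_last_tx = None, 0
    for i, x in enumerate(samples):
        state = "alarm" if x > threshold else "normal"
        since_last_tx += 1
        if state != last_state or since_last_tx >= heartbeat_every:
            yield (i, state, x)          # transmitted to shore
            last_state, since_last_tx = state, 0

transmitted = list(edge_filter([0.1, 0.2, 0.95, 0.97, 0.3]))
print(transmitted)   # only the state transitions are sent
```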


2022 ◽  
Vol 3 (1) ◽  
pp. 1-30
Author(s):  
Ajay Krishna ◽  
Michel Le Pallec ◽  
Radu Mateescu ◽  
Gwen Salaün

Consumer Internet of Things (IoT) applications are largely built through end-user programming in the form of event-action rules. Although end-user tools help simplify the building of IoT applications to a large extent, there are still challenges in developing expressive applications in a simple yet correct fashion. In this context, we propose a formal development framework based on the Web of Things specification. An application is defined using a composition language that allows users to compose the basic event-action rules to express complex scenarios. It is transformed into a formal specification that serves as the input for formal analysis, where the application is checked for functional and quantitative properties at design time using model checking techniques. Once the application is validated, it can be deployed and the rules are executed following the composition language semantics. We have implemented these proposals in a tool built on top of the Mozilla WebThings platform. The steps from design to deployment were validated on real-world applications.
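
A minimal sketch (illustrative only, not the paper's composition language or its semantics) of composing basic event-action rules into a larger application that could then be analysed and executed; the rule names and payload fields are assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    event: str                       # e.g. "motion_detected"
    action: Callable[[Dict], None]   # e.g. switch on a WebThings lamp

class Composition:
    """Sequential composition: every rule matching an event fires in declaration order."""
    def __init__(self, rules: List[Rule]):
        self.rules = rules

    def handle(self, event: str, payload: Dict) -> None:
        for rule in self.rules:
            if rule.event == event:
                rule.action(payload)

app = Composition([
    Rule("motion_detected", lambda p: print("lamp on in", p["room"])),
    Rule("door_opened",     lambda p: print("notify owner:", p["room"])),
])
app.handle("motion_detected", {"room": "kitchen"})
```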


2022 ◽  
Vol 3 (1) ◽  
pp. 1-31
Author(s):  
Roman Trüb ◽  
Reto Da Forno ◽  
Lukas Daschinger ◽  
Andreas Biri ◽  
Jan Beutel ◽  
...  

Testbeds for wireless IoT devices facilitate testing and validation of distributed target nodes. A testbed usually provides methods to control, observe, and log the execution of the software. However, most of the methods used for tracing the execution require code instrumentation and change essential properties of the observed system. Non-intrusive methods are typically not applicable in a distributed fashion due to a lack of time synchronization or of the necessary hardware/software support. In this article, we present a tracing system for validating time-critical software running on multiple distributed wireless devices that does not require code instrumentation, is non-intrusive, and is designed to trace the distributed state of an entire network. For this purpose, we make use of the on-chip debug and trace hardware that is part of most modern microcontrollers. We introduce a testbed architecture as well as models and methods that accurately synchronize the timestamps of observations collected by distributed observers. In a case study, we demonstrate how the tracing system can be applied to observe the distributed state of a flooding-based low-power communication protocol for wireless sensor networks. The presented non-intrusive tracing system is implemented as a service of the publicly accessible open-source FlockLab 2 testbed.
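
A minimal sketch of one way such timestamp synchronization can work, assuming each observer periodically records pairs of (local clock, reference clock) readings from a shared sync signal: a least-squares line models the observer's offset and drift, and trace timestamps are then mapped onto the common time base. The numbers below are illustrative, and this is not necessarily the paper's exact model.

```python
import numpy as np

local = np.array([0.0, 10.0, 20.0, 30.0])              # seconds on the observer's clock
reference = np.array([1.002, 11.001, 21.003, 31.004])  # matching reference times

drift, offset = np.polyfit(local, reference, deg=1)    # reference ≈ drift * local + offset

def to_reference(t_local: float) -> float:
    """Translate a trace timestamp recorded locally into the common time base."""
    return drift * t_local + offset

print(to_reference(25.0))
```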


2021 ◽  
Vol 2 (4) ◽  
pp. 1-26
Author(s):  
Bo Wei ◽  
Kai Li ◽  
Chengwen Luo ◽  
Weitao Xu ◽  
Jin Zhang ◽  
...  

Device-free context awareness is important to many applications. There are two broadly used approaches: video-based and radio-based. Video-based approaches can deliver good performance, but privacy is a serious concern. Radio-based context awareness has instead drawn researchers' attention, because it does not violate privacy and radio signals can penetrate obstacles. Existing works design explicit methods for each radio-based application, and they use an additional step to extract features before classification, exploiting deep learning only as a classification tool. Although this feature-extraction step helps expose patterns in the raw signals, it introduces unnecessary noise and information loss; raw CSI signals without initial processing were, however, previously considered to contain no usable patterns. In this article, we are the first to propose a deep learning-based general framework that handles both signal processing and classification. The key novelty of this article is that the framework generalises to radio-based context-awareness applications using raw CSI, eliminating the extra work of extracting features from raw radio signals. We conduct extensive evaluations to show the superior performance of our proposed method and its generalisation.
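
A minimal sketch of the end-to-end idea under stated assumptions: a network that consumes raw CSI windows directly, with no hand-crafted feature extraction, and learns both the signal processing and the classification. The dimensions and layer sizes are illustrative, not the paper's architecture.

```python
import torch
import torch.nn as nn

N_SUBCARRIERS, WINDOW_LEN, N_CLASSES = 30, 256, 4   # assumed dimensions

model = nn.Sequential(
    nn.Conv1d(N_SUBCARRIERS, 64, kernel_size=7, padding=3),  # learned "signal processing"
    nn.ReLU(),
    nn.Conv1d(64, 64, kernel_size=5, padding=2),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
    nn.Linear(64, N_CLASSES),                                 # activity / context classes
)

raw_csi = torch.randn(8, N_SUBCARRIERS, WINDOW_LEN)  # a batch of raw CSI windows
logits = model(raw_csi)                              # shape (8, N_CLASSES)
```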


2021 ◽  
Vol 2 (4) ◽  
pp. 1-23
Author(s):  
Raed Abdel Sater ◽  
A. Ben Hamza

Internet of Things (IoT) sensors in smart buildings are becoming increasingly ubiquitous, making buildings more livable, energy efficient, and sustainable. These devices sense the environment and generate multivariate temporal data of paramount importance for detecting anomalies and improving the prediction of energy usage in smart buildings. However, detecting these anomalies in centralized systems is often plagued by large delays in response time. To overcome this issue, we formulate the anomaly detection problem in a federated learning setting by leveraging the multi-task learning paradigm, which aims at solving multiple tasks simultaneously while taking advantage of the similarities and differences across tasks. We propose a novel privacy-by-design federated learning model using a stacked long short-term memory (LSTM) model, and we demonstrate that it converges more than twice as fast during training as the centralized LSTM. The effectiveness of our federated learning approach is demonstrated on three real-world datasets generated by the IoT production system at a General Electric Current smart building, achieving state-of-the-art performance compared to baseline methods in both classification and regression tasks. Our experimental results demonstrate the effectiveness of the proposed framework in reducing the overall training cost without compromising prediction performance.
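
A minimal sketch (an assumed, simplified setup, not the authors' implementation) of the federated arrangement: each building or client trains a stacked LSTM on its own sensor streams, and a server averages the resulting parameters, federated-averaging style, so no raw data leaves the site. Hyper-parameters and data shapes are illustrative.

```python
import copy
import torch
import torch.nn as nn

def make_model(n_features=8, hidden=64):
    # "stacked" LSTM: two recurrent layers
    return nn.LSTM(input_size=n_features, hidden_size=hidden, num_layers=2, batch_first=True)

def local_update(model, data, targets, epochs=1, lr=1e-3):
    """One client's local training pass on its private multivariate time series."""
    head = nn.Linear(64, targets.shape[-1])      # regression head on the last hidden state
    opt = torch.optim.Adam(list(model.parameters()) + list(head.parameters()), lr=lr)
    for _ in range(epochs):
        out, _ = model(data)
        loss = nn.functional.mse_loss(head(out[:, -1]), targets)
        opt.zero_grad(); loss.backward(); opt.step()
    return model.state_dict()

def fed_avg(states):
    """Element-wise average of the clients' parameter dictionaries."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in states]).mean(dim=0)
    return avg

global_model = make_model()
clients = [(torch.randn(32, 20, 8), torch.randn(32, 1)) for _ in range(3)]  # dummy client data
states = [local_update(copy.deepcopy(global_model), x, y) for x, y in clients]
global_model.load_state_dict(fed_avg(states))   # one federated round
```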

