DTA-PUF: Dynamic Timing-Aware Physical Unclonable Function for Resource-Constrained Devices

2021 ◽ Vol 17 (3) ◽ pp. 1-24
Author(s): Ioannis Tsiokanos, Jack Miskelly, Chongyan Gu, Máire O'Neill, Georgios Karakonstantis

In recent years, physical unclonable functions (PUFs) have gained considerable attention as mechanisms for hardware-rooted device authentication. While the majority of previously proposed PUFs derive entropy from dedicated circuitry, software PUFs derive it from circuitry that already exists in a system. Such software-derived designs are highly desirable for low-power embedded systems, as they require no hardware overhead. However, existing software PUFs incur considerable processing overheads that hinder their adoption in resource-constrained devices. In this article, we propose DTA-PUF, a novel software PUF design that exploits the instruction- and data-dependent dynamic timing behaviour of pipelined cores to provide a reliable challenge-response mechanism without requiring any extra hardware. DTA-PUF accepts sequences of instructions as an input challenge and produces an output response based on the timing errors that manifest under specific over-clocked settings. To lower the required processing effort, we systematically select instruction sequences that maximise the error rate. Application to a post-layout pipelined floating-point unit implemented in 45 nm process technology demonstrates the effectiveness and practicality of our PUF design. Finally, DTA-PUF requires up to 50× fewer instructions than existing software processor PUF designs, limiting processing costs and yielding up to 26% power savings.
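To make the challenge-response idea concrete, here is a minimal, purely illustrative Python sketch. It models each device's data-dependent path delays as a lookup table; the names (timing_error, dta_puf_response, authenticate) and the delay figures are hypothetical, since the actual design measures timing errors of a post-layout pipelined FPU in silicon rather than in software.

# Hypothetical software model of the DTA-PUF challenge-response flow.
def timing_error(device_profile, instruction, clock_period):
    # An instruction fails when its data-dependent path delay (unique
    # to each device) exceeds the over-clocked clock period.
    return device_profile[instruction] > clock_period

def dta_puf_response(device_profile, challenge, clock_period):
    # One response bit per challenge instruction: 1 = timing error manifested.
    return [1 if timing_error(device_profile, ins, clock_period) else 0
            for ins in challenge]

def authenticate(device_profile, challenge, enrolled, clock_period, tolerance=2):
    # Accept if the fresh response lies within a small Hamming distance
    # of the response recorded at enrolment.
    fresh = dta_puf_response(device_profile, challenge, clock_period)
    return sum(a != b for a, b in zip(fresh, enrolled)) <= tolerance

# Example: assumed per-device path delays (ns) for a challenge of FP
# instructions chosen to maximise the error rate at a 0.8 ns period.
profile = {"fmul f1,f2": 0.95, "fadd f3,f4": 0.70, "fdiv f5,f6": 1.10}
challenge = list(profile)
enrolled = dta_puf_response(profile, challenge, clock_period=0.8)
print(authenticate(profile, challenge, enrolled, clock_period=0.8))  # True

The tolerance on the Hamming distance reflects that PUF responses are inherently noisy; enrolment and verification responses only need to agree within a small margin.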

Informatica ◽ 2017 ◽ Vol 28 (1) ◽ pp. 193-214
Author(s): Tung-Tso Tsai, Sen-Shan Huang, Yuh-Min Tseng

2021 ◽ Vol 5 (4) ◽ pp. 1-28
Author(s): Chia-Heng Tu, Qihui Sun, Hsiao-Hsuan Chang

Monitoring environmental conditions is an important application of cyber-physical systems. Typically, such monitoring perceives the surrounding environment with battery-powered, tiny devices deployed in the field. While deep learning-based methods, especially convolutional neural networks (CNNs), are promising approaches to enriching the functionality offered by these tiny devices, they demand more computation and memory resources, which makes them difficult to adopt on such devices. In this article, we develop a software framework, RAP, that permits the construction of CNN designs by aggregating existing, lightweight CNN layers that fit in the limited memory (e.g., several KBs of SRAM) of resource-constrained devices while satisfying application-specific timing constraints. RAP leverages the Python-based neural network framework Chainer to build CNNs by mounting C/C++ implementations of the lightweight layers, trains the built CNN models through Chainer's ordinary model-training procedure, and generates C code for the trained models. The generated programs are compiled into target machine executables for on-device inference. With the vigorous development of lightweight CNNs, such as binarized neural networks with binary weights and activations, RAP facilitates the model-building process for resource-constrained devices by allowing developers to alter, debug, and evaluate CNN designs over the C/C++ implementations of the lightweight CNN layers. We have prototyped the RAP framework and built two environmental monitoring applications for protecting endangered species, using image- and acoustic-based monitoring methods. Our results show that the built model consumes less than 0.5 KB of SRAM for buffering the runtime data required by model inference, while achieving up to 93% accuracy for acoustic monitoring with less than one second of inference time on the TI 16-bit microcontroller platform.
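As a rough illustration of the layer-aggregation workflow, the following sketch builds a small CNN with Chainer's standard API. RAP would mount its C/C++-backed lightweight layers (e.g., binarized convolutions) in place of the stock links used here; its code generator and actual layer set are not shown, and TinyMonitorCNN and the layer sizes are invented for this example.

import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L

class TinyMonitorCNN(chainer.Chain):
    # A small classifier sized with a few-KB SRAM budget in mind
    # (illustrative only; RAP selects layers against the real budget).
    def __init__(self, n_classes=2):
        super().__init__()
        with self.init_scope():
            self.conv1 = L.Convolution2D(1, 4, ksize=3, pad=1)
            self.conv2 = L.Convolution2D(4, 8, ksize=3, pad=1)
            self.fc = L.Linear(None, n_classes)  # lazily sized on first call

    def forward(self, x):
        h = F.max_pooling_2d(F.relu(self.conv1(x)), ksize=2)
        h = F.max_pooling_2d(F.relu(self.conv2(h)), ksize=2)
        return self.fc(h)

model = TinyMonitorCNN(n_classes=2)
y = model(np.zeros((1, 1, 32, 32), dtype=np.float32))  # dummy forward pass

Training then proceeds as usual in Chainer (e.g., wrapping the model in L.Classifier and driving it with a Trainer); afterwards the trained weights would be exported as C code and compiled into the on-device executable, as described above.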


2021
Author(s): Sandra Hernandez, Jose Araujo, Patric Jensfelt, Ioannis Karagiannis, Ananya Muddukrishna, ...
