Author(s):  
Ahmed Imteaj ◽  
M. Hadi Amini

Federated Learning (FL) is a recently introduced distributed machine learning technique that allows available network clients to perform model training at the edge, rather than sharing their data with a centralized server. Unlike conventional distributed machine learning approaches, the hallmark feature of FL is that local computation and model generation happen on the client side, ultimately protecting sensitive information. Most existing FL approaches assume that each FL client has sufficient computational resources and can accomplish a given task without facing any resource-related issues. However, in a heterogeneous Internet of Things (IoT) environment, a major portion of the FL clients may face low resource availability (e.g., lower computational power, limited bandwidth, and battery life). Consequently, resource-constrained FL clients may respond very slowly, or may be unable to execute the expected number of local iterations. Further, any FL client can inject an inappropriate model during a training phase, which can prolong convergence time and waste the resources of all network clients. In this paper, we propose a novel tri-layer FL scheme, Federated Proximal, Activity and Resource-Aware Lightweight model (FedPARL), that reduces model size by performing sample-based pruning, avoids misbehaved clients by examining their trust scores, and allows partial amounts of work based on resource availability. The pruning mechanism is particularly useful when dealing with resource-constrained FL-based IoT (FL-IoT) clients, since the resulting lightweight training model consumes fewer resources to reach a target convergence. We evaluate each interested client's resource availability before assigning a task, monitor their activities, and update their trust scores based on their previous performance.
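A minimal sketch of how the trust- and resource-aware client selection and magnitude-based pruning described above might look. The function names, thresholds, and the exponential-moving-average trust update are illustrative assumptions for this sketch, not the paper's actual implementation:

```python
import numpy as np

def prune_weights(weights, sparsity=0.5):
    """Magnitude pruning: zero out the smallest-magnitude fraction of weights."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def update_trust(trust, completed_fraction, alpha=0.3):
    """Exponential moving average over how much assigned work a client completed."""
    return (1 - alpha) * trust + alpha * completed_fraction

def select_clients(clients, min_trust=0.5, min_resource=0.2):
    """Keep only clients whose trust score and resource availability pass thresholds."""
    return [c for c in clients
            if c["trust"] >= min_trust and c["resource"] >= min_resource]
```

Under this scheme, a client that repeatedly fails to finish its assigned work sees its trust score decay and eventually drops below the selection threshold, while pruning shrinks the model each selected client must train.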
To tackle system and statistical heterogeneity, we adopt a re-parameterization and generalization of the current state-of-the-art Federated Averaging (FedAvg) algorithm. This modification of FedAvg allows clients to perform variable or partial amounts of work in accordance with their resource constraints. We demonstrate that coupling pruning, resource and activity awareness, and the re-parameterization of FedAvg leads to more robust convergence of FL in an IoT environment.
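The proximal re-parameterization can be illustrated with a toy sketch: each client minimizes its local objective plus a proximal term (mu/2)·||w − w_global||², so a straggler that runs fewer local epochs still returns a usable, anchored update. The function names and hyperparameters below are illustrative assumptions, not the paper's code:

```python
import numpy as np

def local_update(w_global, grad_fn, lr=0.1, mu=0.01, epochs=5):
    """Inexact local solve with proximal term (mu/2)*||w - w_global||^2.
    A resource-constrained client may run fewer epochs (partial work)."""
    w = w_global.copy()
    for _ in range(epochs):
        # Gradient of local loss plus gradient of the proximal term.
        w -= lr * (grad_fn(w) + mu * (w - w_global))
    return w

def aggregate(client_updates, client_weights):
    """FedAvg-style weighted average of the returned client models."""
    weights = np.asarray(client_weights, dtype=float)
    weights /= weights.sum()
    return sum(wk * u for wk, u in zip(weights, client_updates))
```

The proximal term keeps each local solution near the current global model, which is what makes aggregating a mix of full and partial updates well behaved under statistical heterogeneity.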

