Towards privacy-preserving classification in neural networks

Author(s):  
Mehmood Baryalai ◽  
Julian Jang-Jaccard ◽  
Dongxi Liu
Electronics ◽  
2021 ◽  
Vol 10 (11) ◽  
pp. 1367
Author(s):  
Raghida El Saj ◽  
Ehsan Sedgh Gooya ◽  
Ayman Alfalou ◽  
Mohamad Khalil

Privacy-preserving deep neural networks have become essential and have attracted the attention of many researchers due to the need to maintain the privacy and confidentiality of personal and sensitive data. The importance of privacy-preserving networks has increased with the widespread use of neural networks as a service in unsecured cloud environments. Different methods have been proposed and developed to solve the privacy-preserving problem using deep neural networks on encrypted data. In this article, we review some of the most relevant and well-known computational and perceptual image encryption methods. These methods and their results are presented and compared, and we discuss the conditions of their use as well as the durability and robustness of some of them against attacks. Some of the mentioned methods have demonstrated an ability to hide information and make it difficult for adversaries to retrieve it while maintaining high classification accuracy. Based on the obtained results, we suggest developing and using some of the cited privacy-preserving methods in applications beyond classification.
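To make the idea of perceptual image encryption concrete, here is a minimal sketch of one family of methods the review covers: key-based block-wise pixel shuffling. The function name, key handling, block size, and the (H, W, C) input layout are illustrative assumptions for this sketch, not the scheme of any specific cited paper.

```python
import numpy as np

def shuffle_blocks(image, key, block=4):
    """Illustrative block-wise pixel shuffling: pixels inside each
    block x block patch are permuted with a key-derived permutation,
    hiding visual content while keeping block-level structure that a
    classifier can still learn from. `image` is an (H, W, C) uint8
    array; all names and parameters here are assumptions."""
    rng = np.random.default_rng(key)        # key-seeded PRNG
    perm = rng.permutation(block * block)   # same secret permutation for every block
    h, w = image.shape[:2]
    out = image.copy()
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            patch = out[i:i + block, j:j + block].reshape(block * block, -1)
            out[i:i + block, j:j + block] = patch[perm].reshape(block, block, -1)
    return out

# A model is then trained and served directly on encrypted inputs:
img = np.random.randint(0, 256, size=(32, 32, 3), dtype=np.uint8)
enc = shuffle_blocks(img, key=2024)
```

Decryption is simply the inverse permutation derived from the same key, which is why schemes of this kind can hide visual content from an adversary without the key while leaving enough structure for a network to classify accurately.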


2021 ◽  
Author(s):  
Sisong Ru ◽  
Bingbing Zhang ◽  
Yixin Jie ◽  
Chi Zhang ◽  
Lingbo Wei ◽  
...  

2020 ◽  
Vol 7 (4) ◽  
pp. 2663-2678 ◽  
Author(s):  
Jialu Chen ◽  
Jun Zhou ◽  
Zhenfu Cao ◽  
Athanasios V. Vasilakos ◽  
Xiaolei Dong ◽  
...  

2021 ◽  
Vol 2022 (1) ◽  
pp. 291-316
Author(s):  
Théo Ryffel ◽  
Pierre Tholoniat ◽  
David Pointcheval ◽  
Francis Bach

Abstract We propose AriaNN, a low-interaction privacy-preserving framework for private neural network training and inference on sensitive data. Our semi-honest 2-party computation protocol (with a trusted dealer) leverages function secret sharing, a recent lightweight cryptographic protocol that allows us to achieve an efficient online phase. We design optimized primitives for the building blocks of neural networks such as ReLU, MaxPool, and BatchNorm. For instance, we perform private comparison for ReLU operations with a single message of the size of the input during the online phase, and with preprocessing keys close to 4× smaller than previous work. Lastly, we propose an extension to support n-party private federated learning. We implement our framework as an extensible system on top of PyTorch that leverages CPU and GPU hardware acceleration for cryptographic and machine learning operations. We evaluate our end-to-end system for private inference between distant servers on standard neural networks such as AlexNet, VGG16, or ResNet18, and for private training on smaller networks like LeNet. We show that computation rather than communication is the main bottleneck and that using GPUs together with reduced key size is a promising solution to overcome this barrier.
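The abstract's setting, semi-honest 2-party computation with a trusted dealer and an offline/online split, can be illustrated with a much simpler building block than function secret sharing: additive secret sharing with Beaver-triple multiplication. This is a toy sketch under those assumptions, not AriaNN's FSS-based comparison protocol; all names and the ring size are illustrative.

```python
import random

MOD = 1 << 32  # shares live in the ring Z_{2^32} (illustrative choice)

def share(x):
    """Additively split x into two shares with x = (x0 + x1) mod MOD;
    each share alone is a uniformly random ring element."""
    x0 = random.randrange(MOD)
    return x0, (x - x0) % MOD

def reconstruct(x0, x1):
    return (x0 + x1) % MOD

def beaver_triple():
    """Trusted-dealer preprocessing ('offline phase'): random a, b
    and c = a * b, handed to the two parties in shares."""
    a, b = random.randrange(MOD), random.randrange(MOD)
    return share(a), share(b), share((a * b) % MOD)

def private_mul(x_sh, y_sh, triple):
    """Online phase of a Beaver multiplication on shares. The parties
    only ever open the masked values e = x - a and f = y - b, which
    reveal nothing about x or y."""
    (a0, a1), (b0, b1), (c0, c1) = triple
    e = (x_sh[0] - a0 + x_sh[1] - a1) % MOD  # opened by exchanging local masks
    f = (y_sh[0] - b0 + y_sh[1] - b1) % MOD
    z0 = (c0 + e * b0 + f * a0) % MOD          # party 0's share of x*y
    z1 = (c1 + e * b1 + f * a1 + e * f) % MOD  # party 1's share of x*y
    return z0, z1

# Sanity check: 7 * 6 recovered from shares, with neither party seeing 7 or 6.
z_sh = private_mul(share(7), share(6), beaver_triple())
assert reconstruct(*z_sh) == 42
```

Per the abstract, AriaNN's contribution is to replace this kind of online interaction for nonlinear operations with function-secret-sharing keys, so that a private comparison for ReLU needs only a single online message of the size of the input.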

