Functional encryption with application to machine learning: simple conversions from generic functions to quadratic functions

2020 ◽  
Vol 13 (6) ◽  
pp. 2334-2341 ◽  
Author(s):  
Huige Wang ◽  
Kefei Chen ◽  
Yuan Zhang ◽  
Yunlei Zhao

2021 ◽  
Vol 2021 (1) ◽  
pp. 21-42
Author(s):  
Miguel Ambrona ◽  
Dario Fiore ◽  
Claudio Soriente

Abstract: In a Functional Encryption scheme (FE), a trusted authority enables designated parties to compute specific functions over encrypted data. As such, FE promises to break the tension between industrial interest in the potential of data mining and user concerns around the use of private data. FE allows the authority to decide who can compute and what can be computed, but it does not allow the authority to control which ciphertexts can be mined. This issue was recently addressed by Naveed et al., who introduced so-called Controlled Functional Encryption (or C-FE), a cryptographic framework that extends FE and allows the authority to exert fine-grained control on the ciphertexts being mined. In this work we extend C-FE in several directions. First, we distribute the role of (and the trust in) the authority across several parties by defining multi-authority C-FE (or mCFE). Next, we provide an efficient instantiation that enables computation of quadratic functions on inputs provided by multiple data-owners, whereas previous work only provides an instantiation for linear functions over data supplied by a single data-owner and resorts to garbled circuits for more complex functions. Our scheme leverages CCA2 encryption and linearly-homomorphic encryption. We also implement a prototype and use it to showcase the potential of our instantiation.
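To make the linearly-homomorphic building block concrete, here is a toy sketch (our illustration, not the paper's construction) of Paillier encryption, whose additive homomorphism lets anyone combine ciphertexts of inputs x_i into a ciphertext of a weighted sum Σ w_i·x_i without seeing the plaintexts. The primes are tiny and the scheme as written is insecure; it only demonstrates the mechanism.

```python
import random
from math import gcd

# Toy Paillier: additively homomorphic encryption. Parameters are tiny
# and NOT secure -- illustration of the homomorphism only.

def keygen(p=2357, q=2551):           # small primes, insecure
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
    mu = pow(lam, -1, n)              # valid because we fix g = n + 1
    return (n, n + 1), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    x = pow(c, lam, n * n)
    return ((x - 1) // n) * mu % n    # L(x) = (x - 1) / n, then unblind

pk, sk = keygen()
xs = [3, 5, 7]                        # data-owner inputs
ws = [2, 4, 1]                        # linear-function weights
cs = [encrypt(pk, x) for x in xs]

# Homomorphic evaluation: multiplying ciphertexts adds plaintexts, and
# raising a ciphertext to w scales its plaintext by w.
n2 = pk[0] * pk[0]
c_lin = 1
for c, w in zip(cs, ws):
    c_lin = (c_lin * pow(c, w, n2)) % n2

print(decrypt(pk, sk, c_lin))         # 2*3 + 4*5 + 1*7 = 33
```

Schemes supporting quadratic functions, as in the paper, need more than this additive structure, but the same "compute on ciphertexts, decrypt only the function value" pattern applies.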


Geophysics ◽  
2020 ◽  
Vol 85 (6) ◽  
pp. R477-R492 ◽  
Author(s):  
Bingbing Sun ◽  
Tariq Alkhalifah

Full-waveform inversion (FWI) is a nonlinear optimization problem, and a typical optimization algorithm such as the nonlinear conjugate gradient or limited-memory Broyden-Fletcher-Goldfarb-Shanno (LBFGS) would iteratively update the model mainly along the gradient-descent direction of the misfit function or a slight modification of it. Based on the concept of meta-learning, rather than using a hand-designed optimization algorithm, we have trained the machine (represented by a neural network) to learn an optimization algorithm, entitled the “ML-descent,” and apply it in FWI. Using a recurrent neural network (RNN), we use the gradient of the misfit function as the input, and the hidden states in the RNN incorporate the history information of the gradient similar to an LBFGS algorithm. However, unlike the fixed form of the LBFGS algorithm, the machine-learning (ML) version evolves in response to the gradient. The loss function for training is formulated as a weighted summation of the L2 norm of the data residuals in the original inverse problem. As with any well-defined nonlinear inverse problem, the optimization can be locally approximated by a linear convex problem; thus, to accelerate the training, we train the neural network by minimizing randomly generated quadratic functions instead of performing time-consuming FWIs. To further improve the accuracy and robustness, we use a variational autoencoder that projects and represents the model in latent space. We use the Marmousi and the overthrust examples to demonstrate that the ML-descent method shows faster convergence and outperforms conventional optimization algorithms. The energy in the deeper part of the models can be recovered by the ML-descent even when the pseudoinverse of the Hessian is not incorporated in the FWI update.
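The "train on random quadratics" idea can be sketched as follows (this is our assumed setup, not the authors' code): the training tasks are convex quadratics f(x) = ½‖Ax − b‖², whose gradient Aᵀ(Ax − b) is cheap to evaluate, standing in for the locally-convex FWI misfit. The baseline below is plain gradient descent; a learned (ML-descent) optimizer would replace the fixed step rule with an RNN mapping the gradient history to an update.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_quadratic(dim=5):
    """Random convex training task: f(x) = 0.5 * ||A x - b||^2."""
    A = rng.normal(size=(dim, dim))
    b = rng.normal(size=dim)
    return A, b

def run_gd(A, b, steps=200):
    """Hand-designed baseline optimizer (plain gradient descent).
    A learned optimizer would replace the fixed update rule with an
    RNN whose hidden state accumulates gradient history, LBFGS-like."""
    loss = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    # Step size below 1/L (L = largest Hessian eigenvalue) guarantees
    # monotone descent on a convex quadratic.
    lr = 0.9 / np.linalg.eigvalsh(A.T @ A).max()
    x = np.zeros(len(b))
    f_init = loss(x)
    for _ in range(steps):
        x = x - lr * (A.T @ (A @ x - b))   # gradient of f
    return f_init, loss(x)

results = [run_gd(*sample_quadratic()) for _ in range(10)]
print(all(f_final < f_init for f_init, f_final in results))  # True
```

Meta-training would then adjust the optimizer's own parameters to minimize the final (or trajectory-weighted) loss over many such sampled tasks, which is far cheaper than running full waveform simulations.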


Author(s):  
Tilen Marc ◽  
Miha Stopar ◽  
Jan Hartman ◽  
Manca Bizjak ◽  
Jolanda Modic

Author(s):  
Milad Bahadori ◽  
Kimmo Järvinen ◽  
Tilen Marc ◽  
Miha Stopar

Functional encryption is a new paradigm for encryption in which decryption does not reveal the entire plaintext but only some function of it. Functional encryption has great potential in privacy-enhancing technologies but suffers from excessive computational overheads. We introduce the first hardware accelerator that supports functional encryption for quadratic functions. Our accelerator is implemented on a reprogrammable system-on-chip following the hardware/software codesign methodology. We benchmark our implementation for two privacy-preserving machine learning applications: (1) classification of handwritten digits from the MNIST database and (2) classification of clothes images from the Fashion MNIST database. In both cases, classification is performed on encrypted images. We show that our implementation offers speedups of over 200 times compared to a published software implementation and enables applications that are infeasible with software-only solutions.
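To illustrate what "classification with encrypted images" computes, here is a plaintext sketch (our illustration, with made-up dimensions and random weights, not the paper's implementation) of the quadratic functionality: each class c is scored as xᵀQ_c x over the flattened image x, and under functional encryption a decryption key for Q_c reveals only that score, never x itself.

```python
import numpy as np

rng = np.random.default_rng(1)

n_classes, dim = 10, 16      # e.g. 10 digit classes, a toy 4x4 "image"
Q = rng.normal(size=(n_classes, dim, dim))  # assumed pre-trained models
x = rng.random(dim)          # pixel vector (encrypted in the real system)

# Per-class quadratic scores x^T Q_c x; this is exactly the function
# family the FE scheme (and hence the accelerator) evaluates.
scores = np.einsum('i,cij,j->c', x, Q, x)
prediction = int(np.argmax(scores))
print(prediction)
```

The accelerator's job is the expensive cryptographic counterpart of this loop, where each score is recovered from ciphertexts via pairing-based operations rather than computed directly on x.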


2020 ◽  
Vol 43 ◽  
Author(s):  
Myrthe Faber

Abstract: Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides variability in mental content and context necessary for abstraction.


2020 ◽  
Author(s):  
Man-Wai Mak ◽  
Jen-Tzung Chien
