Indistinguishability Obfuscation from Simple-to-State Hard Problems: New Assumptions, New Techniques, and Simplification

Author(s):  
Romain Gay ◽  
Aayush Jain ◽  
Huijia Lin ◽  
Amit Sahai
2020 ◽  
Vol 2020 ◽  
pp. 1-14
Author(s):  
Huige Wang ◽  
Kefei Chen ◽  
Tianyu Pan ◽  
Yunlei Zhao

Functional encryption (FE) enables fine-grained control over encrypted data by allowing users to compute only specified functions on the encrypted plaintext, using private keys issued for those functions. Many FE schemes have been proposed recently; nonetheless, most of them cannot resist chosen-ciphertext attacks (CCAs), especially in the secret-key setting. This changed with a generic transformation of public-key functional encryption (PK-FE) from chosen-plaintext (CPA) security to chosen-ciphertext (CCA) security, which requires the underlying scheme to have special properties such as restricted delegation or verifiability; however, no underlying schemes with these features have been found so far. Later, a CCA-secure functional encryption scheme from projective hash functions was proposed, but it applies only to inner-product functions. Constructing such schemes therefore requires nontrivial techniques. Our key contribution in this work is to propose CCA-secure functional encryption in both the public-key and secret-key settings. The existing generic transformation from (adaptively) simulation-based CPA-secure (SIM-CPA) schemes for deterministic functions to (adaptively) simulation-based CCA-secure (SIM-CCA) schemes for randomized functions does not say whether it can be applied directly to the CCA setting for deterministic functions. We give an affirmative answer and, with some modifications to that transformation, derive a SIM-CCA-secure scheme for deterministic functions. Building on this derived scheme, we also propose an (adaptively) indistinguishability-based CCA-secure (IND-CCA) secret-key FE for deterministic functions. Finally, our schemes can be instantiated both under nonstandard assumptions (e.g., hard problems on multilinear maps and indistinguishability obfuscation (IO)) and under standard assumptions (e.g., DDH, RSA, LWE, and LPN).
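
For readers unfamiliar with the FE syntax the abstract refers to, the following toy sketch (ours, not the authors' construction, and with no security whatsoever) shows the standard Setup/KeyGen/Enc/Dec interface and the correctness requirement Dec(sk_f, Enc(pk, x)) = f(x), with an inner-product function as the example.

```python
# Minimal sketch of the functional-encryption (FE) interface.
# Illustrative toy with NO security: it only shows the syntax
# Setup / KeyGen / Enc / Dec and the correctness requirement
# Dec(sk_f, Enc(pk, x)) = f(x).  Not the scheme from the paper.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class FunctionKey:
    f: Callable[[List[int]], int]   # the function this key decrypts to

@dataclass
class Ciphertext:
    x: List[int]                    # a real scheme would hide x

def setup():
    # Returns (public key, master secret key); placeholders here.
    return None, None

def keygen(msk, f: Callable[[List[int]], int]) -> FunctionKey:
    # The authority derives a key tied to a specific function f.
    return FunctionKey(f=f)

def encrypt(pk, x: List[int]) -> Ciphertext:
    return Ciphertext(x=x)

def decrypt(sk_f: FunctionKey, ct: Ciphertext) -> int:
    # Correctness: the decryptor learns f(x); in a real scheme it
    # learns nothing else about x (this toy obviously leaks everything).
    return sk_f.f(ct.x)

if __name__ == "__main__":
    pk, msk = setup()
    y = [1, 2, 3]
    sk_ip = keygen(msk, lambda x: sum(a * b for a, b in zip(x, y)))  # inner product with y
    ct = encrypt(pk, [4, 5, 6])
    print(decrypt(sk_ip, ct))  # 32 = 1*4 + 2*5 + 3*6
```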


Algorithms ◽  
2020 ◽  
Vol 13 (6) ◽  
pp. 146
Author(s):  
Andreas Emil Feldmann ◽  
Karthik C. S. ◽  
Euiwoong Lee ◽  
Pasin Manurangsi

Parameterization and approximation are two popular ways of coping with NP-hard problems. More recently, the two have also been combined to derive many interesting results. We survey developments in the area both from the algorithmic and hardness perspectives, with emphasis on new techniques and potential future research directions.
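
As a textbook illustration of what parameterization buys on an NP-hard problem (an example of ours, not a result from the survey), the classic bounded search tree decides Vertex Cover in time O(2^k · m), i.e., exponential only in the parameter k rather than in the input size:

```python
# Classic O(2^k) bounded-search-tree algorithm for Vertex Cover,
# shown only as a standard illustration of parameterization.
# The graph is given as a set of edges (pairs of vertices).

def has_vertex_cover(edges, k):
    """Return True iff the edge set can be covered by at most k vertices."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = next(iter(edges))
    # Any cover must contain u or v; branch on both choices.
    for chosen in (u, v):
        remaining = {e for e in edges if chosen not in e}
        if has_vertex_cover(remaining, k - 1):
            return True
    return False

if __name__ == "__main__":
    # A 4-cycle a-b-c-d-a: covered by {a, c}, but not by any single vertex.
    edges = {("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")}
    print(has_vertex_cover(edges, 1))  # False
    print(has_vertex_cover(edges, 2))  # True
```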


2004 ◽  
Vol 11 (9) ◽  
Author(s):  
Ivan B. Damgård ◽  
Serge Fehr ◽  
Louis Salvail

The concept of zero-knowledge (ZK) has become of fundamental importance in cryptography. However, in a setting where entities are modeled by quantum computers, classical arguments for proving ZK fail to hold since, in the quantum setting, the concept of rewinding is not generally applicable. Moreover, known classical techniques that avoid rewinding have various shortcomings in the quantum setting.

We propose new techniques for building quantum zero-knowledge (QZK) protocols, which remain secure even under (active) quantum attacks. We obtain computational QZK proofs and perfect QZK arguments for any NP language in the common reference string model. This is based on a general method converting an important class of classical honest-verifier ZK (HVZK) proofs into QZK proofs, which leads to quite practical protocols when the underlying HVZK proof is efficient. These are the first proof protocols enjoying these properties, and in particular the first to achieve perfect QZK.

As part of our construction, we propose a general framework for building unconditionally hiding (trapdoor) string commitment schemes secure against quantum attacks, as well as concrete instantiations based on specific (believed to be) hard problems. This is of independent interest, as these are the first unconditionally hiding string commitment schemes withstanding quantum attacks.

Finally, we give a partial answer to the question of whether QZK is possible in the plain model. We propose a new notion of QZK, non-oblivious verifier QZK, which is strictly stronger than honest-verifier QZK but weaker than full QZK, and we show that this notion can be achieved by means of efficient (quantum) protocols.
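
To illustrate the hiding/binding terminology (a classical didactic example of ours, not the paper's quantum-secure framework), a toy Pedersen commitment is unconditionally hiding because the commitment value is statistically independent of the message; note that its binding rests on the discrete-log assumption and therefore does not withstand quantum attacks, unlike the schemes constructed in the paper.

```python
# Toy Pedersen commitment over a small prime-order group, included only to
# illustrate "unconditionally hiding": c = g^m * h^r mod p is statistically
# independent of m when r is uniform.  NOT the paper's construction, and its
# binding relies on discrete log, so it is not secure against quantum
# attackers.  Parameters are tiny toy values for demonstration only.

import secrets

p = 1019                       # prime with q = 509 dividing p - 1
q = 509
g = pow(2, (p - 1) // q, p)    # generator of the order-q subgroup
h = pow(3, (p - 1) // q, p)    # second generator; log_g(h) assumed unknown

def commit(m: int):
    """Commit to m in Z_q; returns (commitment, opening randomness)."""
    r = secrets.randbelow(q)
    c = (pow(g, m % q, p) * pow(h, r, p)) % p
    return c, r

def verify(c: int, m: int, r: int) -> bool:
    return c == (pow(g, m % q, p) * pow(h, r, p)) % p

if __name__ == "__main__":
    c, r = commit(42)
    print(verify(c, 42, r))   # True
    print(verify(c, 43, r))   # False
```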


1962 ◽  
Vol 11 (02) ◽  
pp. 137-143
Author(s):  
M. Schwarzschild

It is perhaps one of the most important characteristics of the past decade in astronomy that the evolution of some major classes of astronomical objects has become accessible to detailed research. The theory of the evolution of individual stars has developed into a substantial body of quantitative investigations. The evolution of galaxies, particularly of our own, has clearly become a subject for serious research. Even the history of the solar system, this close-by intriguing puzzle, may soon make the transition from being a subject of speculation to being a subject of detailed study in view of the fast flow of new data obtained with new techniques, including space-craft.


Author(s):  
M.A. Parker ◽  
K.E. Johnson ◽  
C. Hwang ◽  
A. Bermea

We have reported the dependence of the magnetic and recording properties of CoPtCr recording media on the thickness of the Cr underlayer. It was inferred from XRD data that grain-to-grain epitaxy of the Cr with the CoPtCr was responsible for the interaction observed between these layers. However, no cross-sectional TEM (XTEM) work was performed to confirm this inference. In this paper, we report the application of new techniques for preparing XTEM specimens from actual magnetic recording disks, and for layer-by-layer micro-diffraction with an electron probe elongated parallel to the surface of the deposited structure, which elucidate the effect of the crystallographic structure of the Cr on that of the CoPtCr.

XTEM specimens were prepared from magnetic recording disks by modifying a technique used to prepare semiconductor specimens. After 3 mm disks were prepared per the standard XTEM procedure, these disks were lapped using a tripod polishing device. A grid with a single 1 mm × 2 mm hole was then glued with M-bond 610 to the polished side of the disk.


Author(s):  
P. Pradère ◽  
J.F. Revol ◽  
R. St. John Manley

Although radiation damage is the limiting factor in HREM of polymers, new techniques based on low-dose imaging at low magnification have permitted lattice images to be obtained from very radiation-sensitive polymers such as polyethylene (PE). This paper describes the computer averaging of P4MP1 lattice images. P4MP1 is even more sensitive than PE (total end-point dose of 27 C m⁻² as compared to 100 C m⁻² for PE at 120 kV). It does, however, have the advantage of forming flat crystals from dilute solution, and no change in d-spacings is observed during irradiation.

Crystals of P4MP1 were grown at 60°C in xylene (polymer concentration 0.05%). Electron microscopy was performed with a Philips EM 400 T microscope equipped with a Low Dose Unit and operated at 120 kV. Imaging conditions were the same as already described elsewhere. Enlarged micrographs were digitized and processed with the Spider image processing system.
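
As a rough illustration of what computer averaging of noisy lattice images involves (a generic numpy sketch of translational correlation averaging, not the Spider pipeline used by the authors), aligning many noisy copies by cross-correlation and averaging them raises the signal-to-noise ratio roughly as the square root of the number of images:

```python
# Generic sketch of translational correlation averaging with numpy.
# Didactic stand-in for lattice-image averaging, not the authors' workflow.

import numpy as np

def align_and_average(images, reference):
    """Shift each noisy image onto the reference via FFT cross-correlation,
    then average the aligned stack."""
    aligned = []
    for img in images:
        # The cross-correlation peak gives the translational offset.
        cc = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(img))).real
        dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
        aligned.append(np.roll(np.roll(img, dy, axis=0), dx, axis=1))
    return np.mean(aligned, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "lattice": a periodic 2-D motif, randomly shifted and very noisy.
    # (In practice the reference is one raw image or an iteratively refined average.)
    y, x = np.mgrid[0:64, 0:64]
    motif = np.cos(2 * np.pi * x / 8) + np.cos(2 * np.pi * y / 8)
    noisy = [np.roll(motif, int(rng.integers(0, 64)), axis=1)
             + rng.normal(0, 3, motif.shape) for _ in range(50)]
    avg = align_and_average(noisy, motif)
    # Residual noise should drop to about 3 / sqrt(50) ≈ 0.4.
    print("residual noise std after averaging:", round(float(np.std(avg - motif)), 2))
```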


Author(s):  
Antonia M. Milroy

In recent years many new techniques and instruments for three-dimensional visualization of electron microscopic images have become available. Higher accelerating voltages that allow thicker sections to be photographed at a tilt for stereo viewing, or the use of confocal microscopy, help to analyze biological material without the necessity of serial sectioning. However, when determining the presence of neurotransmitter receptors or biochemical substances within the nervous system, the need for good serial sectioning (Figs. 1 and 2) remains. The advent of computer-assisted reconstruction, and the possibility of feeding information from the specimen viewing chamber directly into a computer via a camera mounted on the electron microscope column, facilitates serial analysis. Detailed information observed at the subcellular level is more precise and extensive, and the complexities of interactions within the nervous system can be further elucidated.

We emphasize that serial ultrathin sectioning can be performed routinely and consistently in multiple-user electron microscopy laboratories. Initial tissue fixation and embedding must be of high quality.


2000 ◽  
Vol 10 (2) ◽  
pp. 4-5
Author(s):  
Robin A. Samlan ◽  
Paul W. Flint ◽  
Celia Bassich-Zeren

1994 ◽  
Vol 27 (3) ◽  
pp. 487-510 ◽  
Author(s):  
William Hal Martin ◽  
John W. Schwegler ◽  
Audrey L. Gleeson ◽  
Yong-Bing Shi
