# Quantum verification of NP problems with single photons and linear optics

2021 ◽
Vol 10 (1) ◽
Author(s):
Aonan Zhang ◽
Hao Zhan ◽
Junjie Liao ◽
Kaimin Zheng ◽
Tao Jiang ◽
...

Abstract: Quantum computing seeks to realize hardware-optimized algorithms for application-related computational tasks. NP (nondeterministic polynomial time) is a complexity class containing many important but intractable problems, such as the satisfiability of potentially conflicting constraints (SAT). According to the well-founded exponential time hypothesis, verifying a SAT instance of size n generally requires the complete solution as an O(n)-bit proof. In contrast, quantum verification algorithms, which encode the solution into quantum bits rather than classical bit strings, can perform the verification task with quadratically less information about the solution, using $$\tilde{O}(\sqrt{n})$$ qubits. Here we realize a quantum verification machine for SAT with single photons and linear optics. Using tunable optical setups, we efficiently verify satisfiable and unsatisfiable SAT instances and achieve a clear completeness-soundness gap even in the presence of experimental imperfections. The protocol requires only unentangled photons, linear operations on multiple modes, and at most two-photon joint measurements. These features make the protocol suitable for photonic realization and scalable to large problem sizes with advances in high-dimensional quantum information manipulation and large-scale linear-optical systems. Our results open an essentially new route toward quantum advantages and extend the computational capability of optical quantum computing.
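The classical side of the contrast drawn in this abstract can be made concrete: a classical SAT verifier needs the full assignment, one bit per variable, as its proof. Below is a minimal illustrative sketch in Python; the DIMACS-style clause encoding and the tiny instance are invented for illustration and are not taken from the paper.

```python
# Illustrative sketch (not from the paper): classical verification of a SAT
# instance. Classically the proof is the full n-bit assignment; the quantum
# protocol described above compresses this to roughly sqrt(n) qubits.

def verify_sat(clauses, assignment):
    """Check a CNF formula against a candidate assignment.

    clauses: list of clauses, each a list of nonzero ints (DIMACS style:
             literal k means variable |k| is True if k > 0, False if k < 0).
    assignment: dict mapping variable index -> bool (the O(n)-bit proof).
    """
    for clause in clauses:
        # A clause is satisfied if at least one of its literals is true.
        if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
            return False  # this clause is unsatisfied
    return True

# (x1 OR NOT x2) AND (x2 OR x3)
clauses = [[1, -2], [2, 3]]
print(verify_sat(clauses, {1: True, 2: True, 3: False}))   # True
print(verify_sat(clauses, {1: False, 2: True, 3: False}))  # False
```

The point of the contrast: this check is fast, but the proof itself carries one bit per variable, which is exactly what the quantum protocol reduces.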

2004 ◽
Author(s):
Selim Shahriar

2021 ◽
Vol 113 (4) ◽
pp. 252-258
Author(s):
A. I. Galimov ◽
M. V. Rakhlin ◽
G. V. Klimko ◽
Yu. A. Guseva ◽
...

2004 ◽
Author(s):
J. D. Franson

2019 ◽
Author(s):
Elizabeth Behrman ◽
Nam Nguyen ◽
James Steck

Noise and decoherence are two major obstacles to the implementation of large-scale quantum computing. Because of the no-cloning theorem, which says we cannot make an exact copy of an arbitrary quantum state, simple redundancy will not work in a quantum context, and unwanted interactions with the environment can destroy coherence and thus the quantum nature of the computation. Because of their parallel and distributed nature, classical neural networks have long been used successfully to deal with incomplete or damaged data. In this work, we show that our model of a quantum neural network (QNN) is similarly robust to noise and, in addition, robust to decoherence. Moreover, robustness to noise and decoherence is not only maintained but improved as the size of the system is increased. Noise and decoherence may even be advantageous in training, as they help correct for overfitting. We demonstrate this robustness using entanglement as a means of pattern storage in a qubit array. Our results provide evidence that machine learning approaches can obviate otherwise recalcitrant problems in quantum computing.

2007 ◽
Vol 18 (11) ◽
p. 34
Author(s):
Zhen-Sheng Yuan ◽
Yu-Ao Chen ◽
Shuai Chen ◽
Jian-Wei Pan


This chapter suggests how individual netizens or companies can uncover "pushing hand" operations. It is vitally important that Internet users, whether corporations or individuals, acquire some knowledge and skill in identifying Internet mercenary marketing schemes, since unrestricted information manipulation has grown to such a scale that one media report claimed 70% of visits on the Chinese Internet derived from pushing-hand operations. Evaluating information and deciding whether it is in fact a genuine recommendation from netizens or managed information from pushing hands is not an easy task. Several clues for evaluating online information are provided.

2021 ◽
pp. 142-185
Author(s):
Andrew V. Z. Brower ◽
Randall T. Schuh

This chapter evaluates “quantitative cladistics” in detail, including the issues of fit, parsimony algorithms, and character weighting. Although systematists have long associated characters with taxa, the relationship between character data and “phylogeny” has not always been obvious. The ideas of Willi Hennig clarified this relationship, and the formalization of these concepts in a quantitative method, via the parsimony criterion, allowed for computer implementation of phylogenetic inference and the feasible solution of previously intractable problems. It is this computational capability that took the study of taxonomic relationships from an almost purely qualitative and speculative enterprise to one dominated by the use of computer software and “objective” methodologies. The chapter then discusses the use, advantages, and disadvantages of maximum likelihood and Bayesian techniques as alternative approaches to the application of parsimony.
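The parsimony criterion the chapter describes can be computed mechanically once a tree is fixed. A hedged sketch using Fitch's algorithm, one standard small-parsimony scorer (the tree shape and character states below are invented for illustration, not drawn from the chapter):

```python
# Sketch of the small-parsimony idea: count the minimum number of
# character-state changes a fixed binary tree requires to explain the
# states observed at its leaves (Fitch's algorithm).

def fitch_score(tree, leaf_states):
    """Return (possible root states, minimum changes) for one character.

    tree: nested tuples of leaf names, e.g. (('A', 'B'), ('C', 'D')).
    leaf_states: dict mapping leaf name -> observed state (e.g. 'G').
    """
    if isinstance(tree, str):                      # leaf node
        return {leaf_states[tree]}, 0
    (ls, lc), (rs, rc) = (fitch_score(t, leaf_states) for t in tree)
    if ls & rs:                                    # states agree: no change
        return ls & rs, lc + rc
    return ls | rs, lc + rc + 1                    # disagreement: one change

tree = (('A', 'B'), ('C', 'D'))
states = {'A': 'G', 'B': 'G', 'C': 'T', 'D': 'G'}
_, changes = fitch_score(tree, states)
print(changes)  # 1
```

Scoring every candidate tree this way and keeping the one with the fewest total changes is what makes the parsimony criterion computable, which is the shift from qualitative to algorithmic systematics the chapter describes.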

Author(s):
Bruce Levinson

2019 ◽
Vol 56 (10) ◽
pp. 9-10
Author(s):
Mark Anderson

2020 ◽
Vol 4 (3) ◽
pp. 24
Author(s):
Noah Cowper ◽
Harry Shaw ◽
David Thayer

The ability to send information securely is a vital aspect of today's society, and with the developments in quantum computing, new ways to communicate must be researched. We explored a novel application of quantum key distribution (QKD) combined with synchronized chaos, which was used to mask a transmitted message. Because this communication scheme is not hampered by the need to send single photons, it is not vulnerable to photon-number-splitting attacks, unlike QKD schemes that rely on single-photon emission. We modeled an eavesdropper who gains the maximum possible information about the key during the initial setup and then listens to the key reconciliation to gain more. We proved that the amount of information an eavesdropper can gain during the communication is bounded, and that this bound is insufficient to decode the message.
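The masking idea can be illustrated with a toy synchronized-chaos sketch: both parties seed identical chaotic maps with a shared secret (such as a QKD-derived key) and use the resulting identical sequences to mask and unmask the signal. The logistic map, its parameter, and the signal values below are assumptions for illustration; the paper's actual chaotic system and QKD layer are not reproduced here.

```python
# Toy sketch (illustrative assumptions, not the paper's scheme): a shared
# seed drives two synchronized logistic maps, giving both parties the same
# chaotic keystream for masking and unmasking a message.

def logistic_stream(x0, n, r=3.99):
    """Generate n chaotic values from the logistic map x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

def mask(message, key_seed):
    """Sender: add the chaotic keystream to the message samples."""
    stream = logistic_stream(key_seed, len(message))
    return [m + s for m, s in zip(message, stream)]

def unmask(masked, key_seed):
    """Receiver: regenerate the same keystream and subtract it."""
    stream = logistic_stream(key_seed, len(masked))
    return [c - s for c, s in zip(masked, stream)]

msg = [0.1, 0.5, 0.9]
cipher = mask(msg, key_seed=0.4213)              # looks chaotic on the wire
recovered = unmask(cipher, key_seed=0.4213)      # same seed -> same stream
print(all(abs(a - b) < 1e-12 for a, b in zip(msg, recovered)))  # True
```

The security of the real scheme rests on the QKD layer protecting the seed and on the information bound proved in the paper, not on this toy map, whose sensitivity to the seed merely illustrates why an eavesdropper without the key cannot regenerate the stream.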