Variational quantum algorithms for trace distance and fidelity estimation

Author(s):  
Ranyiliu Chen ◽  
Zhixin Song ◽  
Xuanqiang Zhao ◽  
Xin Wang

Abstract Estimating the difference between quantum data is crucial in quantum computing. However, as typical characterizations of quantum data similarity, the trace distance and quantum fidelity are believed to be exponentially hard to evaluate in general. In this work, we introduce hybrid quantum-classical algorithms for these two distance measures on near-term quantum devices, where no assumption on the input states is required. First, we introduce the Variational Trace Distance Estimation (VTDE) algorithm. In particular, we provide a technique to extract the desired spectrum information of any Hermitian matrix by local measurement. A novel variational algorithm for trace distance estimation is then derived from this technique, with the assistance of a single ancillary qubit. Notably, VTDE can avoid the barren plateau issue with logarithmic-depth circuits thanks to a local cost function. Second, we introduce the Variational Fidelity Estimation (VFE) algorithm. We combine Uhlmann's theorem and the freedom in purification to translate the estimation task into an optimization problem over a unitary on an ancillary system with fixed purified inputs. We then provide a purification subroutine to complete the translation. Both algorithms are verified by numerical simulations and experimental implementations, exhibiting high accuracy for randomly generated mixed states.
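The two quantities the variational algorithms above estimate have standard closed-form definitions. A minimal classical sketch (NumPy/SciPy; this is the exact baseline computation, not the variational method itself) that also checks the Fuchs–van de Graaf inequalities relating them:

```python
import numpy as np
from scipy.linalg import sqrtm

def random_density_matrix(dim, rng):
    # Ginibre construction: rho = G G† / Tr(G G†) is a valid mixed state.
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho)

def trace_distance(rho, sigma):
    # T(rho, sigma) = (1/2) ||rho - sigma||_1 = half the sum of singular values.
    return 0.5 * np.sum(np.linalg.svd(rho - sigma, compute_uv=False))

def fidelity(rho, sigma):
    # Uhlmann fidelity: F(rho, sigma) = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2.
    s = sqrtm(rho)
    return float(np.real(np.trace(sqrtm(s @ sigma @ s))) ** 2)

rng = np.random.default_rng(0)
rho, sigma = random_density_matrix(4, rng), random_density_matrix(4, rng)
T, F = trace_distance(rho, sigma), fidelity(rho, sigma)
# Fuchs–van de Graaf: 1 - sqrt(F) <= T <= sqrt(1 - F)
```

Both computations require the full density matrices, which is exactly what becomes intractable at scale and motivates the variational estimators.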

Symmetry ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 436
Author(s):  
Ruirui Zhao ◽  
Minxia Luo ◽  
Shenggang Li

Picture fuzzy sets, which are an extension of intuitionistic fuzzy sets, can deal with inconsistent information better in practical applications. A distance measure is an important mathematical tool to calculate the degree of difference between picture fuzzy sets. Although some distance measures of picture fuzzy sets have been constructed, there are some unreasonable and counterintuitive cases. The main reason is that the existing distance measures do not, or seldom, consider the refusal degree of picture fuzzy sets. In order to resolve these unreasonable and counterintuitive cases, in this paper we propose a dynamic distance measure of picture fuzzy sets based on a picture fuzzy point operator. Through a numerical comparison and multi-criteria decision-making problems, we show that the proposed distance measure is reasonable and effective.
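To make the refusal-degree point concrete, here is an illustrative normalized Hamming-style distance between picture fuzzy sets (this is a generic textbook-style measure for demonstration, not the point-operator measure the paper proposes). Each element carries positive, neutral, and negative membership degrees (mu, eta, nu) with mu + eta + nu <= 1, and the refusal degree pi = 1 - mu - eta - nu enters the distance explicitly:

```python
def pfs_distance(A, B):
    # A, B: lists of (mu, eta, nu) triples over the same universe.
    total = 0.0
    for (m1, e1, n1), (m2, e2, n2) in zip(A, B):
        p1 = 1 - m1 - e1 - n1   # refusal degree of A's element
        p2 = 1 - m2 - e2 - n2   # refusal degree of B's element
        total += (abs(m1 - m2) + abs(e1 - e2)
                  + abs(n1 - n2) + abs(p1 - p2)) / 4
    return total / len(A)

A = [(0.5, 0.3, 0.1), (0.6, 0.2, 0.1)]
B = [(0.4, 0.3, 0.2), (0.5, 0.1, 0.2)]
d = pfs_distance(A, B)
```

Dropping the `abs(p1 - p2)` term recovers the intuitionistic-style measures that the abstract says produce counterintuitive results.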


Electronics ◽  
2021 ◽  
Vol 10 (14) ◽  
pp. 1690
Author(s):  
Teague Tomesh ◽  
Pranav Gokhale ◽  
Eric R. Anschuetz ◽  
Frederic T. Chong

Many quantum algorithms for machine learning require access to classical data in superposition. However, for many natural data sets and algorithms, the overhead required to load the data set in superposition can erase any potential quantum speedup over classical algorithms. Recent work by Harrow introduces a new paradigm in hybrid quantum-classical computing to address this issue, relying on coresets to minimize the data loading overhead of quantum algorithms. We investigated using this paradigm to perform k-means clustering on near-term quantum computers, by casting it as a QAOA optimization instance over a small coreset. We used numerical simulations to compare the performance of this approach to classical k-means clustering. We were able to find data sets with which coresets work well relative to random sampling and where QAOA could potentially outperform standard k-means on a coreset. However, finding data sets where both coresets and QAOA work well—which is necessary for a quantum advantage over k-means on the entire data set—appears to be challenging.
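The classical half of the pipeline described above, clustering a small weighted coreset in place of the full data set, can be sketched as follows. The sensitivity-style sampling here is a deliberately crude stand-in (sampling proportional to distance from the mean), not the coreset construction of the paper, and the QAOA step is replaced by weighted Lloyd iterations:

```python
import numpy as np

def weighted_kmeans(points, weights, k, iters=50, seed=0):
    # Lloyd's algorithm with per-point weights, as needed on a coreset.
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            mask = labels == j
            if mask.any():
                centers[j] = np.average(points[mask], axis=0, weights=weights[mask])
    return centers

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-3, 1, (200, 2)), rng.normal(3, 1, (200, 2))])

# Toy importance sampling: pick 20 points with probability proportional to
# distance from the global mean, then reweight by 1 / (n * probability) so
# the coreset estimates sums over the full data set without bias.
sens = np.linalg.norm(data - data.mean(0), axis=1) + 1e-9
prob = sens / sens.sum()
idx = rng.choice(len(data), size=20, replace=False, p=prob)
coreset, w = data[idx], 1.0 / (len(data) * prob[idx])
centers = weighted_kmeans(coreset, w, k=2)
```

In the hybrid scheme, the cluster assignment over the 20 coreset points would be encoded as a QAOA instance rather than solved by Lloyd iterations; the reweighting logic is the same.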


2014 ◽  
Vol 21 (04) ◽  
pp. 1450010
Author(s):  
Toru Fuda

By carrying out appropriate continuous quantum measurements with a family of projection operators, a unitary channel can be approximated to arbitrary precision in the trace norm sense. In particular, the quantum Zeno effect is described as an application. In the infinite-dimensional case, although the von Neumann entropy is not necessarily continuous, the difference between the entropies of the states above can be made arbitrarily small under some conditions.
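The quantum Zeno effect mentioned above is easy to demonstrate numerically in the simplest setting: a qubit evolving under H = X leaves |0⟩ completely by time π/2, yet frequent projective measurements onto |0⟩ freeze it there. A minimal sketch (a two-level toy model, not the paper's continuous-measurement construction):

```python
import numpy as np

def survival_probability(total_time, n_measurements):
    # Evolve |0> under H = X, projecting back onto |0> after each short step.
    dt = total_time / n_measurements
    U = np.array([[np.cos(dt), -1j * np.sin(dt)],
                  [-1j * np.sin(dt), np.cos(dt)]])   # exp(-i X dt)
    psi = np.array([1, 0], dtype=complex)
    prob = 1.0
    for _ in range(n_measurements):
        psi = U @ psi
        prob *= abs(psi[0]) ** 2                     # P(measurement yields |0>)
        psi = np.array([1, 0], dtype=complex)        # collapse back onto |0>
    return prob

# One measurement at t = pi/2 finds the state fully rotated away (P ~ 0);
# many frequent measurements pin it to |0> (P -> 1): cos(t/N)^(2N) -> 1.
probs = [survival_probability(np.pi / 2, n) for n in (1, 10, 100, 1000)]
```

The survival probability here is cos(T/N)^(2N), which tends to 1 as the measurement rate N grows, matching the Zeno limit.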


Author(s):  
H. A. Fitzhugh

As we contemplate the challenge of feeding more than 8 billion people (more than three quarters living in developing countries), the even greater challenge will be feeding their grandchildren. Consideration of competition between livestock and mankind for nutrients must include both near-term food needs and long-term sustainability of agricultural production systems. Producing more livestock products at the expense of eroding the natural resource base is not an acceptable solution. Livestock have been denigrated as both competitors for food and degraders of the natural resource base for food production. These often emotionally argued allegations against livestock generally do not stand up to objective analysis. Livestock are most often complementary elements of food production systems, converting otherwise unused feed sources to highly desired food and livestock products such as leather and wool. Moreover, well-managed livestock are positive contributors to the natural resource base supporting balanced agricultural systems. In this chapter, the following points are addressed from the perspective of the current and future role for livestock in feeding 8 billion people:

• Growing demands for human food and livestock feed
• Domesticated food-producing animals
• World livestock production systems
• Human food preferences and requirements
• Dietary requirements and conversion efficiencies
• Contributions of science to livestock improvement

The overarching issue is the difference between the current and future role for livestock in developed and in developing regions. Less than 11 percent of the global land mass of 13.3 billion hectares is cultivated; the remainder supports permanent pasture, 26%; forest, 31%; and other nonagricultural uses, 32% (U.N. data as cited by Waggoner, 1994). The concerns about competition between livestock and mankind for nutrients center primarily on grains and legumes grown on arable land. Even the most avid vegetarians have little taste for the forages and other herbaceous materials from pasturelands, forests, roadsides, and fence rows that are consumed by livestock. Since the 18th century, the amount of land cultivated has increased from approximately 0.3 to 1.5 billion ha (Richards, 1990, as cited by Waggoner, 1994). This increase in cultivated land has primarily come at the expense of forest and grasslands.


2020 ◽  
Vol 125 (12) ◽  
Author(s):  
B. Foxen ◽  
C. Neill ◽  
A. Dunsworth ◽  
P. Roushan ◽  
B. Chiaro ◽  
...  

2020 ◽  
Vol 8 ◽  
Author(s):  
Hai-Ping Cheng ◽  
Erik Deumens ◽  
James K. Freericks ◽  
Chenglong Li ◽  
Beverly A. Sanders

Chemistry is considered one of the more promising scientific applications of near-term quantum computing. Recent work in transitioning classical algorithms to a quantum computer has led to great strides in improving quantum algorithms and illustrating their quantum advantage. Because of the limitations of near-term quantum computers, the most effective strategies split the work over classical and quantum computers. There is a proven set of methods in computational chemistry and materials physics that has used this same idea of splitting a complex physical system into parts that are treated at different levels of theory to obtain solutions for the complete physical system for which a brute-force solution with a single method is not feasible. These methods are variously known as embedding, multi-scale, and fragment techniques. We review these methods and then propose the embedding approach as a method for describing complex biochemical systems, with the parts not only treated with different levels of theory, but computed with hybrid classical and quantum algorithms. Such strategies are critical if one wants to expand the focus to biochemical molecules that contain active regions that cannot be properly explained with traditional algorithms on classical computers. While we do not solve this problem here, we provide an overview of where the field is going to enable such problems to be tackled in the future.
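The split-the-system idea reviewed above is often realized as a subtractive (ONIOM-style) energy expression: treat the whole system cheaply, then correct the active fragment at the expensive level. A minimal sketch, with placeholder energy functions (in the hybrid setting the review describes, `high_level` would be the quantum-computed solver and `low_level` a classical one; these names and the per-site toy energies are assumptions for illustration):

```python
def embedded_energy(high_level, low_level, full_system, fragment):
    # Subtractive embedding: E = E_low(full) + E_high(frag) - E_low(frag).
    # The low-level description of the fragment is swapped out for the
    # high-level one, while the environment stays at the cheap level.
    return (low_level(full_system)
            + high_level(fragment)
            - low_level(fragment))

# Toy "levels of theory": energies proportional to system size, with
# different per-site constants standing in for method accuracy.
low  = lambda sites: -1.00 * len(sites)
high = lambda sites: -1.05 * len(sites)

full = list(range(10))   # whole molecule
frag = full[:3]          # chemically active region
E = embedded_energy(high, low, full, frag)
```

The appeal for near-term hardware is that only `high_level(fragment)` needs a quantum computer, and the fragment can be kept small enough to fit on it.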


2020 ◽  
Vol 2020 (8) ◽  
Author(s):  
Ali Mollabashi ◽  
Kotaro Tamaoka

Abstract We study odd entanglement entropy (odd entropy for short), a candidate measure for mixed states holographically dual to the entanglement wedge cross section, in two-dimensional free scalar field theories. Our study is restricted to Gaussian states of scale-invariant theories as well as their finite-temperature generalizations, for which we show that odd entropy is a well-defined measure for mixed states. Motivated by holographic results, the difference between odd and von Neumann entropy is also studied. In particular, we show that large amounts of quantum correlations ensure the odd entropy to be larger than the von Neumann entropy, which is qualitatively consistent with the holographic CFT. In general cases, we also find that this difference is not even a monotonic function with respect to the size of (and distance between) subsystems.
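The von Neumann entropy that odd entropy is compared against has a simple spectral definition, S(ρ) = -Tr ρ log ρ. A minimal two-qubit sketch (odd entropy itself requires a replica construction and is not reproduced here; this only illustrates the baseline quantity):

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log rho), computed from the eigenvalue spectrum.
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]        # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

def reduced_state(psi):
    # Partial trace over the second qubit of a two-qubit pure state.
    m = psi.reshape(2, 2)
    return m @ m.conj().T

# |psi(theta)> = cos(theta)|00> + sin(theta)|11> interpolates from a
# product state (S = 0) to a maximally entangled Bell state (S = log 2).
theta = np.pi / 4
psi = np.array([np.cos(theta), 0, 0, np.sin(theta)])
S = von_neumann_entropy(reduced_state(psi))
```

For Gaussian states of the free scalar theories studied above, both entropies are computed from two-point correlators rather than from explicit density matrices, but the spectral definition is the same.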


2017 ◽  
Vol 114 (46) ◽  
pp. 12338-12343 ◽  
Author(s):  
Noah Scovronick ◽  
Mark B. Budolfson ◽  
Francis Dennig ◽  
Marc Fleurbaey ◽  
Asher Siebert ◽  
...  

Future population growth is uncertain and matters for climate policy: higher growth entails more emissions and means more people will be vulnerable to climate-related impacts. We show that how future population is valued importantly determines mitigation decisions. Using the Dynamic Integrated Climate-Economy model, we explore two approaches to valuing population: a discounted version of total utilitarianism (TU), which considers total wellbeing and is standard in social cost of carbon dioxide (SCC) models, and average utilitarianism (AU), which ignores population size and sums only each time period’s discounted average wellbeing. Under both approaches, as population increases the SCC increases, but optimal peak temperature decreases. The effect is larger under TU, because it responds to the fact that a larger population means climate change hurts more people: for example, in 2025, assuming the United Nations (UN)-high rather than UN-low population scenario entails an increase in the SCC of 85% under TU vs. 5% under AU. The difference in the SCC between the two population scenarios under TU is comparable to commonly debated decisions regarding time discounting. Additionally, we estimate the avoided mitigation costs implied by plausible reductions in population growth, finding that large near-term savings ($billions annually) occur under TU; savings under AU emerge in the more distant future. These savings are larger than spending shortfalls for human development policies that may lower fertility. Finally, we show that whether lowering population growth entails overall improvements in wellbeing—rather than merely cost savings—again depends on the ethical approach to valuing population.
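The TU/AU distinction above reduces to whether population size multiplies per-capita wellbeing inside the discounted sum. A toy contrast (illustrative numbers only, not DICE outputs): two population paths with identical per-capita wellbeing are valued equally under AU but not under TU.

```python
def tu_value(population, percap_wellbeing, beta=0.97):
    # Discounted total utilitarianism: population scales each period's value.
    return sum(beta**t * n * u
               for t, (n, u) in enumerate(zip(population, percap_wellbeing)))

def au_value(population, percap_wellbeing, beta=0.97):
    # Discounted average utilitarianism: population size is ignored.
    return sum(beta**t * u
               for t, (n, u) in enumerate(zip(population, percap_wellbeing)))

wellbeing = [1.0, 1.0, 1.0]
pop_high = [8.0, 9.0, 10.0]   # billions, faster growth
pop_low  = [8.0, 8.5, 9.0]    # billions, slower growth

v_au_high, v_au_low = au_value(pop_high, wellbeing), au_value(pop_low, wellbeing)
v_tu_high, v_tu_low = tu_value(pop_high, wellbeing), tu_value(pop_low, wellbeing)
```

Because TU values the extra people directly, damages that reduce per-capita wellbeing are costlier under the high-population path, which is why the SCC responds so much more strongly to the population scenario under TU than under AU.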


2021 ◽  
Vol 8 ◽  
Author(s):  
Hirotaka Osawa ◽  
Atsushi Kawagoe ◽  
Eisuke Sato ◽  
Takuya Kato

The authors evaluate the extent to which a user’s impression of an AI agent can be improved by giving the agent the abilities of self-estimation, thinking time, and coordination of risk tendency. The authors modified the algorithm of an AI agent in the cooperative game Hanabi to have all of these traits, and investigated the change in the user’s impression when the agent played with the user. The authors used a self-estimation task to evaluate the effect that the ability to read a user’s intention had on the impression. The authors also show that an agent’s thinking time influences the impression it makes. The authors further investigated the relationship between the concordance of the risk-taking tendencies of players and agents, the player’s impression of agents, and the game experience. The results of the self-estimation task experiment showed that the more accurate the agent’s self-estimation, the more likely the partner is to perceive humanity, affinity, intelligence, and communication skills in the agent. The authors also found that an agent that changes the length of its thinking time according to the priority of its action gives the impression of being smarter than an agent with a normal thinking time or an agent that randomly changes its thinking time, provided the player notices the difference in thinking time. The result of the experiment on concordance of risk-taking tendencies shows that this concordance influences the player’s impression of agents. These results suggest that game agent designers can improve the player’s disposition toward an agent and the game experience by adjusting the agent’s self-estimation level, thinking time, and risk-taking tendency according to the player’s personality and inner state during the game.


Quantum ◽  
2021 ◽  
Vol 5 ◽  
pp. 492
Author(s):  
Philippe Suchsland ◽  
Francesco Tacchino ◽  
Mark H. Fischer ◽  
Titus Neupert ◽  
Panagiotis Kl. Barkoutsos ◽  
...  

We present a hardware agnostic error mitigation algorithm for near term quantum processors inspired by the classical Lanczos method. This technique can reduce the impact of different sources of noise at the sole cost of an increase in the number of measurements to be performed on the target quantum circuit, without additional experimental overhead. We demonstrate through numerical simulations and experiments on IBM Quantum hardware that the proposed scheme significantly increases the accuracy of cost functions evaluations within the framework of variational quantum algorithms, thus leading to improved ground-state calculations for quantum chemistry and physics problems beyond state-of-the-art results.
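The classical Lanczos idea behind the scheme above can be sketched numerically: from noisy Hamiltonian moments ⟨H^k⟩ one builds small Hankel matrices and solves a generalized eigenvalue problem, whose lowest eigenvalue is a Krylov-subspace estimate of the ground-state energy. The toy model below (a depolarized ground state of a diagonal Hamiltonian) is an assumption for illustration, not the paper's hardware experiment:

```python
import numpy as np
from scipy.linalg import eigh

def lanczos_mitigated_energy(rho, H, order=2):
    # Moments m_k = Tr(rho H^k); on hardware these come from the extra
    # measurements of powers of H that the method requires.
    m = [np.real(np.trace(rho @ np.linalg.matrix_power(H, k)))
         for k in range(2 * order)]
    S = np.array([[m[i + j] for j in range(order)] for i in range(order)])
    T = np.array([[m[i + j + 1] for j in range(order)] for i in range(order)])
    # Lowest eigenvalue of T v = E S v is the order-`order` Krylov estimate.
    return eigh(T, S, eigvals_only=True)[0]

# Noisy ground state: depolarized with probability p = 0.2.
H = np.diag([-1.0, 0.3, 0.7, 1.0])
ground = np.zeros((4, 4)); ground[0, 0] = 1.0
rho = 0.8 * ground + 0.2 * np.eye(4) / 4

raw = np.real(np.trace(rho @ H))                 # unmitigated: -0.75
mitigated = lanczos_mitigated_energy(rho, H)     # much closer to -1.0
```

The mitigated estimate always stays within the spectrum of H (the generalized eigenvalues are Gauss-quadrature nodes of the noisy spectral measure), which is why the correction cannot overshoot the true ground-state energy.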

