Evaluation of parameterized quantum circuits: on the relation between classification accuracy, expressibility, and entangling capability

2021
Vol 3 (1)
Author(s): Thomas Hubregtsen, Josef Pichlmeier, Patrick Stecher, Koen Bertels

Abstract
An active area of investigation in the search for quantum advantage is quantum machine learning. Quantum machine learning, and in particular parameterized quantum circuits in a hybrid quantum-classical setup, could bring advancements in accuracy by exploiting the high dimensionality of the Hilbert space as a feature space. But is the ability of a quantum circuit to uniformly address the Hilbert space a good indicator of classification accuracy? In our work, we use methods and quantifications from prior art to perform a numerical study evaluating the level of correlation. We find a moderate to strong correlation between the ability of a circuit to uniformly address the Hilbert space and the achieved classification accuracy for circuits that entail a single embedding layer followed by one or two repetitions of a circuit design. This is based on our study encompassing 19 circuits in both 1- and 2-layer configurations, evaluated on 9 datasets of increasing difficulty. We also evaluate the correlation between entangling capability and classification accuracy in a similar setup, and find only a weak correlation. Future work will focus on evaluating whether this holds for different circuit designs.
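For context, the "ability to uniformly address the Hilbert space" is commonly quantified as expressibility in the sense of Sim et al.: the KL divergence between the circuit's fidelity distribution over random parameters and that of Haar-random states. Below is a minimal numerical sketch of that quantifier, assuming Qiskit and NumPy are available; the two-qubit ansatz is an arbitrary stand-in, not one of the 19 circuits from the study.

```python
# Estimate expressibility as KL(P_circuit || P_Haar) over state fidelities;
# lower values mean more uniform coverage of the Hilbert space.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def ansatz(params, n_qubits=2):
    qc = QuantumCircuit(n_qubits)
    for i in range(n_qubits):
        qc.ry(params[i], i)
    qc.cx(0, 1)                       # single entangling gate (illustrative)
    for i in range(n_qubits):
        qc.rz(params[n_qubits + i], i)
    return qc

def expressibility(n_qubits=2, n_params=4, n_samples=2000, bins=50, seed=0):
    rng = np.random.default_rng(seed)
    fids = []
    for _ in range(n_samples):        # fidelities between random parameter pairs
        a = Statevector.from_instruction(ansatz(rng.uniform(0, 2 * np.pi, n_params)))
        b = Statevector.from_instruction(ansatz(rng.uniform(0, 2 * np.pi, n_params)))
        fids.append(np.abs(a.inner(b)) ** 2)
    hist, edges = np.histogram(fids, bins=bins, range=(0, 1))
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    # Haar fidelity density for dimension N = 2**n: p(F) = (N-1)(1-F)**(N-2)
    q = (2**n_qubits - 1) * (1 - centers) ** (2**n_qubits - 2)
    q = q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

print(expressibility())
```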

2018
Vol 16 (08)
pp. 1840006
Author(s): Davide Ferrari, Michele Amoretti

Quantum compiling is the fast, device-aware implementation of quantum algorithms (i.e., quantum circuits, in the quantum circuit model of computation). In this paper, we present a strategy for compiling IBM Q-aware, low-depth quantum circuits that generate Greenberger–Horne–Zeilinger (GHZ) entangled states. The resulting compiler can replace the QISKit compiler for the specific purpose of obtaining improved GHZ circuits. GHZ states are well known to have several practical applications, including quantum machine learning. We illustrate our experience in implementing and querying a uniform quantum example oracle based on the GHZ circuit, for solving the classically hard problem of learning parity with noise.
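As an aside on the circuits being compiled: a GHZ state can be prepared with entangling depth O(log n) by letting every already-entangled qubit fan the superposition out in parallel. The sketch below is a plain Qiskit construction showing that idea, not the authors' coupling-map-aware compiler.

```python
# Low-depth GHZ preparation: after the initial Hadamard, each round lets every
# qubit already in the GHZ register seed one new qubit with a CNOT, so the
# number of CNOT layers grows logarithmically in n.
from qiskit import QuantumCircuit

def ghz_log_depth(n):
    qc = QuantumCircuit(n)
    qc.h(0)
    ready = [0]                      # qubits already entangled into the GHZ state
    while len(ready) < n:
        next_free = len(ready)
        for src in list(ready):      # snapshot: gates in this round act in parallel
            if next_free >= n:
                break
            qc.cx(src, next_free)
            ready.append(next_free)
            next_free += 1
    return qc

print(ghz_log_depth(8).draw())       # 3 CNOT layers instead of 7
```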


2020
Author(s): Wenjie Liu, Ying Zhang, Zhiliang Deng, Jiaojiao Zhao, Lian Tong

Abstract
As an emerging field that aims to bridge the gap between human activities and computing systems, human-centered computing (HCC) in cloud, edge, and fog environments has had a huge impact on artificial intelligence algorithms. The quantum generative adversarial network (QGAN) is considered one of the quantum machine learning algorithms with great application prospects, but it should also be improved to conform to the human-centered paradigm. The generation process of a QGAN is relatively random, and the generated model does not conform to the human-centered concept, so it is not well suited to real scenarios. To solve these problems, a hybrid quantum-classical conditional generative adversarial network (QCGAN) algorithm is proposed, which is a knowledge-driven human-computer interaction computing mode that can be implemented in the cloud. The purposes of stabilizing the generation process and realizing the interaction between the human and the computing process are achieved by inputting conditional information into the generator and discriminator. The generator uses a parameterized quantum circuit with an all-to-all connected topology, which facilitates the tuning of network parameters during the training process. The discriminator uses a classical neural network, which effectively avoids the "input bottleneck" of quantum machine learning. Finally, the BAS training set is selected to conduct experiments on a quantum cloud computing platform. The results show that the QCGAN algorithm can effectively converge to the Nash equilibrium point after training and perform human-centered classification generation tasks.
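For illustration, a minimal sketch of the generator ingredients the abstract names (condition input, trainable rotations, an all-to-all entangling topology), assuming Qiskit; the gate choices and the basis-flip conditioning scheme are stand-ins, not the authors' exact design.

```python
# Conditional parameterized-circuit generator sketch: condition bits are
# injected as basis flips, followed by trainable single-qubit rotations and an
# all-to-all entangling layer over every qubit pair.
from itertools import combinations
from qiskit import QuantumCircuit

def conditional_generator(params, condition_bits):
    n = len(condition_bits)
    qc = QuantumCircuit(n)
    for i, bit in enumerate(condition_bits):   # encode the condition label
        if bit:
            qc.x(i)
    for i in range(n):                         # trainable rotation layer
        qc.ry(params[i], i)
    for i, j in combinations(range(n), 2):     # all-to-all connected topology
        qc.cz(i, j)
    for i in range(n):                         # second trainable layer
        qc.ry(params[n + i], i)
    return qc

print(conditional_generator([0.1, 0.2, 0.3, 0.4, 0.5, 0.6], [1, 0, 1]).draw())
```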


Author(s): Bhanu Chander

The basic idea of artificial intelligence and machine learning is that machines can learn from data and previous experience and apply what they have learned to future tasks. In today's digitalized, big-data world, long-established machine learning methods are applied, with the requisite high-quality computational resources, to numerous useful and realistic tasks. At the same time, quantum machine learning methods promise to work exponentially faster than their classical counterparts by making use of quantum mechanics. By taking advantage of quantum effects such as interference or entanglement, quantum computers can efficiently solve selected problems that are believed to be hard for traditional machines. Quantum computing is, perhaps unexpectedly, closely related to kernel methods in machine learning. Hence, this chapter covers quantum computation, advances in QML techniques, QML kernel spaces and optimization, and future directions for QML.
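A minimal sketch of the kernel connection mentioned above, assuming Qiskit: the state prepared by a data-encoding circuit defines a kernel k(x1, x2) = |&lt;phi(x1)|phi(x2)&gt;|^2 that a classical SVM can consume. The ZZFeatureMap choice is illustrative, not prescribed by the chapter.

```python
# Quantum kernel sketch: embed classical points as quantum states and use the
# squared state overlap as the kernel value.
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit.quantum_info import Statevector

feature_map = ZZFeatureMap(feature_dimension=2, reps=2)

def quantum_kernel(x1, x2):
    sv1 = Statevector.from_instruction(feature_map.assign_parameters(x1))
    sv2 = Statevector.from_instruction(feature_map.assign_parameters(x2))
    return np.abs(sv1.inner(sv2)) ** 2

# Gram matrix for a toy dataset; usable with e.g. sklearn.svm.SVC(kernel="precomputed")
X = np.random.default_rng(0).uniform(0, 1, size=(4, 2))
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
print(K)
```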


2002
Vol 13 (07)
pp. 917-929
Author(s): Hans-Georg Matuttis, Kurt Fischer, Nobuyasu Ito, Masamichi Ishikawa

One obstacle in the simulation of quantum circuits is the high dimension of the Hilbert space. Using auxiliary-field decompositions known from many-particle simulation, we can transform the mathematical description of the quantum circuit into a combination of low-dimensional product states, which can be sampled using Monte Carlo techniques. We demonstrate the method using Simon's algorithm for the detection of the period of a function.
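To convey the flavor of such decompositions (a toy analogue, not the paper's auxiliary-field scheme): an entangling gate can be written as a sum of one-qubit product operators, e.g. CZ = (I⊗I + Z⊗I + I⊗Z - Z⊗Z)/2, so sampling one term per entangling gate keeps every Monte Carlo sample a low-dimensional product state, and averaging over samples recovers amplitudes.

```python
# Toy Monte Carlo over product states: estimate <00| CZ (H⊗H) |00> by sampling
# one product term of the CZ decomposition per shot (importance sampling).
import numpy as np

I = np.eye(2)
Z = np.diag([1.0, -1.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# CZ = sum_k c_k * A_k ⊗ B_k with only single-qubit factors
CZ_TERMS = [(0.5, I, I), (0.5, Z, I), (0.5, I, Z), (-0.5, Z, Z)]

def sample_amplitude(rng):
    qubits = [H @ np.array([1.0, 0.0]) for _ in range(2)]  # product state after H⊗H
    probs = np.array([abs(c) for c, _, _ in CZ_TERMS])
    probs /= probs.sum()
    k = rng.choice(len(CZ_TERMS), p=probs)                 # pick one product term
    c, A, B = CZ_TERMS[k]
    weight = c / probs[k]                                  # unbiases the estimate
    return weight * (A @ qubits[0])[0] * (B @ qubits[1])[0]

rng = np.random.default_rng(1)
est = np.mean([sample_amplitude(rng) for _ in range(20000)])
print(est)  # converges to the exact amplitude 0.5
```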


2020
Vol 3 (1)
Author(s): H. Chen, L. Wossnig, S. Severini, H. Neven, M. Mohseni

Abstract
Recent results have demonstrated the successful application of quantum-classical hybrid methods to train quantum circuits for a variety of machine learning tasks. A natural question to ask is consequently whether we can also train such quantum circuits to discriminate quantum data, i.e., perform classification on data stored in the form of quantum states. Although quantum mechanics fundamentally forbids deterministic discrimination of non-orthogonal states, we show in this work that it is possible to train a quantum circuit to discriminate such data with a trade-off between minimizing the error rate and the inconclusiveness rate of the classification task. Our approach simultaneously achieves performance close to the theoretically optimal values and the ability to generalize to previously unseen quantum data. This generalization power distinguishes our work from previous circuit-optimization results and furthermore provides an example of a quantum machine learning task that inherently has no classical analogue.
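For reference, the trade-off has known analytic endpoints for two pure states: the Helstrom bound gives the minimum error probability when every shot must commit to a label, while unambiguous discrimination removes all errors at the price of an inconclusive rate equal to the state overlap (for equal priors). A small numeric check of both endpoints, with arbitrarily chosen states:

```python
# Endpoints of the error/inconclusiveness trade-off for two pure states.
import numpy as np

psi = np.array([1.0, 0.0])                    # |0>
phi = np.array([1.0, 1.0]) / np.sqrt(2)       # |+>
overlap = abs(np.vdot(psi, phi))

p_err_helstrom = 0.5 * (1 - np.sqrt(1 - overlap**2))  # minimum-error strategy
p_inconclusive = overlap                              # unambiguous strategy

print(f"Helstrom error rate:     {p_err_helstrom:.4f}")  # ~0.1464
print(f"Inconclusive rate (IDP): {p_inconclusive:.4f}")  # ~0.7071
```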


2020
Vol 10 (1)
Author(s): Karol Bartkiewicz, Clemens Gneiting, Antonín Černoch, Kateřina Jiráková, Karel Lemr, ...

2021
Vol 251
pp. 03070
Author(s): Vasilis Belis, Samuel González-Castillo, Christina Reissel, Sofia Vallecorsa, Elías F. Combarro, ...

We have developed two quantum classifier models for the ttH classification problem, both of which fall into the category of hybrid quantum-classical algorithms for Noisy Intermediate-Scale Quantum (NISQ) devices. Our results, along with other studies, serve as a proof of concept that Quantum Machine Learning (QML) methods can perform similarly to or better than conventional ML methods, in specific cases with a low number of training samples, even with the limited number of qubits available in current hardware. To utilise algorithms with a low number of qubits, accommodating the limitations of both simulation hardware and real quantum hardware, we investigated different feature reduction methods and assessed their impact on the performance of both the classical and quantum models. We addressed different implementations of two QML models, representative of the two main approaches to supervised quantum machine learning today: a Quantum Support Vector Machine (QSVM), a kernel-based method, and a Variational Quantum Circuit (VQC), a variational approach.
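A minimal sketch of the VQC side, assuming Qiskit and feature-reduced two-dimensional inputs; the ansatz, observable, and loss are illustrative stand-ins, not the authors' configuration.

```python
# Variational quantum classifier sketch: angle-encode reduced features, apply a
# trainable layer, and read a Pauli-Z expectation value as the class score.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, SparsePauliOp

N_QUBITS = 2

def vqc(x, theta):
    qc = QuantumCircuit(N_QUBITS)
    for i in range(N_QUBITS):
        qc.ry(x[i], i)              # data-encoding layer
    qc.cx(0, 1)                     # entangler
    for i in range(N_QUBITS):
        qc.ry(theta[i], i)          # trainable layer
    return qc

def score(x, theta):
    sv = Statevector.from_instruction(vqc(x, theta))
    return np.real(sv.expectation_value(SparsePauliOp("ZI")))  # in [-1, 1]

def loss(theta, X, y):              # labels y in {-1, +1}
    return np.mean([(score(x, theta) - yi) ** 2 for x, yi in zip(X, y)])

# Training could use a gradient-free optimizer, e.g.:
# from scipy.optimize import minimize
# res = minimize(loss, np.zeros(N_QUBITS), args=(X_train, y_train), method="COBYLA")
```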


2020
Vol 13 (5)
pp. 1020-1030
Author(s): Pradeep S., Jagadish S. Kallimani

Background: With the advent of data analysis and machine learning, there is a growing impetus to analyze and generate models on historic data. The data comes in numerous forms and shapes, with an abundance of challenges. The form of data most amenable to analysis is numerical data; with the plethora of available algorithms and tools, it is quite manageable to deal with. Another form of data is categorical in nature, subdivided into ordinal (ordered) and nominal (unordered) data. Such data can be broadly classified as sequential or non-sequential, and sequential data is the easier of the two to preprocess algorithmically.

Objective: This paper deals with the challenge of applying machine learning algorithms to categorical data of a non-sequential nature.

Methods: Implementing many data analysis algorithms on such data yields biased results, which makes it impossible to generate a reliable predictive model. In this paper, we address this problem by walking through a handful of techniques which, during our research, helped us deal with large categorical data of a non-sequential nature. In subsequent sections, we discuss possible implementable solutions and the shortfalls of these techniques.

Results: The methods were applied to sample datasets available in the public domain, and the results with respect to classification accuracy are satisfactory.

Conclusion: The best preprocessing technique we observed in our research is one-hot encoding, which breaks categorical features down into binary columns that can be fed into an algorithm to predict the outcome. The example that we took is not abstract: it is a real-time production-services dataset with many complex variations of categorical features. Our future work includes creating a robust model on such data and deploying it into industry-standard applications.
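A minimal sketch of the one-hot preprocessing singled out in the conclusion, assuming pandas; the column names and values here are made up for illustration.

```python
# One-hot encoding: each nominal (unordered, non-sequential) feature is
# expanded into binary indicator columns before model training.
import pandas as pd

df = pd.DataFrame({
    "service_type": ["batch", "stream", "batch", "adhoc"],  # hypothetical columns
    "region":       ["eu",    "us",     "us",    "eu"],
})

encoded = pd.get_dummies(df, columns=["service_type", "region"])
print(encoded)
# Produces columns such as service_type_batch, service_type_stream, region_eu, ...
# which can be fed directly to most classifiers.
```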

