Randomized Benchmarking as Convolution: Fourier Analysis of Gate Dependent Errors

Quantum ◽  
2021 ◽  
Vol 5 ◽  
pp. 581
Author(s):  
Seth T. Merkel ◽  
Emily J. Pritchett ◽  
Bryan H. Fong

We show that the Randomized Benchmarking (RB) protocol is a convolution amenable to Fourier-space analysis. By adopting the mathematical framework of Fourier transforms of matrix-valued functions on groups established in recent work of Gowers and Hatami \cite{GH15}, we provide an alternative proof of Wallman's \cite{Wallman2018} and Proctor's \cite{Proctor17} bounds on the effect of gate-dependent noise on randomized benchmarking. We show explicitly that, as long as our faulty gate-set is close to the targeted representation of the Clifford group, an RB sequence is described by the exponential decay of a process that has exactly two eigenvalues close to one and the rest close to zero. This framework also allows us to construct a gauge in which the average gate-set error is a depolarizing channel parameterized by the RB decay rates, as well as a gauge that maximizes the fidelity with respect to the ideal gate-set.
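For concreteness: the two dominant eigenvalues described above are what justify the familiar single-exponential RB fit $F(m) = A p^m + B$. Below is a minimal sketch (not the authors' code) fitting that model to synthetic sequence-fidelity data with NumPy and SciPy; the name `rb_model` and all numbers are illustrative assumptions.

```python
# Minimal sketch: fit the standard RB decay F(m) = A * p**m + B to
# synthetic sequence-fidelity data. Illustrative only, not the authors' code.
import numpy as np
from scipy.optimize import curve_fit

def rb_model(m, A, p, B):
    # Two dominant eigenvalues of the averaged channel give A * p**m + B.
    return A * p**m + B

rng = np.random.default_rng(0)
m = np.arange(1, 201, 10)                                  # sequence lengths
data = rb_model(m, 0.5, 0.98, 0.5) + 0.005 * rng.standard_normal(m.size)

(A, p, B), _ = curve_fit(rb_model, m, data, p0=[0.5, 0.95, 0.5])
print(f"fitted p = {p:.4f}, avg error rate r = {(1 - p) / 2:.5f}")  # (d-1)/d, d=2
```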

Author(s):  
E. Voelkl ◽  
L. F. Allard

The conventional discrete Fourier transform can be extended to a discrete Extended Fourier transform (EFT). The EFT allows to work with discrete data in close analogy to the optical bench, where continuous data are processed. The EFT includes a capability to increase or decrease the resolution in Fourier space (thus the argument that CCD cameras with a higher number of pixels to increase the resolution in Fourier space is no longer valid). Fourier transforms may also be shifted with arbitrary increments, which is important in electron holography. Still, the analogy between the optical bench and discrete optics on a computer is limited by the Nyquist limit. In this abstract we discuss the capability with the EFT to change the initial sampling rate si of a recorded or simulated image to any other(final) sampling rate sf.
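As a rough illustration of changing the sampling rate via Fourier space, here is a generic zero-padding/cropping sketch, not the authors' EFT algorithm; `fourier_resample` is a hypothetical helper.

```python
# Minimal sketch of Fourier-domain resampling: zero-padding (or cropping)
# the spectrum changes the sampling rate of the reconstructed signal.
# Illustrates the general idea only, not the authors' EFT.
import numpy as np

def fourier_resample(x, n_out):
    """Resample a real 1-D signal to n_out points via its FFT."""
    X = np.fft.rfft(x)
    n_in = x.size
    Y = np.zeros(n_out // 2 + 1, dtype=complex)
    k = min(X.size, Y.size)
    Y[:k] = X[:k]                      # keep (or drop) Fourier coefficients
    return np.fft.irfft(Y, n_out) * (n_out / n_in)   # preserve amplitude

t = np.linspace(0, 1, 64, endpoint=False)
x = np.sin(2 * np.pi * 5 * t)
y = fourier_resample(x, 256)           # 4x finer sampling of the same band
```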


2017 ◽  
Vol 83 (4) ◽  
Author(s):  
J. Guadagni ◽  
A. J. Cerfon

We present a fast and spectrally accurate numerical scheme for the evaluation of the gyroaveraged electrostatic potential in non-periodic gyrokinetic-Poisson simulations. Our method relies on a reformulation of the gyrokinetic-Poisson system in which the gyroaverage in Poisson's equation is computed for the compactly supported charge density instead of the non-periodic, non-compactly supported potential itself. We calculate this gyroaverage with a combination of two Fourier transforms and a Hankel transform, which has the near-optimal run-time complexity $O(N_{\rho}(P+\hat{P})\log(P+\hat{P}))$, where $P$ is the number of spatial grid points, $\hat{P}$ the number of grid points in Fourier space, and $N_{\rho}$ the number of grid points in velocity space. We present numerical examples illustrating the performance of our code and demonstrating geometric convergence of the error.
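For intuition about the spectral structure such schemes exploit: in a doubly periodic setting, gyroaveraging over a ring of radius $\rho$ reduces to multiplying each Fourier mode by $J_0(|k|\rho)$. The sketch below shows only that standard identity with NumPy/SciPy; it omits the paper's Hankel transform and non-periodic treatment, and `gyroaverage_periodic` is an illustrative name.

```python
# Minimal sketch: in Fourier space, the gyroaverage for gyroradius rho
# multiplies each mode by J0(|k| * rho). Periodic toy case only.
import numpy as np
from scipy.special import j0

def gyroaverage_periodic(f, rho, L=2 * np.pi):
    """Gyroaverage a doubly periodic field f on [0, L)^2 for a single rho."""
    n = f.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    return np.fft.ifft2(np.fft.fft2(f) * j0(np.hypot(kx, ky) * rho)).real

n = 128
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.cos(3 * X) * np.sin(2 * Y)
phi_bar = gyroaverage_periodic(phi, rho=0.4)
```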


Author(s):  
Robert J Marks II

In this chapter, we present applications of Fourier analysis to probability, random variables, and stochastic processes [1089, 1097, 1387, 1329]. A random variable, $X$, is the assignment of a number to the outcome of a random experiment. We can, for example, flip a coin and assign an outcome of heads as $X = 1$ and tails as $X = 0$. Often the number is equated to the numerical outcome of the experiment, such as the number of dots on the face of a rolled die or the measurement of a voltage in a noisy circuit. The cumulative distribution function is defined by $F_X(x) = \Pr[X \le x]$ (4.1). The probability density function is the derivative $f_X(x) = \frac{d}{dx} F_X(x)$. Our treatment of random variables focuses on the use of Fourier analysis. Due to this viewpoint, the development we use is unconventional and begins immediately in the next section with a discussion of properties of the probability density function.
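A small numerical example in the spirit of this viewpoint (my illustration, not the chapter's code): the characteristic function $\Phi_X(\omega) = E[e^{i\omega X}]$ is the Fourier transform of the density $f_X$, checked here for a standard normal against the closed form $e^{-\omega^2/2}$.

```python
# Minimal sketch: the characteristic function is the Fourier transform of
# the density. Standard normal case, compared with exp(-w**2 / 2).
import numpy as np

x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)        # standard normal density

w = np.linspace(-3, 3, 7)
# Phi_X(w) = integral of f(x) * exp(i w x) dx, approximated by a Riemann sum
phi = (f * np.exp(1j * np.outer(w, x))).sum(axis=1) * dx
print(np.max(np.abs(phi - np.exp(-w**2 / 2))))    # tiny: the sum is spectrally accurate here
```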


Perception ◽  
1996 ◽  
Vol 25 (1_suppl) ◽  
pp. 2-2 ◽  
Author(s):  
A J Ahumada

Letting external noise rather than internal noise limit discrimination performance allows information to be extracted about the observer's stimulus classification rule. A perceptual classification image is the correlation over trials between the noise amplitude at a spatial location and the observer's responses. If, for example, the observer followed the rule of the ideal observer, the response correlation image would be an estimate of the ideal observer filter, the difference between the two unmasked images being discriminated. Perceptual classification images were estimated for a Vernier discrimination task. The display screen had 48 pixels deg⁻¹ horizontally and vertically. The no-offset image had a dark horizontal line of 4 pixels, a 1 pixel space, and 4 more dark pixels. Classification images were based on 1600 discrimination trials with the line contrast adjusted to keep the error rate near 25%. In the offset image, the second line was one pixel higher. Unlike the ideal observer filter (a horizontal dipole), the observer's perceptual classification images are strongly oriented. Fourier transforms of the classification images had a peak amplitude near 1 cycle deg⁻¹ and an orientation near 25 deg. The spatial spread is much more than image blur predicts, and probably indicates the spatial position uncertainty in the task.
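For readers unfamiliar with the technique: a classification image is estimated by correlating the trial-by-trial external noise with the responses. A minimal sketch under toy assumptions (simulated observer, hypothetical names; not the author's analysis code):

```python
# Minimal sketch of reverse correlation: the classification image is the
# response-conditioned difference of the external noise fields over trials.
import numpy as np

rng = np.random.default_rng(1)
n_trials, h, w = 1600, 16, 16
noise = rng.standard_normal((n_trials, h, w))          # external noise per trial

template = np.zeros((h, w)); template[8, 4:12] = 1.0   # toy observer filter
resp = (np.tensordot(noise, template) + 0.5 * rng.standard_normal(n_trials)) > 0

# Classic estimate: mean noise on "signal" responses minus mean on "no" responses
class_image = noise[resp].mean(axis=0) - noise[~resp].mean(axis=0)
```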


1987 ◽  
Vol 65 (2) ◽  
pp. 456-457 ◽  
Author(s):  
Norman Gee ◽  
Gordon R. Freeman ◽  
A. V. Anantaraman

In the recent work of Anantaraman (Can. J. Chem. 64, 46 (1986)) the conversion from kinematic viscosity to dynamic viscosity was incorrect. We calculated correct values from the tabulated data. We also give corrected values of the ideal mixture viscosity, the Grunberg–Nissan deviation parameter, and the Hind interaction parameter.
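The correction turns on the defining relation between the two quantities: dynamic viscosity $\eta$ equals kinematic viscosity $\nu$ times density $\rho$. A one-line sketch with illustrative numbers (not the paper's data):

```python
# Minimal sketch of the conversion at issue (illustrative numbers only):
# dynamic viscosity = kinematic viscosity * density.
nu = 0.74e-6      # kinematic viscosity, m^2/s
rho = 870.0       # mixture density, kg/m^3
eta = nu * rho    # dynamic viscosity, Pa*s
print(f"eta = {eta:.3e} Pa*s")   # 6.438e-04 Pa*s
```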


2017 ◽  
Vol 112 (2) ◽  
pp. 219-230 ◽  
Author(s):  
JOEL E. LANDIS

Recent work by party scholars reveals a widening gap between the normative ideals we set out for political parties and the empirical evidence that reveals their deep and perhaps insurmountable shortcomings in realizing these ideals. This disjunction invites us to consider the perspective of David Hume, who offers a theory of the value and proper function of parties that is resilient to the pessimistic findings of recent empirical scholarship. I analyze Hume's writings to show that the psychological experience of party informs the opinions by which governments can be considered legitimate. Hume thus invites us to consider the essential role parties might play in securing legitimacy as that ideal is practiced or understood by citizens, independent of the ideal understandings of legitimacy currently being articulated by theorists. My analysis contributes both to recent party scholarship and to our understanding of the role of parties in Hume's theory of allegiance.


1994 ◽  
Vol 7 (1) ◽  
pp. 43-59 ◽  
Author(s):  
Christine M. Koggel

Affirmative action generates so much controversy that very often proponents and opponents both fail to understand the other’s position. A recent work by Michel Rosenfeld convincingly argues that the incommensurability of the opposing sides is based on fundamental disagreements about the meaning of such concepts as equality and justice: “the affirmative action debate is not between persons who are ‘pro-equality’ and others who are ‘anti-equality’. Both the most ardent advocates of affirmative action and its most vehement foes loudly proclaim their allegiance to the ideal of equality.” Within a liberal framework, two conceptions of equality are commonly defended—formal and substantive equality of opportunity. Both conceptions assume background conditions of the scarcity of goods, a need to compete for educational, social and economic benefits, and the value of rewards for fair competition as a means to individual self-development and self-realization. In the first section, I outline each conception briefly, summarize the sorts of affirmative action each defends, and show that the irreconcilability of the opposing sides is ultimately grounded in different conceptions of the self. I then go on to argue that both conceptions limit our understanding of selves and ultimately constrain attempts to achieve equality in a context in which individuals are also members of groups with identities formed in historical contexts of discrimination.


2020 ◽  
Vol 1 (1) ◽  
pp. 4
Author(s):  
Carlos R. Baiz

Fourier transforms (FTs) are ubiquitous in chemistry, physics, and biology. Although FTs are a core component of multiple experimental techniques, undergraduate courses typically approach them from a mathematical perspective, leaving students with little intuition for how an FT works. Here, I introduce interactive teaching tools for upper-level undergraduate courses and describe a practical lesson plan for FTs. The materials include a computer program to capture video from a webcam and display the original images side by side with the corresponding plot in the Fourier domain. Several patterns are included to be printed on paper and held up to the webcam as input. During the lesson, students are asked to predict the features observed in the FT and then place the patterns in front of the webcam to test their predictions. This interactive approach enables students with limited mathematical skills to build intuition for how FTs translate patterns from real space into the corresponding Fourier space.
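A minimal sketch of this kind of tool (an assumption-laden stand-in, not the author's published program), grabbing webcam frames with OpenCV and displaying the live image next to its log-magnitude 2-D FFT:

```python
# Minimal sketch of a webcam-to-Fourier-space teaching demo.
# Shows the live image beside the log-magnitude FFT. Press 'q' to quit.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)              # default webcam; device index may differ
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(float)
    spec = np.fft.fftshift(np.fft.fft2(gray))    # center the zero frequency
    mag = np.log1p(np.abs(spec))                 # compress the dynamic range
    mag = (255 * mag / mag.max()).astype(np.uint8)
    cv2.imshow("image | FT", np.hstack([gray.astype(np.uint8), mag]))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```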


1975 ◽  
Vol 12 (2) ◽  
pp. 110-116
Author(s):  
V. Krishnan

Ever since Schwartz's exposition of the theory of distributions, Fourier methods have been accepted for functions hitherto not amenable to rigorous Fourier analysis (e.g., impulse functions). The introduction of the concepts of functions of ‘slow growth’ and ‘rapid decay’ provides a reinterpretation of classical Fourier analysis.
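A standard worked example of that reinterpretation (added here for illustration): the Fourier transform of the Dirac impulse, defined by duality against test functions of rapid decay (Schwartz functions).

```latex
% The impulse has no classical transform; as a tempered distribution,
% its transform is defined by duality against Schwartz test functions:
\langle \widehat{\delta}, \varphi \rangle
  = \langle \delta, \widehat{\varphi} \rangle
  = \widehat{\varphi}(0)
  = \int_{-\infty}^{\infty} \varphi(x)\, dx
  = \langle 1, \varphi \rangle ,
\qquad \text{hence } \widehat{\delta} = 1 .
```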

