Correction to “A Unifying Variational Perspective on Some Fundamental Information Theoretic Inequalities”

2016 ◽  
Vol 62 (7) ◽  
pp. 4356-4357
Author(s):  
Sangwoo Park ◽  
Erchin Serpedin ◽  
Khalid Qaraqe

Author(s):  
Matthew Coudron ◽  
Jalex Stark ◽  
Thomas Vidick

Abstract
The generation of certifiable randomness is the most fundamental information-theoretic task that meaningfully separates quantum devices from their classical counterparts. We propose a protocol for exponential certified randomness expansion using a single quantum device. The protocol calls for the device to implement a simple quantum circuit of constant depth on a 2D lattice of qubits. The output of the circuit can be verified classically in linear time, and is guaranteed to contain a polynomial number of certified random bits, assuming that the device used to generate the output operated using a (classical or quantum) circuit of sub-logarithmic depth. This assumption contrasts with the locality assumption underlying randomness certification based on Bell inequality violation, and with more recent proposals for randomness certification based on computational assumptions. Furthermore, to demonstrate randomness generation it suffices for a device to sample from the ideal output distribution within constant statistical distance. Our procedure is inspired by recent work of Bravyi et al. (Science 362(6412):308–311, 2018), who introduced a relational problem that can be solved by a constant-depth quantum circuit but provably cannot be solved by any classical circuit of sub-logarithmic depth. We develop the discovery of Bravyi et al. into a framework for robust randomness expansion. Our results lead to a new proposal for a demonstration of quantum advantage that offers several advantages over existing proposals. First, our proposal does not rest on any complexity-theoretic conjectures; instead, it relies on the physical assumption that the adversarial device being tested implements a circuit of sub-logarithmic depth. Second, success on our task can easily be verified classically in linear time. Finally, our task is more noise-tolerant than most existing proposals, which either tolerate only multiplicative error or require additional complexity-theoretic conjectures; in contrast, we are able to allow a small constant additive error in total variation distance between the sampled and ideal distributions.
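The noise-tolerance claim above is stated in terms of total variation (statistical) distance between a device's sampled output distribution and the ideal one. As a minimal illustrative sketch (not part of the protocol itself; the example distributions are hypothetical), this distance can be computed from empirical device output as follows:

```python
from collections import Counter

def total_variation(p, q):
    """Total variation distance between two distributions given as dicts
    mapping outcomes to probabilities: 0.5 * sum |p(x) - q(x)|."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

def empirical(samples):
    """Empirical distribution of a list of observed outcomes."""
    n = len(samples)
    return {x: c / n for x, c in Counter(samples).items()}

# Hypothetical ideal distribution and noisy device output.
ideal = {"00": 0.5, "11": 0.5}
samples = ["00"] * 48 + ["11"] * 47 + ["01"] * 5
print(total_variation(empirical(samples), ideal))
```

A device passes the (sketched) noise criterion as long as this distance stays below a fixed constant, which is a weaker requirement than matching every ideal probability up to multiplicative error.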


2019 ◽  
Vol 22 (03) ◽  
pp. 1950013
Author(s):  
OLIVER PFANTE ◽  
NILS BERTSCHINGER

Stochastic volatility models describe stock returns [Formula: see text] as driven by an unobserved process capturing the random dynamics of volatility [Formula: see text]. The present paper quantifies how much information about volatility [Formula: see text] and future stock returns can be inferred from past returns in stochastic volatility models in terms of Shannon’s mutual information. In particular, we show that across a wide class of stochastic volatility models, including a two-factor model, returns observed on the scale of seconds would be needed to obtain reliable volatility estimates. In addition, we prove that volatility forecasts beyond several weeks are essentially impossible for fundamental information-theoretic reasons.
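The quantity underlying this abstract is the mutual information between latent volatility and observable returns. As a minimal toy sketch (the one-factor AR(1) log-volatility model, all parameter values, and the plug-in binning estimator here are illustrative assumptions, not the paper's two-factor specification or estimation method):

```python
import math
import random

def simulate_sv(n, phi=0.95, sigma_eta=0.2, seed=1):
    """Simulate a basic log-normal stochastic volatility model:
    h_t = phi * h_{t-1} + sigma_eta * eta_t,  r_t = exp(h_t) * eps_t."""
    rng = random.Random(seed)
    h, vols, rets = 0.0, [], []
    for _ in range(n):
        h = phi * h + sigma_eta * rng.gauss(0, 1)
        s = math.exp(h)
        vols.append(s)
        rets.append(s * rng.gauss(0, 1))
    return vols, rets

def mutual_information(xs, ys, bins=8):
    """Plug-in estimate of I(X;Y) in bits via equal-width binning."""
    def digitize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0
        return [min(int((x - lo) / w), bins - 1) for x in v]
    bx, by = digitize(xs), digitize(ys)
    n = len(xs)
    pxy, px, py = {}, {}, {}
    for a, b in zip(bx, by):
        pxy[(a, b)] = pxy.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    mi = 0.0
    for (a, b), c in pxy.items():
        mi += (c / n) * math.log2(c * n / (px[a] * py[b]))
    return mi

vols, rets = simulate_sv(20000)
# |r_t| carries the volatility information; the sign of r_t carries none.
print(mutual_information(vols, [abs(r) for r in rets]))
```

The estimate is positive but modest, illustrating why many return observations are needed before volatility can be pinned down reliably.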


2002 ◽  
Vol 48 (8) ◽  
pp. 2377-2383 ◽  
Author(s):  
O.G. Guleryuz ◽  
E. Lutwak ◽  
Deane Yang ◽  
Gaoyong Zhang

1991 ◽  
Vol 37 (6) ◽  
pp. 1501-1518 ◽  
Author(s):  
A. Dembo ◽  
T.M. Cover ◽  
J.A. Thomas
