Petz reconstruction in random tensor networks

2020 ◽  
Vol 2020 (10) ◽  
Author(s):  
Hewei Frederic Jia ◽  
Mukund Rangamani

Abstract We illustrate the ideas of bulk reconstruction in the context of random tensor network toy models of holography. Specifically, we demonstrate how the Petz reconstruction map works to obtain bulk operators from the boundary data by exploiting the replica trick. We also take the opportunity to comment on the differences between coarse-graining and random projections.
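As a minimal numerical sketch of the recovery map the abstract refers to (a generic single-qubit illustration, not the paper's tensor-network construction): for a channel N and reference state σ, the Petz map is R(X) = σ^{1/2} N†(N(σ)^{-1/2} X N(σ)^{-1/2}) σ^{1/2}, and it recovers σ from N(σ) exactly. The channel and state below are assumed for illustration.

```python
import numpy as np

# Single-qubit depolarizing channel N(rho) = (1-p) rho + p I/2,
# written in Kraus form so the adjoint channel is easy to apply.
p = 0.3
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
kraus = [np.sqrt(1 - 3 * p / 4) * I2,
         np.sqrt(p / 4) * X, np.sqrt(p / 4) * Y, np.sqrt(p / 4) * Z]

def channel(rho):
    return sum(K @ rho @ K.conj().T for K in kraus)

def adjoint(rho):
    return sum(K.conj().T @ rho @ K for K in kraus)

def mat_pow(rho, a):
    """Hermitian matrix power via eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    return (v * w**a) @ v.conj().T

def petz(rho, sigma):
    """Petz recovery map R_{sigma,N}(rho)."""
    ns = channel(sigma)
    inner = mat_pow(ns, -0.5) @ rho @ mat_pow(ns, -0.5)
    return mat_pow(sigma, 0.5) @ adjoint(inner) @ mat_pow(sigma, 0.5)

sigma = np.diag([0.7, 0.3]).astype(complex)
recovered = petz(channel(sigma), sigma)
# The Petz map recovers the reference state exactly: R(N(sigma)) = sigma.
print(np.allclose(recovered, sigma))  # True
```

The exact recovery of σ itself is a general property; the paper's point is how well such a map recovers nearby bulk operators in the tensor-network setting.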

2018 ◽  
Vol 97 (12) ◽  
Author(s):  
Goffredo Chirco ◽  
Daniele Oriti ◽  
Mingyi Zhang

Quantum ◽  
2021 ◽  
Vol 5 ◽  
pp. 410
Author(s):  
Johnnie Gray ◽  
Stefanos Kourtis

Tensor networks represent the state-of-the-art in computational methods across many disciplines, including the classical simulation of quantum many-body systems and quantum circuits. Several applications of current interest give rise to tensor networks with irregular geometries. Finding the best possible contraction path for such networks is a central problem, with an exponential effect on computation time and memory footprint. In this work, we implement new randomized protocols that find very high quality contraction paths for arbitrary and large tensor networks. We test our methods on a variety of benchmarks, including the random quantum circuit instances recently implemented on Google quantum chips. We find that the paths obtained can be very close to optimal, and often many orders of magnitude better than the most established approaches. As different underlying geometries suit different methods, we also introduce a hyper-optimization approach, where both the method applied and its algorithmic parameters are tuned during the path finding. The increase in quality of contraction schemes found has significant practical implications for the simulation of quantum many-body systems and particularly for the benchmarking of new quantum chips. Concretely, we estimate a speed-up of over 10,000× compared to the original expectation for the classical simulation of the Sycamore 'supremacy' circuits.
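A small sketch of the path-finding problem, using NumPy's built-in greedy path finder rather than the authors' hyper-optimized protocols (the network shapes below are assumed for illustration): the contraction order changes the theoretical FLOP count, while the contracted result is path-independent.

```python
import numpy as np

# A small irregular network of four tensors sharing indices i, j, k, l, m.
rng = np.random.default_rng(0)
a = rng.random((8, 32))
b = rng.random((32, 32, 4))
c = rng.random((4, 64))
d = rng.random((64, 8))
expr = 'ij,jkl,lm,mi->k'

# einsum_path searches for a good pairwise contraction order.
path, report = np.einsum_path(expr, a, b, c, d, optimize='greedy')
print(report)  # lists each pairwise contraction and its theoretical cost

# The contracted tensor itself does not depend on the path chosen.
r1 = np.einsum(expr, a, b, c, d, optimize=False)
r2 = np.einsum(expr, a, b, c, d, optimize=path)
print(np.allclose(r1, r2))  # True
```

The paper's contribution lies in replacing the simple greedy heuristic with randomized, hyper-optimized searches that scale to much larger irregular networks.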


2020 ◽  
Vol 8 (1) ◽  
Author(s):  
Adam Jermyn

The evaluation of partition functions is a central problem in statistical physics. For lattice systems and other discrete models the partition function may be expressed as the contraction of a tensor network. Unfortunately, computing such contractions is difficult, and many methods to make this tractable require periodic or otherwise structured networks. Here I present a new algorithm for contracting unstructured tensor networks. This method makes no assumptions about the structure of the network and performs well in both structured and unstructured cases so long as the correlation structure is local.
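As a minimal sketch of a partition function written as a tensor network contraction (a textbook 1D example, not the paper's unstructured algorithm): the periodic Ising chain becomes a ring of identical bond tensors, and the contraction collapses to a trace of transfer-matrix products.

```python
import numpy as np
from itertools import product

beta, J, N = 0.5, 1.0, 6
# Transfer matrix: T[s, s'] = exp(beta * J * s * s') with spins +1, -1.
T = np.array([[np.exp(beta * J), np.exp(-beta * J)],
              [np.exp(-beta * J), np.exp(beta * J)]])

# Tensor-network contraction of the ring: Z = Tr(T^N)
Z_tn = np.trace(np.linalg.matrix_power(T, N))

# Brute-force check: sum over all 2^N spin configurations.
Z_bf = 0.0
for spins in product([1, -1], repeat=N):
    energy = sum(spins[i] * spins[(i + 1) % N] for i in range(N))
    Z_bf += np.exp(beta * J * energy)

print(np.isclose(Z_tn, Z_bf))  # True
```

In higher dimensions or on irregular graphs the contraction no longer reduces to a matrix power, which is exactly the regime the paper's algorithm targets.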


2019 ◽  
Vol 100 (13) ◽  
Author(s):  
Romain Vasseur ◽  
Andrew C. Potter ◽  
Yi-Zhuang You ◽  
Andreas W. W. Ludwig

2018 ◽  
Vol 175 ◽  
pp. 11015
Author(s):  
Hikaru Kawauchi ◽  
Shinji Takeda

The phase structure of the two-dimensional lattice CP(1) model in the presence of the θ term is analyzed by tensor network methods. The tensor renormalization group, a standard coarse-graining method for tensor networks, is used for both the θ = 0 and θ ≠ 0 regions. Loop-TNR, which is better suited to the analysis near criticality, is also implemented for θ = 0; its application to the θ ≠ 0 region is left for future work.
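A minimal sketch of the ingredients of a tensor-renormalization-group step, using the 2D Ising model as a stand-in (a standard construction, not the paper's CP(1) setup): the Boltzmann weight is split into a local rank-4 tensor, and each TRG step SVD-splits that tensor along a diagonal, truncating small singular values to keep the bond dimension bounded.

```python
import numpy as np

beta = 0.4
# Bond weight matrix M[s, s'] = exp(beta * s * s'), factored as M = W W^T.
M = np.array([[np.exp(beta), np.exp(-beta)],
              [np.exp(-beta), np.exp(beta)]])
w, v = np.linalg.eigh(M)        # M is positive definite for beta > 0
W = v * np.sqrt(w)              # columns scaled so that W @ W.T == M

# Local rank-4 site tensor: T[i,j,k,l] = sum_s W[s,i] W[s,j] W[s,k] W[s,l].
# Contracting two sites over a shared leg reproduces the bond weight M.
T = np.einsum('si,sj,sk,sl->ijkl', W, W, W, W)

# One TRG move: group legs pairwise, SVD, and truncate small singular
# values to control the bond dimension of the coarse-grained network.
mat = T.reshape(4, 4)           # (i, j) vs (k, l)
U, S, Vh = np.linalg.svd(mat)
print(S)                        # fast decay means truncation is accurate

# The full-rank reconstruction is exact.
print(np.allclose((U * S) @ Vh, mat))  # True
```

Loop-TNR improves on this basic move by filtering out short-range loop correlations before truncating, which is why it behaves better near criticality.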


Universe ◽  
2019 ◽  
Vol 5 (10) ◽  
pp. 211 ◽  
Author(s):  
Goffredo Chirco

This work is meant as a review summary of a series of recent results concerning the derivation of a holographic entanglement entropy formula for generic open spin network states in the group field theory (GFT) approach to quantum gravity. The statistical group-field computation of the Rényi entropy for a bipartite network state for a simple interacting GFT is reviewed, within a recently proposed dictionary between group field theories and random tensor networks, and with an emphasis on the problem of a consistent characterisation of the entanglement entropy in the GFT second quantisation formalism.
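The basic quantity being computed can be sketched independently of the GFT formalism (a generic quantum-information computation, with the state below assumed for illustration): for a bipartite pure state the n-th Rényi entropy is S_n = log(Tr ρ_A^n) / (1 − n), with ρ_A the reduced density matrix of one side.

```python
import numpy as np

def renyi_entropy(psi, dim_A, dim_B, n=2):
    """Renyi-n entropy of subsystem A for a pure state psi on A x B."""
    M = psi.reshape(dim_A, dim_B)
    rho_A = M @ M.conj().T                      # partial trace over B
    return np.log(np.trace(np.linalg.matrix_power(rho_A, n)).real) / (1 - n)

# Maximally entangled pair of qubits: rho_A = I/2, so S_2 = log 2.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(np.isclose(renyi_entropy(bell, 2, 2), np.log(2)))  # True
```

The replica computation reviewed in the paper evaluates the same Tr ρ_A^n, but as a statistical group-field average over random network states rather than for a single fixed state.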


2017 ◽  
Vol 2017 (8) ◽  
Author(s):  
Xiao-Liang Qi ◽  
Zhao Yang ◽  
Yi-Zhuang You

Author(s):  
Ian Convy ◽  
William Huggins ◽  
Haoran Liao ◽  
K Birgitta Whaley

Abstract Tensor networks have emerged as promising tools for machine learning, inspired by their widespread use as variational ansätze in quantum many-body physics. It is well known that the success of a given tensor network ansatz depends in part on how well it can reproduce the underlying entanglement structure of the target state, with different network designs favoring different scaling patterns. We demonstrate here how a related correlation analysis can be applied to tensor network machine learning, and explore whether classical data possess correlation scaling patterns similar to those found in quantum states, which might indicate the best network to use for a given dataset. We utilize mutual information as a measure of correlations in classical data, and show that it can serve as a lower bound on the entanglement needed for a probabilistic tensor network classifier. We then develop a logistic regression algorithm to estimate the mutual information between bipartitions of data features, and verify its accuracy on a set of Gaussian distributions designed to mimic different correlation patterns. Using this algorithm, we characterize the scaling patterns in the MNIST and Tiny Images datasets, and find clear evidence of boundary-law scaling in the latter. This quantum-inspired classical analysis offers insight into the design of tensor networks which are best suited for specific learning tasks.
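For jointly Gaussian features, the mutual information between two blocks of variables has a closed form in the covariance matrix, I(A;B) = (1/2) log(det C_A · det C_B / det C). This is a sketch of the same bipartition analysis the paper performs, not its logistic-regression estimator (which works for general distributions); the covariance below is assumed for illustration.

```python
import numpy as np

def gaussian_mi(C, k):
    """Mutual information between features [0:k] and [k:] of a Gaussian
    with covariance matrix C, in nats."""
    logdet = lambda M: np.linalg.slogdet(M)[1]
    return 0.5 * (logdet(C[:k, :k]) + logdet(C[k:, k:]) - logdet(C))

# Two unit-variance variables with correlation r: I = -0.5 * log(1 - r^2).
r = 0.8
C = np.array([[1.0, r], [r, 1.0]])
print(np.isclose(gaussian_mi(C, 1), -0.5 * np.log(1 - r**2)))  # True
```

Scanning such bipartition MI values against the size of the boundary between the two blocks is what distinguishes, e.g., boundary-law from volume-law correlation scaling.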


2012 ◽  
Vol 12 (3&4) ◽  
pp. 346-354
Author(s):  
Joseph M. Landsburg ◽  
Yang Qi ◽  
Ke Ye

We answer a question of L. Grasedyck that arose in quantum information theory, showing that the limit of tensors in a space of tensor network states need not be a tensor network state. We also give geometric descriptions of spaces of tensor network states corresponding to trees and loops. Grasedyck's question has a surprising connection to the area of Geometric Complexity Theory, in that the result is equivalent to the statement that the boundary of the Mulmuley-Sohoni type variety associated to matrix multiplication is strictly larger than the projections of matrix multiplication (and re-expressions of matrix multiplication and its projections after changes of bases). Tensor network states are also related to graphical models in algebraic statistics.
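The boundary phenomenon here is the tensor-network analogue of border rank. A standard illustration (not taken from the paper) is the W tensor, which lies in the closure of the set of rank-2 tensors but itself has rank 3:

```latex
W = e_1\otimes e_0\otimes e_0 + e_0\otimes e_1\otimes e_0 + e_0\otimes e_0\otimes e_1
  = \lim_{\varepsilon\to 0}\,\frac{1}{\varepsilon}\Big[(e_0+\varepsilon e_1)^{\otimes 3} - e_0^{\otimes 3}\Big].
```

Every tensor inside the limit is a sum of two rank-one terms, yet W cannot be written with fewer than three, so the set of tensors of rank at most 2 is not closed; the paper establishes the analogous non-closedness for spaces of tensor network states.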


2022 ◽  
Vol 12 (1) ◽  
Author(s):  
Boris Ponsioen ◽  
Fakher Assaad ◽  
Philippe Corboz

The excitation ansatz for tensor networks is a powerful tool for simulating the low-lying quasiparticle excitations above ground states of strongly correlated quantum many-body systems. Recently, the two-dimensional tensor network class of infinite projected entangled-pair states gained new ground state optimization methods based on automatic differentiation, which are at the same time highly accurate and simple to implement. Naturally, the question arises whether these new ideas can also be used to optimize the excitation ansatz, which has recently been implemented in two dimensions as well. In this paper, we describe a straightforward way to reimplement the framework for excitations using automatic differentiation, and demonstrate its performance for the Hubbard model at half filling.
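The core idea, gradients of a variational energy obtained by automatic differentiation, can be sketched on a toy problem (a two-level Hamiltonian and plain gradient descent, assumed for illustration; the paper's iPEPS excitation-ansatz machinery is far richer).

```python
import jax
import jax.numpy as jnp

# Assumed toy Hamiltonian; eigenvalues are +/- sqrt(5)/2.
H = jnp.array([[1.0, -0.5], [-0.5, -1.0]])

def energy(t):
    """Variational energy of the (normalized) parameter vector t."""
    psi = t / jnp.linalg.norm(t)
    return psi @ H @ psi

# Automatic differentiation gives the exact gradient of the energy
# functional with respect to the variational parameters for free.
grad_energy = jax.grad(energy)

t = jnp.array([1.0, 0.2])
for _ in range(200):
    t = t - 0.1 * grad_energy(t)     # simple gradient descent

print(float(energy(t)))              # approaches the ground-state energy
```

In the iPEPS setting the same pattern applies, except that `energy` involves an approximate contraction of an infinite 2D network, and automatic differentiation must be threaded through that contraction.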

