Loop Series Expansions for Tensor Networks

Start Date/Time: 2024-10-23 / 9:00 a.m. (Taipei time) = [10-22 / 6:00 p.m. (PDT)]
End Date/Time: 2024-10-23 / 10:00 a.m. (Taipei time)

Online Zoom Link: https://us02web.zoom.us/j/86867205231?pwd=OTJVTURuVU9FVzkzR01kMVUwcGVvZz09
[Registration] is required

Abstract: 

Belief propagation (BP) can be a useful tool to approximately contract a tensor network, provided that the contributions from any closed loops in the network are sufficiently weak. In this manuscript we describe how a loop series expansion can be applied to systematically improve the accuracy of a BP approximation to a tensor network contraction, in principle converging arbitrarily close to the exact result. More generally, our result provides a framework for expanding a tensor network as a sum of component networks in a hierarchy of increasing complexity. We benchmark this proposal for the contraction of iPEPS, representing either the ground state of an AKLT model or states with randomly defined tensors, where it is shown to improve in accuracy over standard BP by several orders of magnitude whilst incurring only a minor increase in computational cost. These results indicate that the proposed series expansions could be a useful tool to accurately evaluate tensor networks in cases that otherwise exceed the limits of established contraction routines.
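
For a concrete picture of why BP misses loop contributions, below is a minimal Python/NumPy sketch; the setup and all names are illustrative assumptions, not the speaker's implementation. It applies BP to the simplest loopy network, a ring of N identical D x D matrices, whose exact contraction is Tr(T^N). The BP message updates here reduce to power iteration, so the BP estimate recovers only the dominant-eigenvalue term lambda_1^N; the remaining terms, the sum over k >= 2 of lambda_k^N, are precisely the loop contribution that a loop series expansion would restore.

import numpy as np

# Minimal sketch (illustrative assumptions, not the authors' code): BP
# contraction of a ring of N identical D x D tensors; exact value is Tr(T^N).
rng = np.random.default_rng(0)
N, D = 12, 4
T = rng.random((D, D))           # positive entries -> unique dominant eigenvalue

# BP messages on a bond of the ring: a forward message (left-eigenvector
# candidate) and a backward message (right-eigenvector candidate); the
# fixed-point iteration is just power iteration on T.
m_f = np.ones(D) / np.sqrt(D)
m_b = np.ones(D) / np.sqrt(D)
for _ in range(1000):
    new_f = m_f @ T              # absorb one tensor into the forward message
    new_b = T @ m_b              # absorb one tensor into the backward message
    new_f /= np.linalg.norm(new_f)
    new_b /= np.linalg.norm(new_b)
    if np.allclose(new_f, m_f) and np.allclose(new_b, m_b):
        m_f, m_b = new_f, new_b
        break
    m_f, m_b = new_f, new_b

# Standard BP estimate: product of vertex terms over product of edge terms,
# Z_BP = prod_v Z_v / prod_e Z_e; every site of the ring contributes one of each.
Z_vertex = m_f @ T @ m_b         # tensor contracted with its incoming messages
Z_edge = m_f @ m_b               # overlap of the two messages on a bond
Z_bp = (Z_vertex / Z_edge) ** N

Z_exact = np.trace(np.linalg.matrix_power(T, N))   # = sum_k lambda_k^N
print(f"BP estimate : {Z_bp:.6e}")
print(f"exact       : {Z_exact:.6e}")
print(f"rel. error  : {abs(Z_bp - Z_exact) / Z_exact:.2e}")
# BP returns lambda_1^N; the missing sum over subleading eigenvalues is the
# loop contribution that the series expansion reinstates term by term.

In this toy case the BP error is exactly the subleading-eigenvalue sum, so it shrinks as the ring grows or the spectral gap widens; the expansion described in the abstract organizes such corrections systematically for general networks with many loops.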