Bram Vanhecke, Maarten Van Damme, Jutho Haegeman, Laurens Vanderstraeten, Frank Verstraete
SciPost Phys. Core 4, 004 (2021) · published 19 February 2021
A central primitive in quantum tensor network simulations is the problem of approximating a matrix product state by one of lower bond dimension. This problem is the main bottleneck in algorithms for time evolution and for contracting projected entangled pair states. We formulate a tangent-space based variational algorithm to achieve this for uniform (infinite) matrix product states. The algorithm exhibits a favourable scaling of the computational cost, and we demonstrate its usefulness with several examples involving the multiplication of a matrix product state with a matrix product operator.
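The core primitive the abstract refers to, reducing the bond dimension of a matrix product state, can be illustrated with the simplest (non-variational) approach: a truncated singular value decomposition of a two-site tensor. This is a minimal NumPy sketch of SVD truncation, not the paper's tangent-space algorithm for uniform MPS; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def truncate_bond(theta, chi_max):
    """Split a two-site tensor theta, reshaped to a (chi_L*d, d*chi_R)
    matrix, across its central bond, keeping at most chi_max singular
    values. Returns the left isometry A, the right tensor B, and the
    discarded weight (sum of squared truncated singular values).
    Illustrative SVD truncation, not the paper's variational scheme."""
    U, S, Vh = np.linalg.svd(theta, full_matrices=False)
    chi = min(chi_max, len(S))
    discarded = float(np.sum(S[chi:] ** 2))
    S_kept = S[:chi] / np.linalg.norm(S[:chi])  # renormalise the state
    A = U[:, :chi]                   # orthonormal columns (left canonical)
    B = np.diag(S_kept) @ Vh[:chi, :]
    return A, B, discarded

# Example: truncate a random 8x8 bond matrix down to bond dimension 4.
rng = np.random.default_rng(0)
theta = rng.normal(size=(8, 8))
A, B, w = truncate_bond(theta, 4)
```

A variational algorithm such as the one in the paper optimises the truncated tensors globally rather than bond by bond, which matters for infinite (uniform) MPS where no single orthogonality centre exists.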
Mario Collura, Luca Dell'Anna, Timo Felser, Simone Montangero
SciPost Phys. Core 4, 001 (2021) · published 2 February 2021
In many cases, neural network states can be mapped onto tensor networks with an exponentially large bond dimension. Here, we compare different sub-classes of neural network states with their mapped tensor network counterparts for studying the ground state of short-range Hamiltonians. We show that the tensor network resulting from such a mapping is highly constrained, so that neural network states do not, in general, deliver the naively expected drastic improvement over state-of-the-art tensor network methods. We explicitly show this result in two paradigmatic examples, the 1D ferromagnetic Ising model and the 2D antiferromagnetic Heisenberg model, addressing the lack of a detailed comparison of the expressiveness of these increasingly popular variational ansätze.
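A concrete instance of the neural network states the abstract discusses is the restricted Boltzmann machine (RBM) ansatz, whose amplitudes can be written in closed form after tracing out the hidden units. The following is a minimal sketch of evaluating an RBM wavefunction amplitude; it is a generic RBM, not necessarily one of the specific sub-classes compared in the paper, and all parameter names are illustrative assumptions.

```python
import numpy as np

def rbm_amplitude(s, a, b, W):
    """Unnormalised RBM wavefunction amplitude for a spin
    configuration s in {-1, +1}^N, with visible biases a (length N),
    hidden biases b (length M), and couplings W (shape N x M):
        psi(s) = exp(a . s) * prod_j 2 cosh(b_j + sum_i s_i W_ij)
    Mapping this ansatz to a tensor network generically requires a
    bond dimension exponential in the number of hidden units."""
    theta = b + s @ W            # effective field on each hidden unit
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(theta))

# Example: two visible spins, one hidden unit, all parameters zero,
# so every configuration has the same amplitude 2*cosh(0) = 2.
psi = rbm_amplitude(np.array([1.0, -1.0]),
                    np.zeros(2), np.zeros(1), np.zeros((2, 1)))
```

The comparison in the paper hinges on the fact that the tensor network obtained from such a mapping, although formally of very large bond dimension, is built from highly constrained tensors, so its effective expressiveness need not exceed that of a directly optimised MPS or PEPS.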