SciPost Submission Page
Tensor network representations from the geometry of entangled states
by Matthias Christandl, Angelo Lucia, Péter Vrana, Albert H. Werner
As Contributors: Angelo Lucia · Albert H. Werner
Arxiv Link: https://arxiv.org/abs/1809.08185v3 (pdf)
Date submitted: 2020-09-07 15:24
Submitted by: Werner, Albert H.
Submitted to: SciPost Physics
Subject area: Quantum Physics
Tensor network states provide successful descriptions of strongly correlated quantum systems with applications ranging from condensed matter physics to cosmology. Any family of tensor network states possesses an underlying entanglement structure given by a graph of maximally entangled states along the edges that identify the indices of the tensors to be contracted. Recently, more general tensor networks have been considered, where the maximally entangled states on edges are replaced by multipartite entangled states on plaquettes. Both the structure of the underlying graph and the dimensionality of the entangled states influence the computational cost of contracting these networks. Using the geometrical properties of entangled states, we provide a method to construct tensor network representations with smaller effective bond dimension. We illustrate our method with the resonating valence bond state on the kagome lattice.
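As an illustration of the edge contraction the abstract describes, the following is a minimal sketch (not taken from the paper): two site tensors share a bond index, corresponding to a maximally entangled state on an edge of the underlying graph, and contracting that index produces the joint coefficient tensor. All names and dimensions here are hypothetical choices for the example.

```python
import numpy as np

# Hypothetical example: two site tensors A and B sharing one bond index.
D = 3          # bond dimension of the shared edge (illustrative value)
d = 2          # physical dimension at each site (illustrative value)
rng = np.random.default_rng(0)

A = rng.standard_normal((d, D))   # tensor at site 1: (physical, bond)
B = rng.standard_normal((D, d))   # tensor at site 2: (bond, physical)

# Contracting the shared bond index yields the two-site coefficient tensor.
psi = np.einsum('ia,aj->ij', A, B)

# The contraction cost grows with the bond dimension D, which is why
# representations with smaller effective bond dimension are cheaper to contract.
assert psi.shape == (d, d)
```

The cost of the `einsum` contraction above scales with the bond dimension, which is the quantity the paper's construction aims to reduce.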
For Journal SciPost Physics: Publish
(status: Editorial decision fixed and (if required) accepted by authors)
Author comments upon resubmission
Based on the referee's feedback, we have uploaded a new version to the arXiv, which we hereby resubmit to SciPost.
A detailed list of changes is described below.
List of changes
Requested changes Report 1
1. A few small typos: pg 10, line 2 "representable with by" -> "representable by"; pg 14, last paragraph of section 4 "back to RVB state" -> "back to the RVB state"; pg 19, below the 3rd displayed equation, a comma should be omitted and "logs" -> "log"; pg 25, line 2 of section 6 "provide" -> "provides".
Thank you very much for spotting these, we have taken all of them into account.
2. The discussion in section 5.2 was hard to understand (at least to this reviewer). Perhaps the structure of the argument can be made a bit more clear?
We have rewritten part of this section, adding more explanation of the method, in particular at the beginning of the section, and hope that the argument is now clearer.
3. Something I wondered while reading is whether the construction in Theorem 14 is optimal in any sense, or whether for combining multiple plaquettes there could be better degenerations than the tensor product degeneration of single-plaquette degenerations. If this question makes sense, perhaps the authors could make a remark on the possibilities for this.
This is a very good observation, and indeed better conversions could possibly be obtained on larger patches. We have included a comment on this after the proof of Theorem 14 on page 13.
Submission & Refereeing History
Reports on this Submission
Anonymous Report 1 on 2020-09-10 (Invited Report)
Thanks, that is all good!