SciPost Submission Page
Time-evolution of local information: thermalization dynamics of local observables
by Thomas Klein Kvorning, Loïc Herviou, Jens H. Bardarson
This is not the latest submitted version.
As Contributors: Jens H Bardarson · Thomas Klein Kvorning
Arxiv Link: https://arxiv.org/abs/2105.11206v2 (pdf)
Date submitted: 2022-06-15 15:29
Submitted by: Klein Kvorning, Thomas
Submitted to: SciPost Physics
Quantum many-body dynamics generically results in increasing entanglement that eventually leads to thermalization of local observables. This makes the exact description of the dynamics complex despite the apparent simplicity of (high-temperature) thermal states. For accurate but approximate simulations one needs a way to keep track of essential (quantum) information while discarding the inessential. To this end, we first introduce the concept of the information lattice, which supplements the physical spatial lattice with an additional dimension and in which a local Hamiltonian gives rise to a well-defined, locally conserved von Neumann information current. This provides a convenient and insightful way of capturing the flow, through time and space, of information during quantum time evolution, and gives a distinct signature of when local degrees of freedom decouple from long-range entanglement. As an example, we describe such decoupling of local degrees of freedom for the mixed-field transverse Ising model. Building on this, we secondly construct algorithms to time-evolve sets of local density matrices without any reference to a global state. With the notion of information currents, we can motivate algorithms based on the intuition that information, for statistical reasons, flows from small to large scales. Using this guiding principle, we construct an algorithm that, at worst, shows two-digit convergence in time evolutions up to very late times for a diffusion process governed by the mixed-field transverse Ising Hamiltonian. While we focus on dynamics in 1D with nearest-neighbor Hamiltonians, the algorithms do not essentially rely on these assumptions and can in principle be generalized to higher dimensions and more complicated Hamiltonians.
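As a hedged illustration of the kind of bookkeeping the abstract describes: one common way to quantify "information in a subsystem" is I(A) = S_max(A) - S(A), with S the von Neumann entropy, and to assign to the largest scale whatever the whole system knows beyond its parts via inclusion-exclusion. The sketch below applies this on a two-qubit Bell state; the definition of I and the inclusion-exclusion step are our assumptions for illustration, not the paper's full information-lattice construction, which is richer (a whole lattice of subsystem centers and scales, plus the currents between them).

```python
import numpy as np

def von_neumann_entropy_bits(rho):
    """S(rho) = -Tr[rho log2 rho], in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def information(rho, n_qubits):
    """I(A) = S_max - S = n_qubits - S(rho), in bits (assumed definition)."""
    return n_qubits - von_neumann_entropy_bits(rho)

# Bell state (|00> + |11>)/sqrt(2) on two qubits
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi)

# Reduced density matrices of the individual qubits,
# via partial trace on the (a, b, a', b') index structure
rho_full = rho.reshape(2, 2, 2, 2)
rho_A = np.trace(rho_full, axis1=1, axis2=3)  # trace out qubit B
rho_B = np.trace(rho_full, axis1=0, axis2=2)  # trace out qubit A

# Information at the largest scale: what the pair knows
# beyond what the single qubits know (inclusion-exclusion)
i_large = information(rho, 2) - information(rho_A, 1) - information(rho_B, 1)
print(i_large)  # 2.0: for a Bell pair, all information sits at the two-site scale
```

For a maximally entangled pair each single-qubit density matrix is maximally mixed (zero local information), so all 2 bits live at the largest scale; for a product state the same calculation puts all information at the single-site scale. This scale-resolved picture is what distinguishes the information lattice from a single number like the entanglement entropy.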
Author comments upon resubmission
We thank you for organizing the review of our manuscript. We apologize for the delay in resubmitting our updated manuscript and the answers to the referees.
Since both referees raised concerns about the accessibility of our manuscript, we have undertaken a significant rewriting of the article, particularly its first parts. We have substantially shortened the introduction, and instead of presenting background material in a separate section, we now introduce concepts when they are needed. We believe this has considerably improved the paper's readability and hope that the referees agree.
In addition, the first referee objected to publication based on what we think is a misunderstanding of the goal and contents of our paper. Our paper achieves two things: i) it introduces a way of separating quantum information into different scales—in what we call the information lattice—that gives a much more refined picture of the time evolution of quantum information than does, say, the entanglement entropy. Using this, we have analyzed generic quantum dynamics governed by a thermalizing Hamiltonian. And ii) using the insights from i), we suggest a new numerical algorithm that captures thermalizing dynamics by astutely throwing away quantum information that does not affect local observables. We show that this algorithm is competitive with the best available algorithms attempting to solve the same problem.
Given the above, we believe that our work makes significant progress on an open problem in the field of quantum many-body dynamics: how to simulate thermalizing dynamics for long times, given that thermal states contain much less information than typical pure quantum states. An extensive literature has addressed this problem in recent years (see references in our paper), and the referee seems to have missed this point. Instead, the objections to our algorithm are based on fine-tuned examples that obviously cannot be captured by our algorithm, nor by any other algorithm that attempts to simulate thermalizing dynamics to late times. It should be clear that no classical algorithm can capture all of the quantum information contained in a pure state time-evolved for a long time, except for tiny systems that can be treated with exact diagonalization. In any case, we believe the above arguments should clarify that our paper satisfies both acceptance criteria 2 and 3 of SciPost Physics.
The second referee's main objection concerned our choice of model, since it may work better than expected for the matrix-product-state time-dependent variational principle. We have taken this seriously and have produced new data for a different model, directly comparing our results with the recent work arXiv:2004.05177 and obtaining consistent results (see details in the answer to Ref. 2). We think this addresses the concern that our data is somehow fine-tuned. Since the new data will not significantly add to the paper and will in any case be publicly available, we have decided to include it only in the response to the referee, to avoid taking up extra space in an already long article.
With these changes and our responses to the referees, we hope our manuscript can now be accepted for publication in SciPost Physics.
Thomas Klein Kvorning
Jens H Bardarson
List of changes
— Rewrote introduction
— Rewrote section 2
— Smaller changes and typo corrections throughout the manuscript.
— We also made several changes to our notation and nomenclature. The most significant of these is that we removed the phrase "local equilibrium" since, as the second referee points out, our use of it can be confusing.
Submission & Refereeing History
Reports on this Submission
Anonymous Report 2 on 2022-6-20 (Invited Report)
The authors have revised their manuscript, significantly clarifying its presentation. In their response, they have also provided additional data, which addresses the concerns about fine-tuning I had with the original version. Given this, I believe their work provides a significant step forward on the long-standing problem of simulating the dynamics of quantum thermalization and is therefore appropriate for publication in SciPost Physics.
Anonymous Report 1 on 2022-6-18 (Invited Report)
The reply does not address my main concerns.
For example, my first point (1) was that it is unclear why a decay of information flow between different length scales would imply an uncoupling of the corresponding dynamics. If we observe that mutual information for small subsystems equilibrates, why should this imply a decoupling of the local dynamics from that on larger length scales?
The authors' answer reiterates the argument that, if the information (flow) vanishes at a scale L, one could obtain the time derivative of the density matrices at scale L-1 without reference to larger scales. That is exactly the point in question. The information currents are just scalars. Their vanishing does not in itself guarantee an uncoupling of the dynamics (of the density operators). Yes, it would imply that the density-matrix time derivative as determined from the maximum-entropy principle would not depend on larger-scale correlations, but under what constraints is the maximum-entropy principle applicable in this way? The information currents alone do not provide a justification. While the approach may work under certain constraints, such an essential aspect of the proposal requires more detailed reasoning and a discussion of the required conditions.
The authors state that the two simple counterexamples given in my point (2) would not be covered by the method. One can easily come up with further such scenarios. What is missing is a criterion that tells us when the results of the proposed method are trustworthy. When truncating the hierarchy at distance L, what kind of control do we have over the error introduced by that truncation? How does one decide whether, say, a certain quantum quench falls into the considered class of "typical time-evolutions"? One needs a corresponding framework to make the approach predictive.
Incidentally, "quantum revivals" are not limited to finite systems. And no, I do not suggest abandoning the second law of thermodynamics. On the other hand, the second law certainly does not mean that entropy strictly increases under all circumstances, nor does it provide a derivation of the suggested method.
My point (4) on the N-representability problem is dismissed in the reply. Sure, if the local density operators correspond to a global Gibbs state (with a small deviation), then there is no question. But generally, we do not know when/if that condition is met. So, generally, the method will lead to non-representable local density matrices (nonphysical states), especially as we have no criterion on the effect of the truncation.
In comment (5), I pointed out that observing that the L=6,7,8 results get gradually closer to the L=9 results does not imply that the dynamics is converged or even quasi-exact. Anything else would surely be troubling, but nothing assures us that the L=9 results are precise. This is quite different from MPS simulations, where the truncation error gives rigorous bounds on approximation errors, and the dynamics becomes exact for (very) large bond dimensions. To assess the accuracy, it seems imperative to compare against alternative quasi-exact methods.
There is still a fair number of grammatical mistakes, missing commas, etc.