SciPost Submission Page
Learning tensor networks with tensor cross interpolation: new algorithms and libraries
by Yuriel Núñez Fernández, Marc K. Ritter, Matthieu Jeannin, Jheng-Wei Li, Thomas Kloss, Thibaud Louvet, Satoshi Terasaki, Olivier Parcollet, Jan von Delft, Hiroshi Shinaoka, Xavier Waintal
Submission summary
Authors (as registered SciPost users): Yuriel Núñez Fernández

Submission information
Preprint Link: https://arxiv.org/abs/2407.02454v2 (pdf)
Code repository: http://tensor4all.org
Date submitted: 2024-07-23 15:18
Submitted by: Núñez Fernández, Yuriel
Submitted to: SciPost Physics

Ontological classification
Academic field: Physics
Specialties:
Approach: Computational
Abstract
The tensor cross interpolation (TCI) algorithm is a rank-revealing algorithm for decomposing low-rank, high-dimensional tensors into tensor trains/matrix product states (MPS). TCI learns a compact MPS representation of the entire object from a tiny training data set. Once this representation is obtained, the large existing MPS toolbox provides exponentially fast algorithms for a wide range of operations. We discuss several improvements and variants of TCI. In particular, we show that replacing the cross interpolation by the partially rank-revealing LU decomposition yields a more stable and more flexible algorithm than the original one. We also present two open source libraries, xfac in Python/C++ and TensorCrossInterpolation.jl in Julia, that implement these improved algorithms, and illustrate them on several applications. These include sign-problem-free integration in large dimension, the superhigh-resolution quantics representation of functions, the solution of partial differential equations, the superfast Fourier transform, the computation of partition functions, and the construction of matrix product operators.
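To make the construction above concrete, here is a minimal, self-contained sketch of matrix cross interpolation, the two-dimensional building block that TCI extends to tensor trains. It uses a brute-force greedy pivot search in plain NumPy; the function cross_interpolate and its parameters are illustrative assumptions and do not reproduce the APIs of xfac or TensorCrossInterpolation.jl.

```python
# Minimal sketch of 2D cross interpolation, assuming a generic NumPy setting.
# The function name and parameters are illustrative; this is NOT the xfac or
# TensorCrossInterpolation.jl API.
import numpy as np

def cross_interpolate(A, tol=1e-8, max_rank=50):
    """Greedily build a cross approximation A ≈ A[:, J] @ inv(A[I, J]) @ A[I, :]."""
    I, J = [], []
    approx = np.zeros_like(A, dtype=float)
    for _ in range(max_rank):
        # New pivot: the entry where the current approximation is worst.
        err = np.abs(A - approx)
        i, j = np.unravel_index(np.argmax(err), A.shape)
        if err[i, j] < tol:
            break  # requested accuracy reached
        I.append(i)
        J.append(j)
        # Rebuild the cross (CUR-like) approximation from the enlarged pivot set.
        P = A[np.ix_(I, J)]                          # pivot matrix
        approx = A[:, J] @ np.linalg.solve(P, A[I, :])
    return I, J, approx

# Usage: a numerically low-rank matrix is recovered from a few rows and columns.
x = np.linspace(0.0, 1.0, 200)
A = np.exp(-np.subtract.outer(x, x) ** 2)            # smooth kernel, low numerical rank
I, J, approx = cross_interpolate(A)
print(len(I), np.max(np.abs(A - approx)))
```

A full search over all entries, as above, is only practical for small matrices; practical TCI implementations rely on cheaper pivot-search strategies.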
Author indications on fulfilling journal expectations
- Provide a novel and synergetic link between different research areas.
- Open a new pathway in an existing or a new research direction, with clear potential for multi-pronged follow-up work
- Detail a groundbreaking theoretical/experimental/computational discovery
- Present a breakthrough on a previously-identified and long-standing research stumbling block
Current status:
Reports on this Submission
Strengths
1. Detailed and thorough introduction to tensor cross interpolation and its applications.
2. Conceptually new partial rank-revealing LU decomposition and explanation of its relation to the Schur complement (see the sketch after this list).
3. Examples with source code.
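Regarding strength 2, the sketch below illustrates, in generic NumPy rather than the prrLU routines shipped with xfac or TensorCrossInterpolation.jl, how stopping a fully pivoted LU early yields a partial rank-revealing factorization: each elimination step forms the Schur complement with respect to the chosen pivot, and the largest entry left in that complement controls the error. Function and parameter names are assumptions made for illustration.

```python
# Illustrative sketch of a partial rank-revealing LU via fully pivoted elimination.
# Generic NumPy only; not the prrLU implementation of the xfac library.
import numpy as np

def partial_rrlu_pivots(A, tol=1e-10, max_rank=50):
    """Return pivot rows/columns of a fully pivoted LU, stopped once the
    largest entry of the remaining Schur complement falls below `tol`."""
    S = np.array(A, dtype=float)          # working copy = running Schur complement
    rows, cols = [], []
    for _ in range(min(*S.shape, max_rank)):
        i, j = np.unravel_index(np.argmax(np.abs(S)), S.shape)  # full pivoting
        if abs(S[i, j]) < tol:
            break                         # remaining Schur complement is negligible
        rows.append(i)
        cols.append(j)
        # One elimination step = Schur complement w.r.t. pivot (i, j);
        # the pivot row and column are annihilated (up to round-off).
        S = S - np.outer(S[:, j], S[i, :]) / S[i, j]
    return rows, cols

# The selected rows and columns define a cross approximation of A:
#   A ≈ A[:, cols] @ inv(A[np.ix_(rows, cols)]) @ A[rows, :]
```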
Weaknesses
1. Some sections, like 4.5.1 CI-canonicalization, are very technical and might be difficult to parse upon first reading.
Report
The paper provides a mathematical introduction to and exposition of tensor cross interpolation (building upon previous works of the authors) and refines TCI by introducing the partial rank-revealing LU decomposition. The usefulness and applicability of the method are exemplified by various examples, including source code. A particular highlight is the superfast Fourier transformation. The paper is interesting and relevant for scientific computing in general, since, e.g., high-dimensional data interpolation appears in various contexts. The paper is well-written, although somewhat technical in places. I have listed some minor remarks and suggestions below. In summary, I deem that the article meets the journal's acceptance criteria and recommend publication.
Requested changes
Minor remarks and suggestions:
1. In section "3.2.4 Relation with self-energy": I found the statement "Eq. (24) can be proven by the trivial identity..." quite hard to follow; isn't the identification with Eq. (15) sufficient, or could you provide one more step of the derivation?
2. You could illustrate the nesting conditions in section 4.2 with an example.
3. Before Eq. (33a): the expression "a zero-dimensional slice" might be unclear.
4. After Eq. (43): briefly explain why the fermionic algebra leads to only two non-zero elements.
5. On page 21 there is a typo: "the pivots fully..." -> "the pivots are fully..."
6. CI-canonicalization: can you motivate why converting an MPS to TCI form is advantageous or worthwhile? (Since TCI is a kind of MPS, I was naively expecting that an MPS representation is already sufficient.)
7. In Table 1, the constant prefactors of 3 can probably be omitted from the asymptotic complexities - or is there a specific reason to keep them in the table?
8. In Fig. 5, I'd suggest using different color schemes for the beta and L parameters to avoid confusion (e.g., using the same color but different brightness levels to distinguish the three beta values).
Recommendation
Publish (easily meets expectations and criteria for this Journal; among top 50%)