SciPost Submission Page

Fast counting with tensor networks

by Stefanos Kourtis, Claudio Chamon, Eduardo R. Mucciolo, Andrei E. Ruckenstein

Submission summary

As Contributors: Stefanos Kourtis
Preprint link: scipost_201908_00005v3
Date accepted: 2019-10-31
Date submitted: 2019-10-23 02:00
Submitted by: Kourtis, Stefanos
Submitted to: SciPost Physics
Academic field: Physics
Specialties:
  • Artificial Intelligence
  • Computational Complexity
  • Data Structures and Algorithms
  • Condensed Matter Physics - Computational
  • Quantum Physics
  • Statistical and Soft Matter Physics
Approaches: Theoretical, Computational


We introduce tensor network contraction algorithms for counting satisfying assignments of constraint satisfaction problems (#CSPs). We represent an arbitrary #CSP formula as a tensor network, whose full contraction yields the number of satisfying assignments of that formula, and use graph-theoretic methods to determine favorable orders of contraction. We employ our heuristics for the solution of #P-hard counting Boolean satisfiability (#SAT) problems, namely monotone #1-in-3SAT and #Cubic-Vertex-Cover, and find that they outperform state-of-the-art solvers by a significant margin.
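As a minimal illustration of the counting-by-contraction idea (a sketch only, not the authors' implementation), each clause of a monotone 1-in-3SAT formula can be encoded as a rank-3 tensor whose entry is 1 exactly when one of its three variables is true, with shared variables represented as shared indices; summing over all indices then counts the satisfying assignments:

```python
import numpy as np

# Clause tensor for monotone 1-in-3SAT: entry is 1 iff exactly one
# of the three incident variables is assigned 1 (True).
clause = np.zeros((2, 2, 2))
clause[1, 0, 0] = clause[0, 1, 0] = clause[0, 0, 1] = 1.0

# Toy formula with two clauses sharing variable c: (a, b, c) and (c, d, e).
# Full contraction (summing over every variable index) yields the
# number of satisfying assignments.
count = np.einsum('abc,cde->', clause, clause)
print(int(count))  # -> 5
```

Here `np.einsum` fixes the contraction order internally; the point of the paper's graph-theoretic heuristics is precisely that on large, sparse instances the order in which shared indices are contracted dominates the cost.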

Ontology / Topics


Tensor networks

Published as SciPost Phys. 7, 060 (2019)

Author comments upon resubmission

Dear Editor,

We thank you for considering our manuscript. We also thank the referees for their feedback on our work.

The second referee asked us to comment on the reasons for the superiority of our tensor network methods in solving the constraint satisfaction problems (CSPs) studied in this work, and on whether this superiority is expected to extend to a broader class of problems. Our response is that, at least in part, our tensor network methods benefit from the sparsity (i.e., the small number of edges per vertex) of the networks that represent the instances of a CSP, which corresponds to instances with a lower clause-to-variable ratio α. We therefore expect our methods to be advantageous for problems that are hardest when the corresponding networks are sparse. This situation is not uncommon: for example, it encompasses all monotone #1-in-kSAT problems for k ≥ 3. Even the decision counterparts of these problems close to their satisfiability threshold (α ≈ 2/3 for monotone 1-in-3SAT) are in fact much harder to solve than 3SAT close to its much higher threshold (α ≈ 4.27), owing to the phenomenon of shattering discussed in Ref. [62] of the manuscript. This illustrates that our methods are relevant to a broad class of hard problems, in a regime that is difficult to access with any other technique.

We have added a paragraph to the discussion section (third paragraph of Sec. 4 of the attached revised manuscript) to address these points, and we hope that with this addition our paper is now in publishable shape.

The authors

List of changes

Added a paragraph to the discussion section that (i) addresses how graph sparsity affects the tensor network techniques introduced, and (ii) comments on the breadth of the class of problems accessible to these techniques.

Submission & Refereeing History


Resubmission scipost_201908_00005v3 on 23 October 2019
