
SciPost Submission Page

A supervised learning algorithm for interacting topological insulators based on local curvature

by Paolo Molignini, Antonio Zegarra, Evert van Nieuwenburg, R. Chitra, and Wei Chen

Submission summary

Authors (as registered SciPost users): Paolo Molignini · Everard van Nieuwenburg
Submission information
Preprint Link: scipost_202105_00007v2  (pdf)
Code repository: https://gitlab.com/paolo.molignini/interacting-topological-insulators-ml
Date accepted: 2021-09-08
Date submitted: 2021-07-05 14:42
Submitted by: Molignini, Paolo
Submitted to: SciPost Physics
Ontological classification
Academic field: Physics
Specialties:
  • Condensed Matter Physics - Theory
  • Condensed Matter Physics - Computational
Approaches: Theoretical, Computational

Abstract

Topological order in solid state systems is often calculated from the integration of an appropriate curvature function over the entire Brillouin zone. At topological phase transitions where the single particle spectral gap closes, the curvature function diverges and changes sign at certain high symmetry points in the Brillouin zone. These generic properties suggest the introduction of a supervised machine learning scheme that uses only the curvature function at the high symmetry points as input data. We apply this scheme to a variety of interacting topological insulators in different dimensions and symmetry classes. We demonstrate that an artificial neural network trained with the noninteracting data can accurately predict all topological phases in the interacting cases with very little numerical effort. Intriguingly, the method uncovers a ubiquitous interaction-induced topological quantum multicriticality in the examples studied.

Author comments upon resubmission

We kindly thank the Referees for their useful comments on our manuscript and their recommendation for publication. In the following, we address all of their comments explicitly.

First Referee

We thank the Referee for their kind words and appreciation of our work. We would also like to address their comment regarding the location of the gap closures. The Referee correctly points out that, in a generic setting, the gap closures might not be located at the high-symmetry points. This could happen in systems where inversion symmetry is explicitly broken, for instance in the graphene-like example suggested by the Referee, or even in a simpler system such as a square lattice with a sublattice potential. The gap closures are then no longer pinned at the high-symmetry points, but instead become "mobile", wandering away from them as the strength of the inversion-symmetry-breaking perturbation is increased.

As it stands, our machine learning scheme has been trained and tested only on systems where the gap closures occur at high-symmetry points. However, there are two main strategies by which the algorithm could be adapted to the more general case of gap closures occurring away from the high-symmetry points.

The first, naive solution is to simply map out the gap closures as a function of the tuning parameter and feed their shifted positions to the machine learning scheme. A straightforward implementation would have to incorporate a numerical search for the gap-closing point at every value of the tuning parameter, which could be cumbersome, and the neural network would have to be retrained on this new shifted data. A minimal sketch of the search step is given below.
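
As an illustration only, here is a minimal Python sketch of that search step. The toy Hamiltonian h(k) = (sin k + δ, 0, m + cos k)·σ and all function names are our own illustrative assumptions, not the paper's models or code; the inversion-symmetry-breaking term δ unpins the gap closure from the high-symmetry point k = 0.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def gap(k, m, delta):
    """Single-particle gap 2|d(k)| of the toy Hamiltonian
    h(k) = (sin k + delta, 0, m + cos k) . sigma."""
    dx, dz = np.sin(k) + delta, m + np.cos(k)
    return 2.0 * np.hypot(dx, dz)

def minimum_gap_momentum(m, delta):
    """Bounded search over the Brillouin zone for the minimum-gap momentum;
    this is the step that would have to be repeated at every value of the
    tuning parameter."""
    res = minimize_scalar(lambda k: gap(k, m, delta),
                          bounds=(-np.pi, np.pi), method='bounded')
    return res.x

# At the critical point m = -1, the gap closure drifts away from the
# high-symmetry point k = 0 as delta grows (here k* = -arctan(delta)).
for delta in (0.0, 0.2, 0.4):
    print(f"delta = {delta:.1f}: minimum gap at k = "
          f"{minimum_gap_momentum(-1.0, delta):+.3f}")
```

The curvature function sampled around the shifted momentum would then replace the fixed high-symmetry-point values as the network's input data.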

An alternative approach would be to rely on the conservation of the topological invariant within each phase. Suppose the gap closure is shifted away from the high-symmetry point and no additional gap closure occurs between the shifted position and the high-symmetry point; this is generically true for the types of two-band Dirac systems analyzed in our work. Then, approaching a topological phase transition, the curvature function will diverge at the shifted point where the gap closes. At the same time, the area under the curvature function, which represents the topological invariant, must remain conserved because of the discrete nature of the invariant. The divergence at the gap closure must therefore suck out the weight of the curvature function from its neighborhood to preserve the area under the curve: in other words, all points in its vicinity will experience a continuous, monotonic change in the value of the curvature function to compensate for the divergence. This change has a direct correspondence with the value of the topological invariant, and can therefore also be taken as input data to characterize the topology. In particular, these points can be chosen to be high-symmetry points.

We remark that this scenario does indeed appear for topological phase transitions associated with frozen dynamics in periodically driven systems [see for instance Phys. Rev. B 102, 235143 (2020) and Phys. Rev. B 98, 125129 (2018)]. In that context, the gap closures occur at non-high-symmetry points, but by mapping out the behavior of the curvature function at the high-symmetry points one can still determine the correct phase diagram. Because the divergent behavior is consistent across all topological phase transitions belonging to the same universality class, the high-symmetry points can still be fed to the machine learning scheme as input data. The neural network should automatically learn to associate the change in value at the high-symmetry points with the given topological phase, even though the gap closures occur elsewhere; a minimal sketch of this idea follows.
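
To make this concrete, the following hypothetical Python sketch applies the idea to a toy two-band model, h(k) = (sin k, 0, m + cos k)·σ, which is our own stand-in rather than one of the models in the paper. Its winding-number density plays the role of the curvature function, and only its values at the two high-symmetry points enter as input features for a small supervised classifier trained away from the transitions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def curvature(k, m):
    """Winding-number density F(k) = (d_z d_x' - d_x d_z') / (2*pi*|d|^2)
    of the toy model h(k) = (sin k, 0, m + cos k) . sigma; its integral
    over the Brillouin zone is the invariant (1 for |m| < 1, else 0)."""
    dx, dz = np.sin(k), m + np.cos(k)
    dxp, dzp = np.cos(k), -np.sin(k)          # d d(k)/dk
    return (dz * dxp - dx * dzp) / (2.0 * np.pi * (dx**2 + dz**2))

HSPS = [0.0, np.pi]                           # high-symmetry points

def features(m):
    # only the curvature at the HSPs is used, never the shifted gap closure
    return [curvature(k, m) for k in HSPS]

# Supervised training on a parameter sweep with known invariants,
# keeping away from the transitions at m = +/- 1.
ms = np.linspace(-2.5, 2.5, 501)
ms = ms[np.abs(np.abs(ms) - 1.0) > 0.05]
X = np.array([features(m) for m in ms])
y = (np.abs(ms) < 1.0).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
clf.fit(X, y)
print(clf.predict([features(0.5)])[0])   # expect 1 (topological)
print(clf.predict([features(1.5)])[0])   # expect 0 (trivial)
```

In this toy model F(0) = 1/(2π(m+1)) and F(π) = 1/(2π(1−m)), so both features diverge and change sign exactly at the transitions; the classifier thus only ever needs the monotonic behavior at the high-symmetry points, mirroring the argument above.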

Second Referee

We deeply appreciate the Referee's positive reception of our work and thank them for their recommendation to publish the manuscript. We have fixed all the issues raised by the Referee. In particular:

- Following the Referee's suggestions, we have reformulated several sentences in the abstract and introduction to improve the clarity of our message.
- We have reformulated our statement about the "minimal amount of data" required to predict the topology of a given phase. We have also highlighted the simplicity of our training and its generalization rather than the input size.
- We have substantially reworked our summary of the training and testing procedure to improve its clarity (note also the reduction from four points to three).
- As requested by the Referee, we have written out the explicit formulas for the curvature functions. Note that for the Chern insulator, expanding out the formulas would generate expressions that would occupy an entire page.
- We have slightly reworked Figs. 2 and 3 to accommodate the Referee's requests.
- We have also included a link to an open-source repository of the Python code that was used to generate the results of the paper.

List of changes

- Slight changes to the abstract and introduction.
- Added more explicit formulas for the curvature functions.
- Slightly changed the appearance of Figs. 2 and 3 (no change in the data).
- Reformulated the summary of the machine learning procedure. It now consists of three main steps and not four.
- Added link to open-source repository.

Published as SciPost Phys. 11, 073 (2021)
