SciPost Submission Page
A supervised learning algorithm for interacting topological insulators based on local curvature
by Paolo Molignini, Antonio Zegarra, Evert van Nieuwenburg, R. Chitra, and Wei Chen
Submission summary
As Contributors:  Paolo Molignini 
Preprint link:  scipost_202105_00007v2 
Code repository:  https://gitlab.com/paolo.molignini/interactingtopologicalinsulatorsml 
Date accepted:  2021-09-08 
Date submitted:  2021-07-05 14:42 
Submitted by:  Molignini, Paolo 
Submitted to:  SciPost Physics 
Academic field:  Physics 
Approaches:  Theoretical, Computational 
Abstract
Topological order in solid state systems is often calculated from the integration of an appropriate curvature function over the entire Brillouin zone. At topological phase transitions, where the single-particle spectral gap closes, the curvature function diverges and changes sign at certain high-symmetry points in the Brillouin zone. These generic properties suggest the introduction of a supervised machine learning scheme that uses only the curvature function at the high-symmetry points as input data. We apply this scheme to a variety of interacting topological insulators in different dimensions and symmetry classes. We demonstrate that an artificial neural network trained with the non-interacting data can accurately predict all topological phases in the interacting cases with very little numerical effort. Intriguingly, the method uncovers a ubiquitous interaction-induced topological quantum multicriticality in the examples studied.
Current status:
Editorial decision:
For Journal SciPost Physics: Publish
(status: Editorial decision fixed and (if required) accepted by authors)
Author comments upon resubmission
We kindly thank the Referees for their useful comments on our manuscript and their recommendation for publication. In the following, we address all of their comments explicitly.
First Referee
We thank the Referee for their kind words and appreciation of our work. We would also like to address their comment regarding the location of the gap closures. The Referee correctly points out that, in a generic setting, the gap closures might not be located at the high-symmetry points. This could happen in systems where inversion symmetry is explicitly broken, for instance in the graphene-like example suggested by the Referee, or even in a simpler system such as a square lattice with a sublattice potential. The gap closures are then possibly no longer pinned at the high-symmetry points, but instead become "mobile", wandering away from the high-symmetry points as the strength of the inversion-symmetry-breaking perturbation is increased.
As it stands, our machine learning scheme has been trained and tested only on systems where the gap closures occur at high-symmetry points. However, there are two main strategies that one could adopt to adapt the algorithm to the more general case of gap closures occurring away from the high-symmetry points.
The first, naive solution is to simply map out the gap closures as a function of the tuning parameter and use their shifted positions as the data fed to the machine learning scheme. A straightforward implementation would incorporate a numerical search for the gap-closure point, which could be computationally cumbersome. The neural network would then have to be retrained with this new, shifted data.
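The brute-force part of such a search can be sketched in a few lines. The two-band toy model below (with an inversion-breaking constant term), the parameter names, and the grid resolution are all illustrative assumptions for this sketch and are not taken from the manuscript:

```python
import numpy as np

# Hypothetical 1D two-band model H(k) = d(k) . sigma used only for
# illustration.  The parameter b breaks inversion symmetry: for b = 0 the
# gap closes at the high-symmetry point k = pi when m = 1, while b != 0
# moves the gap minimum away from the high-symmetry points.
def gap(k, m, b):
    dx = m + np.cos(k)
    dy = np.sin(k) + b  # the constant term spoils dy(-k) = -dy(k)
    return 2.0 * np.sqrt(dx**2 + dy**2)

def find_gap_minimum(m, b, n_k=2001):
    """Brute-force scan of the Brillouin zone for the smallest gap."""
    ks = np.linspace(-np.pi, np.pi, n_k)
    gaps = gap(ks, m, b)
    i = np.argmin(gaps)
    return ks[i], gaps[i]

# Inversion-symmetric case: the gap closes exactly at a high-symmetry point.
k_min, g_min = find_gap_minimum(m=1.0, b=0.0)

# Broken inversion symmetry: the gap minimum wanders off k = 0, +/-pi and
# would have to be tracked numerically as the parameters are tuned.
k_min_b, g_min_b = find_gap_minimum(m=1.0, b=0.3)
```

In higher dimensions the same scan runs over a $d$-dimensional momentum grid, which is where the numerical cost mentioned above becomes significant.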
An alternative approach would be to rely on the conservation of the topological invariant within each phase. Let us assume that the gap closure is shifted away from the high-symmetry point and that no additional gap closure occurs between this shifted position and the high-symmetry point. This is generically true for the types of two-band Dirac systems analyzed in our work. Then, approaching a topological phase transition, the curvature function will diverge at this shifted point where the gap closure occurs. At the same time, the area under the curvature function, which represents the topological invariant, must remain conserved because of the discrete nature of the topological invariant. The divergence at the gap closure must therefore draw the weight of the curvature function away from its neighborhood to preserve the area under the curve: in other words, all points in its vicinity will experience a continuous, monotonic change in the value of the curvature function to compensate for the divergence. This change has a direct correspondence with the value of the topological invariant, and can therefore also be taken as the input data to characterize the topology. In particular, these points can be taken to be the high-symmetry points. We remark that this scenario does indeed appear for topological phase transitions associated with frozen dynamics in periodically driven systems [see for instance Phys. Rev. B 102, 235143 (2020) and Phys. Rev. B 98, 125129 (2018)]. In this context, the gap closures occur at non-high-symmetry points, but by mapping out the behavior of the curvature function at the high-symmetry points one can still determine the correct phase diagram. Because the divergent behavior is consistent across all topological phase transitions of the same universality class, the high-symmetry points can then still be fed to the machine learning scheme as input data.
The neural network should automatically learn to associate the change in value at the high-symmetry points with the given topological phase, even though the gap closures occur elsewhere.
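As a toy illustration of this strategy, one can evaluate the curvature function of a simple non-interacting two-band model only at the high-symmetry points and train a minimal supervised classifier on those two numbers, with labels obtained from the area under the curvature function. The SSH-like model, the logistic-regression "network", and all parameter choices below are assumptions made for this sketch; they stand in for the actual networks and interacting models of the paper:

```python
import numpy as np

# Illustrative two-band model H(k) = d(k) . sigma with d = (m + cos k, sin k).
# Curvature function F(k) = (dx dy' - dy dx') / (dx^2 + dy^2); its area
# under the curve, (1/2pi) * integral F dk, is the quantized winding number.
def curvature(k, m):
    dx, dy = m + np.cos(k), np.sin(k)
    ddx, ddy = -np.sin(k), np.cos(k)
    return (dx * ddy - dy * ddx) / (dx**2 + dy**2)

def winding(m, n_k=4096):
    """Topological invariant from the area under the curvature function."""
    ks = np.linspace(-np.pi, np.pi, n_k, endpoint=False)
    return np.mean(curvature(ks, m))  # periodic trapezoidal rule

HSP = np.array([0.0, np.pi])  # high-symmetry points of the 1D Brillouin zone

def features(m):
    """Input data for the classifier: F evaluated only at the HSPs."""
    return curvature(HSP, m)

# Training set from the non-interacting model, labelled by the invariant.
ms = np.concatenate([np.linspace(-2.5, -1.1, 50),
                     np.linspace(-0.9, 0.9, 50),
                     np.linspace(1.1, 2.5, 50)])
X = np.array([features(m) for m in ms])
y = np.array([round(winding(m)) for m in ms], dtype=float)

# Minimal logistic-regression "network" trained by gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid output
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * np.mean(p - y)

def predict(m):
    """Predicted phase (1 = topological, 0 = trivial) from HSP data only."""
    return int(1.0 / (1.0 + np.exp(-(features(m) @ w + b))) > 0.5)
```

Even though only two numbers per parameter value enter the classifier, they track the sign changes of the curvature function at the high-symmetry points and hence the topological phase, which is the essence of the scheme described above.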
Second Referee
We deeply appreciate the Referee’s positive reception of our work and thank them for their recommendation to publish the manuscript. We have fixed all the issues raised by the Referee. In particular:
- Following the Referee’s suggestions, we have reformulated several sentences in the abstract and introduction to improve the clarity of our message.
- We have reformulated our statement about the “minimal amount of data” required to predict the topology of a given phase. We have also highlighted the simplicity of our training and its generalization rather than the input size.
- We have substantially reworked our summary of the training and testing procedure to improve its clarity (note also the reduction from 4 to 3 points).
- As requested by the Referee, we have written out the explicit formulas for the curvature functions. Note that for the Chern insulator, fully expanding the formulas would generate expressions occupying an entire page.
- We have slightly reworked Figs. 2 and 3 to accommodate the Referee’s requests.
- We have also included a link to an open-source repository of the Python code used to generate the results of the paper.
List of changes
- Slight changes to the abstract and introduction.
- Added more explicit formulas for the curvature functions.
- Slightly changed the appearance of Figs. 2 and 3 (no change in the data).
- Reformulated the summary of the machine learning procedure: it now consists of three main steps instead of four.
- Added a link to the open-source repository.