SciPost Submission Page

A general learning scheme for classical and quantum Ising machines

by Ludwig Schmid, Enrico Zardini, Davide Pastorello

This is not the latest submitted version. This Submission thread has since been published.

Submission summary

Authors (as registered SciPost users): Ludwig Schmid
Submission information
Preprint Link: scipost_202311_00042v1  (pdf)
Code repository: https://github.com/lsschmid/ising-learning-model
Data repository: https://doi.org/10.5281/zenodo.10031307
Date submitted: 2023-11-27 08:19
Submitted by: Schmid, Ludwig
Submitted to: SciPost Physics
Ontological classification
Academic field: Physics
Specialties:
  • Quantum Physics
Approaches: Theoretical, Computational

Abstract

An Ising machine is any hardware specifically designed for finding the ground state of the Ising model. Relevant examples are coherent Ising machines and quantum annealers. In this paper, we propose a new machine learning model that is based on the Ising structure and can be efficiently trained using gradient descent. We provide a mathematical characterization of the training process, which is based upon optimizing a loss function whose partial derivatives are not explicitly calculated but estimated by the Ising machine itself. Moreover, we present some experimental results on the training and execution of the proposed learning model. These results point out new possibilities offered by Ising machines for different learning tasks. In particular, in the quantum realm, the quantum resources are used for both the execution and the training of the model, providing a promising perspective in quantum machine learning.
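For reference, the optimization problem such machines solve is the minimization of the standard Ising energy (notation here is generic and not necessarily that of the paper),

E(z) = \sum_{i<j} J_{ij} z_i z_j + \sum_i h_i z_i, \qquad z_i \in \{-1, +1\},

over all spin configurations z; a learning model is then obtained by letting the couplings J_{ij} and biases h_i depend on trainable parameters and input data, as summarized in the report below.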

Current status:
Has been resubmitted

Reports on this Submission

Anonymous Report 1 on 2024-1-31 (Invited Report)

  • Cite as: Anonymous, Report on arXiv:scipost_202311_00042v1, delivered 2024-01-30, doi: 10.21468/SciPost.Report.8477

Strengths

1- The paper introduces an innovative machine learning model centered around Ising machines, showcasing a new perspective in the field.

2- The paper is well written and structured, effectively conveying complex concepts in a clear and comprehensible manner.

3- The proposed model demonstrates versatility by addressing tasks such as function approximation and binary classification.

Weaknesses

1- The paper acknowledges the lack of a comprehensive performance evaluation, including statistical repetitions and comparisons to alternative models. A more thorough evaluation would strengthen the robustness of the findings.

2- While the paper provides a proof of concept, the deferral of statistical analysis to future work leaves some uncertainty about the model's performance and generalizability.

3- The paper touches on the model's capabilities in tasks such as function approximation and binary classification but does not extensively explore its versatility or limitations in handling more complex scenarios.

4- The paper could benefit from a discussion of the computational cost associated with the proposed model, especially when using quantum annealers, which would give readers a more comprehensive perspective.

Report

The paper, titled "A General Learning Scheme for Classical and Quantum Ising Machines," delves into an innovative machine learning model centered around Ising machines. Drawing inspiration from the training of Boltzmann machines, the authors establish a supervised learning model for Ising machines. This model demonstrates trainability through gradient descent on a mean squared error loss function.

The approach introduces a universal neural framework in which the data are encoded in the spins' biases and the trainable parameters act as couplings (weights) between spins, applicable to both classical and quantum machines. In this model, the traditional backpropagation step for computing partial derivatives is replaced by the Ising machine's computation of the ground-state energy E_0 and the corresponding spin configuration z*.
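Concretely, if the model output is taken to be the ground-state energy returned by the machine (a plausible reading of the scheme sketched above, not necessarily the authors' exact formulation),

E_0(x; \theta) = \min_{z \in \{-1,+1\}^n} \Big[ \sum_{i<j} J_{ij}(\theta)\, z_i z_j + \sum_i h_i(x)\, z_i \Big],

then, for a mean squared error loss L(\theta) = \tfrac{1}{N} \sum_k \big( E_0(x_k; \theta) - y_k \big)^2, the envelope (Danskin) theorem gives the partial derivatives directly from the minimizing configuration: assuming a unique ground state, \partial E_0 / \partial J_{ij} = z^*_i z^*_j. A single run of the Ising machine therefore supplies both the prediction E_0 and the gradient information needed for gradient descent.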

Experimental results obtained with a D-Wave quantum annealer demonstrate the model's capabilities on tasks such as function approximation and binary classification. Both simulated annealing and quantum annealing serve as Ising machines, illustrating successful training on simple datasets.
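A minimal sketch of such a training loop, written here against the open-source simulated-annealing sampler from D-Wave's Ocean tools (the dwave-neal package) as a stand-in Ising machine and using the gradient rule sketched above; all function and variable names are illustrative and are not taken from the authors' repository:

    import numpy as np
    import neal  # dwave-neal: classical simulated-annealing sampler (Ising machine stand-in)

    sampler = neal.SimulatedAnnealingSampler()

    def ising_ground_state(h, J, num_reads=50):
        """Query the Ising machine: return the lowest energy found (E_0) and its spin configuration (z*)."""
        sampleset = sampler.sample_ising(h, J, num_reads=num_reads)
        best = sampleset.first                       # lowest-energy sample in the set
        return best.energy, best.sample

    def train(xs, ys, n_spins, lr=0.01, epochs=100):
        """Gradient descent on the MSE between ground-state energies and target values."""
        rng = np.random.default_rng(0)
        J = {(i, j): rng.normal(scale=0.1)
             for i in range(n_spins) for j in range(i + 1, n_spins)}
        for _ in range(epochs):
            for x, y in zip(xs, ys):
                h = {i: float(x[i]) for i in range(n_spins)}   # data enters through the biases
                e0, z = ising_ground_state(h, J)               # prediction E_0 and configuration z*
                err = e0 - y
                for (i, j) in J:                               # dE_0/dJ_ij = z*_i * z*_j
                    J[(i, j)] -= lr * 2.0 * err * z[i] * z[j]
        return J

On D-Wave hardware, the classical sampler above could be swapped for EmbeddingComposite(DWaveSampler()) from the dwave-system package, which exposes the same sample_ising interface.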

The paper concludes by exploring theoretical questions about the Ising model's expressibility and proposing practical avenues for enhancing the model's training using diverse machine learning tools.

While the paper provides a proof of concept for the model's capabilities, a comprehensive performance evaluation involving statistical repetitions and comparisons to alternative models is deferred for future work.

In summary, the paper introduces a new method for parametric learning through Ising machines, particularly quantum annealers, in the realm of general learning tasks.

This approach invites further exploration, as the authors emphasize potential applications and comparisons with other Ising machine-based models.

Requested changes

(Optional) Add a discussion of the computational cost associated with the proposed model, especially when using quantum annealers, to provide a more comprehensive perspective for readers.

  • validity: good
  • significance: good
  • originality: good
  • clarity: high
  • formatting: excellent
  • grammar: excellent
