
SciPost Submission Page

Parameter-Parallel Distributed Variational Quantum Algorithm

by Yun-Fei Niu, Shuo Zhang, Chen Ding, Wan-Su Bao, He-Liang Huang

This Submission thread is now published as SciPost Phys. 14, 132 (2023).

Submission summary

Authors (as registered SciPost users): He-Liang Huang
Submission information
Preprint Link: scipost_202301_00022v1  (pdf)
Date accepted: 2023-03-30
Date submitted: 2023-01-16 13:52
Submitted by: Huang, He-Liang
Submitted to: SciPost Physics
Ontological classification
Academic field: Physics
Specialties:
  • Quantum Algebra
  • Quantum Physics
Approach: Theoretical

Abstract

Variational quantum algorithms (VQAs) have emerged as a promising near-term technique to explore practical quantum advantage on noisy intermediate-scale quantum (NISQ) devices. However, the inefficient parameter training process, caused by the incompatibility with backpropagation and the cost of a large number of measurements, poses a great challenge to the large-scale development of VQAs. Here, we propose a parameter-parallel distributed variational quantum algorithm (PPD-VQA) that accelerates the training process through parameter-parallel training with multiple quantum processors. To maintain the high performance of PPD-VQA in realistic noise scenarios, an alternate training strategy is proposed to alleviate the acceleration attenuation caused by noise differences among the quantum processors, an unavoidable and common problem of distributed VQA. In addition, gradient compression is employed to overcome potential communication bottlenecks. The achieved results suggest that PPD-VQA could provide a practical solution for coordinating multiple quantum processors to handle large-scale real-world applications.
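
To make the parameter-parallel idea in the abstract concrete, the sketch below is a minimal classical illustration, not the authors' implementation: the parameter vector is split across several workers (standing in for quantum processors), each worker estimates the gradient components of its own slice via the parameter-shift rule on a toy cost function, and a central node aggregates the (optionally top-k compressed) partial gradients before the update. The function names (ppd_step, partial_gradient, top_k_compress), the stand-in cost, and the 4-worker/top-4 settings are hypothetical choices for this example; in the actual PPD-VQA each worker would evaluate shifted circuits on its own quantum device.

import numpy as np
from concurrent.futures import ThreadPoolExecutor

def cost(theta):
    # Stand-in for a measured expectation value <H>(theta); each parameter is
    # assumed to enter through a single Pauli rotation, so the pi/2
    # parameter-shift rule below is exact for this toy cost.
    return float(np.sum(np.cos(theta)))

def partial_gradient(theta, indices, shift=np.pi / 2):
    # One worker ("quantum processor") estimates the gradient components
    # for its assigned parameter indices via the parameter-shift rule.
    grad = np.zeros_like(theta)
    for i in indices:
        plus, minus = theta.copy(), theta.copy()
        plus[i] += shift
        minus[i] -= shift
        grad[i] = 0.5 * (cost(plus) - cost(minus))
    return grad

def top_k_compress(grad, k):
    # Keep only the k largest-magnitude components before communication.
    sparse = np.zeros_like(grad)
    keep = np.argsort(np.abs(grad))[-k:]
    sparse[keep] = grad[keep]
    return sparse

def ppd_step(theta, n_workers=4, lr=0.1, k=None):
    # Split the parameter indices across workers, estimate partial gradients
    # in parallel, optionally compress them, then aggregate and update.
    chunks = np.array_split(np.arange(theta.size), n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(lambda c: partial_gradient(theta, c), chunks))
    grad = np.sum(partials, axis=0)
    if k is not None:
        grad = top_k_compress(grad, k)
    return theta - lr * grad

theta = np.random.default_rng(0).uniform(0.0, 2.0 * np.pi, size=8)
for _ in range(100):
    theta = ppd_step(theta, n_workers=4, k=4)
print("final cost:", cost(theta))

Since each worker only touches its own parameter slice, the per-processor measurement cost per step shrinks roughly linearly with the number of workers, which is the acceleration the abstract refers to; the top-k step illustrates (in simplified form) how gradient compression reduces the communication volume between workers and the central node.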

Author comments upon resubmission

We have merged the reply document and the revised manuscript into a single document, because our reply contains figures, which the SciPost submission system does not support in the reply field. All of our responses are therefore in the uploaded file. Thank you!

List of changes

All changes are detailed in the merged reply-and-manuscript document uploaded with this resubmission (see the author comments above).

Published as SciPost Phys. 14, 132 (2023)


Reports on this Submission

Report 2 by Hao Wang on 2023-3-19 (Invited Report)

Report

In this revision, the authors have carefully addressed every point of my previous comments, and with the newly added Appendices A and B, I think this version is very precise and solid.

I checked through the appendices and the modified parts of the main text and did not spot any major issues. Only some minor things:

* I think a right parenthesis is missing in the second line of the centered equation in Appendix A.
* In Appendix B, you might want to enlarge some parentheses/brackets to make them look a bit better.

But the above tiny comments can be easily resolved in your final version.

Again, I'd like to acknowledge that the authors did a good job of providing the upper bound for the bias term of the gradient.

In all, I recommend this manuscript for publication.


Anonymous Report 1 on 2023-2-8 (Invited Report)

Report

The authors have considered all of the referees' comments, questions, and suggestions, and the paper has significantly improved in clarity. It also conveys the message about its importance more effectively. Hence, I now recommend it for publication.




Comments

Anonymous on 2023-01-17  [id 3243]

Category:
correction

Sorry, we forgot to update a figure in yesterday's submission; please read our updated version. Thanks!

Attachment:

Updated_merged_files_reply_and_maintext.pdf