
SciPost Submission Page

Simplified derivations for high-dimensional convex learning problems

by David G. Clark, Haim Sompolinsky

This Submission thread is now published.

Submission summary

Authors (as registered SciPost users): David Clark
Submission information
Preprint Link: https://arxiv.org/abs/2412.01110v5
Date accepted: Sept. 29, 2025
Date submitted: Sept. 20, 2025, 9:54 p.m.
Submitted by: David Clark
Submitted to: SciPost Physics Lecture Notes
Ontological classification
Academic field: Physics
Specialties:
  • Condensed Matter Physics - Theory
  • Statistical and Soft Matter Physics
Approach: Theoretical

Abstract

Statistical-physics calculations in machine learning and theoretical neuroscience often involve lengthy derivations that obscure physical interpretation. Here, we give concise, non-replica derivations of several key results and highlight their underlying similarities. In particular, using a cavity approach, we analyze three high-dimensional learning problems: perceptron classification of points, perceptron classification of manifolds, and kernel ridge regression. These problems share a common structure--a bipartite system of interacting feature and datum variables--enabling a unified analysis. Furthermore, for perceptron-capacity problems, we identify a symmetry that allows derivation of correct capacities through a naive method.
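For readers skimming the abstract, kernel ridge regression, one of the three problems analyzed in the notes, can be stated compactly: given P training pairs (x_mu, y_mu), the predictor is f(x) = sum_mu alpha_mu k(x, x_mu), with dual coefficients alpha = (K + lambda I)^{-1} y, where K is the kernel Gram matrix. The NumPy sketch below is not taken from the submission; the Gaussian kernel choice, the ridge value, the toy data, and the helper names rbf_kernel, krr_fit, and krr_predict are illustrative assumptions.

# Minimal kernel ridge regression with a Gaussian (RBF) kernel, using NumPy only.
# Illustrative sketch; kernel, ridge value, and toy data are assumptions, not the paper's setup.
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Gram matrix K[i, j] = exp(-||A_i - B_j||^2 / (2 * length_scale^2)).
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-sq_dists / (2.0 * length_scale**2))

def krr_fit(X, y, ridge=1e-2, length_scale=1.0):
    # Solve (K + ridge * I) alpha = y for the dual coefficients alpha.
    K = rbf_kernel(X, X, length_scale)
    return np.linalg.solve(K + ridge * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_test, length_scale=1.0):
    # Predictor f(x) = sum_mu alpha_mu k(x, x_mu).
    return rbf_kernel(X_test, X_train, length_scale) @ alpha

# Toy usage: P = 200 points in N = 50 dimensions with a noisy linear target.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50)) / np.sqrt(50)
w = rng.standard_normal(50)
y = X @ w + 0.1 * rng.standard_normal(200)

alpha = krr_fit(X, y)
print("train MSE:", np.mean((krr_predict(X, alpha, X) - y) ** 2))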

Author comments upon resubmission

We sincerely thank the two referees for their thoughtful suggestions, which we have implemented.

List of changes

We have uploaded a detailed response PDF as part of our comments to the referees, in which the changes are enumerated in full.
Current status: Published

Editorial decision: For Journal SciPost Physics Lecture Notes: Publish
(status: Editorial decision fixed and (if required) accepted by authors)
