SciPost Submission Page
Simplified derivations for high-dimensional convex learning problems
by David G. Clark, Haim Sompolinsky
Submission summary
| Submission information | |
|---|---|
| Authors (as registered SciPost users): | David Clark |
| Preprint Link: | https://arxiv.org/abs/2412.01110v5 (pdf) |
| Date accepted: | Sept. 29, 2025 |
| Date submitted: | Sept. 20, 2025, 9:54 p.m. |
| Submitted by: | David Clark |
| Submitted to: | SciPost Physics Lecture Notes |

| Ontological classification | |
|---|---|
| Academic field: | Physics |
| Specialties: | |
| Approach: | Theoretical |
Abstract
Statistical-physics calculations in machine learning and theoretical neuroscience often involve lengthy derivations that obscure physical interpretation. Here, we give concise, non-replica derivations of several key results and highlight their underlying similarities. In particular, using a cavity approach, we analyze three high-dimensional learning problems: perceptron classification of points, perceptron classification of manifolds, and kernel ridge regression. These problems share a common structure, a bipartite system of interacting feature and datum variables, enabling a unified analysis. Furthermore, for perceptron-capacity problems, we identify a symmetry that allows derivation of correct capacities through a naive method.
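As a concrete reference point for one of the three problems named above, the following is a minimal numerical sketch of kernel ridge regression itself (the estimator, not the cavity analysis of the lecture notes); the toy dimensions, the linear kernel, and all variable names are illustrative assumptions, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: P datum variables in N feature dimensions. The high-dimensional
# regime studied in such analyses takes N, P large at fixed ratio P/N;
# here small values suffice to illustrate the estimator.
N, P = 50, 40
X = rng.standard_normal((P, N)) / np.sqrt(N)
w_teacher = rng.standard_normal(N)          # ground-truth weights (assumed setup)
y = X @ w_teacher + 0.1 * rng.standard_normal(P)

# Kernel ridge regression with a linear kernel K = X X^T and ridge lambda:
# solve (K + lambda I) alpha = y, then predict f(x) = sum_mu alpha_mu k(x, x_mu).
lam = 0.1
K = X @ X.T
alpha = np.linalg.solve(K + lam * np.eye(P), y)

# Prediction on a fresh test point drawn from the same distribution.
x_test = rng.standard_normal(N) / np.sqrt(N)
f_test = (X @ x_test) @ alpha
```

The bipartite structure mentioned in the abstract is visible here: the P dual variables `alpha` attach to data points, while the N-dimensional features enter only through the kernel `K = X @ X.T`.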
Published as SciPost Phys. Lect. Notes 105 (2025)
