
SciPost Submission Page

Bayesian RG Flow in Neural Network Field Theories

by Jessica N. Howard, Marc S. Klinger, Anindita Maiti, Alexander G. Stapleton

This Submission thread is now published as SciPost Phys. Core 8, 027 (2025)

Submission summary

Authors (as registered SciPost users): Jessica N. Howard
Submission information
Preprint Link: https://arxiv.org/abs/2405.17538v3  (pdf)
Code repository: https://github.com/xand-stapleton/bayes-nn-ft
Date accepted: Feb. 18, 2025
Date submitted: Feb. 11, 2025, 7:15 p.m.
Submitted by: Howard, Jessica N.
Submitted to: SciPost Physics Core
Ontological classification
Academic field: Physics
Specialties:
  • High-Energy Physics - Theory
Approaches: Theoretical, Computational

Abstract

The Neural Network Field Theory correspondence (NNFT) is a mapping from neural network (NN) architectures into the space of statistical field theories (SFTs). The Bayesian renormalization group (BRG) is an information-theoretic coarse graining scheme that generalizes the principles of the exact renormalization group (ERG) to arbitrarily parameterized probability distributions, including those of NNs. In BRG, coarse graining is performed in parameter space with respect to an information-theoretic distinguishability scale set by the Fisher information metric. In this paper, we unify NNFT and BRG to form a powerful new framework for exploring the space of NNs and SFTs, which we coin BRG-NNFT. With BRG-NNFT, NN training dynamics can be interpreted as inducing a flow in the space of SFTs from the information-theoretic `IR' to the `UV'. Conversely, applying an information-shell coarse graining to the trained network's parameters induces a flow in the space of SFTs from the information-theoretic `UV' to the `IR'. When the information-theoretic cutoff scale coincides with a standard momentum scale, BRG is equivalent to ERG. We demonstrate the BRG-NNFT correspondence on two analytically tractable examples. First, we construct BRG flows for trained, infinite-width NNs, of arbitrary depth, with generic activation functions. As a special case, we then restrict to architectures with a single infinitely-wide layer, scalar outputs, and generalized cos-net activations. In this case, we show that BRG coarse-graining corresponds exactly to the momentum-shell ERG flow of a free scalar SFT. Our analytic results are corroborated by a numerical experiment in which an ensemble of asymptotically wide NNs is trained and subsequently renormalized using an information-shell BRG scheme.
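To illustrate the information-shell idea described in the abstract, the sketch below computes an empirical Fisher information metric for a toy linear-in-parameters model and discards the parameter-space directions whose Fisher eigenvalue falls below a distinguishability cutoff Λ. The model, noise scale, and cutoff value are our own assumptions for illustration; this is not the paper's architecture or its exact renormalization scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative stand-in, not the paper's NN): scalar output
# f(x; theta) = theta . features(x), with Gaussian observation noise.
def features(x):
    return np.stack([np.ones_like(x), x, x**2, np.sin(x)], axis=-1)

theta = np.array([0.5, -1.2, 0.3, 0.8])   # "trained" parameters (assumed)
sigma = 0.1                                # noise scale (assumed)

# Empirical Fisher metric: average of score outer products. For a Gaussian
# likelihood, the score is (y - f) * grad_theta f / sigma^2.
x = rng.uniform(-2, 2, size=5000)
phi = features(x)                          # (N, 4): grad of f wrt theta
y = phi @ theta + sigma * rng.normal(size=x.shape)
score = ((y - phi @ theta) / sigma**2)[:, None] * phi
fisher = score.T @ score / len(x)

# Information-shell coarse graining: project out parameter directions whose
# Fisher eigenvalue lies below the cutoff Lambda (i.e. directions that are
# hard to distinguish from the trained model given the data).
evals, evecs = np.linalg.eigh(fisher)
Lambda = 50.0                              # cutoff scale (assumed value)
keep = evals > Lambda
theta_coarse = evecs[:, keep] @ (evecs[:, keep].T @ theta)

print("Fisher eigenvalues:", np.round(evals, 2))
print("directions kept:", keep.sum(), "of", len(evals))
```

Lowering Λ retains more parameter-space directions (flowing toward the information-theoretic `UV'); raising it discards more (flowing toward the `IR').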

Author comments upon resubmission

We greatly appreciate the referees' feedback and have implemented changes to address their suggestions. We believe these changes have improved the clarity of this work and hopefully made it accessible to a wider audience. We provide a list of changes below.

List of changes

Below is a point-by-point list of changes in response to Referee 1’s feedback:
1. We have added a sentence explaining the meaning of ϕ in Section 2.2: "Here, ϕi is the value of the target function ϕ which we would like our NN to approximate evaluated at xi, i.e. ϕi=ϕ(xi)". We have also standardized our notation to use ϕ when referring to the target function throughout the text.
2. We added text clarifying that the tilde on ˜G(n) indicates that this is a sampled (empirical) estimate of G(n) in Eq. 31 (which is now Eq. 37). In the limit of an infinite number of samples, ˜G(n) should approach G(n).
3. We appreciate Referee 1 catching the typo in Eq. 32 (now Eq. 39); it has been fixed.
4. We have added a short description (below what is now Eq. (43)) expanding on the connection between the connected correlation functions and the couplings g(n) and discussing where in the literature more details on this topic can be found.
5. As suggested by Referee 1, we have better defined the notation used for normally distributed random variables around what is now Eq. 61.
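Point 2 above concerns the empirical estimate ˜G(n), which approaches the exact correlator G(n) as the number of sampled networks grows. A minimal sketch of such an ensemble estimate, using a hypothetical single-hidden-layer tanh network rather than the paper's cos-net architecture (the activation, widths, and sample counts here are our assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble: f(x) = a . tanh(b * x) / sqrt(width), with
# independent standard-normal parameters a, b for each draw.
def sample_outputs(xs, width=512, n_samples=2000):
    """Draw n_samples networks and evaluate each at the points xs."""
    a = rng.normal(size=(n_samples, width))
    b = rng.normal(size=(n_samples, width))
    pre = np.tanh(b[:, None, :] * xs[None, :, None])   # (samples, points, width)
    return np.einsum('sij,sj->si', pre, a) / np.sqrt(width)

xs = np.array([-1.0, 0.0, 1.0])
f = sample_outputs(xs)                                  # (samples, points)

# Empirical correlators: sample averages of products of field values.
G1 = f.mean(axis=0)                                     # one-point function
G2 = np.einsum('si,sk->ik', f, f) / len(f)              # two-point function

print("G~^(1):", np.round(G1, 3))
print("G~^(2):\n", np.round(G2, 3))
```

Higher ˜G(n) follow the same pattern (averages of n-fold products over the ensemble), with sampling error shrinking as the number of draws increases.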

In response to Referee 2’s feedback, we also added a concrete example of Bayesian inference at the end of Section 2.1 which we hope illustrates some of the abstract concepts discussed.
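For readers without access to the revised Section 2.1, a standard concrete instance of Bayesian inference is the conjugate-Gaussian update below. This sketch is ours, chosen for familiarity; it is not necessarily the example added to the paper, and all numerical values are assumptions.

```python
import numpy as np

# Prior: theta ~ N(mu0, tau0^2). Likelihood: y_i ~ N(theta, sigma^2), iid.
# The posterior over theta is again Gaussian (conjugacy).
mu0, tau0 = 0.0, 1.0       # prior mean and std (assumed)
sigma = 0.5                # known observation noise (assumed)
y = np.array([0.9, 1.1, 1.3, 0.7])

n = len(y)
# Precisions (inverse variances) add: prior precision plus n data precisions.
post_prec = 1.0 / tau0**2 + n / sigma**2
post_var = 1.0 / post_prec
# Posterior mean is the precision-weighted average of prior mean and data.
post_mean = post_var * (mu0 / tau0**2 + y.sum() / sigma**2)

print(f"posterior: N({post_mean:.3f}, {post_var:.4f})")
```

As more data arrive, the posterior mean is pulled from the prior toward the sample mean and the posterior variance shrinks, which is the Bayesian-updating behavior the paper's coarse-graining scheme runs in reverse.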

We hope that these changes adequately address the referees' concerns and believe that they have helped improve the clarity of our work.

Published as SciPost Phys. Core 8, 027 (2025)
