Les Houches Summer School Lecture Notes
- a series contained in SciPost Physics Lecture Notes
Collection 2022-07: Statistical Physics & Machine Learning
The school is aimed primarily at the growing audience of theoretical physicists, applied mathematicians, computer scientists, and colleagues from other computational fields interested in machine learning, neural networks, and high-dimensional data analysis. We shall cover the basics and frontiers of high-dimensional statistics, machine learning, the theory of computing and statistical learning, and the related mathematics and probability theory, with a special focus on the methods of statistical physics and their results in the context of current questions and theories related to these problems. Open questions and future directions will be discussed as well.
Organizers
- Florent Krzakala (EPFL)
- Lenka Zdeborová (EPFL)
Lecturers
- Francis Bach (Inria, ENS): Sums-of-squares: from polynomials to kernels
- Yasaman Bahri (Google) and Boris Hanin (Princeton): Deep learning at large and infinite width
- Boaz Barak (Harvard): Computational complexity of deep learning: Fundamental limitations and empirical phenomena
- Giulio Biroli (ENS Paris): High-dimensional non-convex landscapes and gradient descent dynamics
- Michael I. Jordan (Berkeley): On decisions, dynamics, incentives, and mechanism design
- Julia Kempe (NYU): Data, physics, and kernels, and how (statistical) physics tools can help the DL practitioner
- Yann LeCun (Facebook & NYU): From machine learning to autonomous intelligence
- Marc Mézard (ENS Paris): Belief propagation, message-passing & sparse models
- Remi Monasson (ENS Paris): Replica method for computational problems with randomness: principles and illustrations
- Andrea Montanari (Stanford): Neural networks from a nonparametric viewpoint
- Sara Solla (Northwestern Univ.): Statistical physics, Bayesian inference and neural information processing
- Haim Sompolinsky (Harvard & Hebrew Univ.): Statistical mechanics of machine learning
- Nathan Srebro (TTI Chicago): Applying statistical learning theory to deep learning
- Eric Vanden-Eijnden (NYU Courant): Benefits of overparametrization in statistical learning, & enhancing MCMC sampling with learning
Dates: from 2022-07-04 to 2022-07-29.
Publications in this Collection
- No publications have yet been associated with this Collection