Physics ∩ ML Seminars
Events on Wednesday, September 9th, 2020
- Insights on gradient-based algorithms in high-dimensional learning
- Time: 11:00 am - 12:30 pm
- Place: Online; registration required for this event
- Speaker: Lenka Zdeborová, Université Paris-Saclay
- Abstract: Gradient descent algorithms and their noisy variants, such as Langevin dynamics or multi-pass SGD, are at the center of attention in machine learning. Yet their behaviour remains perplexing, in particular in the high-dimensional non-convex setting. In this talk, I will present several high-dimensional and (mostly) non-convex statistical learning problems in which the performance of gradient-based algorithms can be analysed down to a constant. The common point of these settings is that the data come from a probabilistic generative model, leading to problems for which, in the high-dimensional limit, statistical physics provides exact closed-form solutions for the performance of gradient-based algorithms. The settings covered include the spiked mixed matrix-tensor model, the perceptron, and phase retrieval. (A toy sketch of Langevin dynamics follows this listing.)
- Host: Gary Shiu
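
As a rough illustration of the kind of dynamics the abstract refers to, the sketch below runs Langevin dynamics, i.e. plain gradient descent plus Gaussian noise, on a toy rank-one spiked matrix model. This is an editorial example, not material from the talk: the model, the loss, and every parameter value are hypothetical choices made only for illustration.

```python
# Minimal sketch (illustrative only): Langevin dynamics = gradient descent + Gaussian noise,
# run on a toy rank-one "spiked" matrix model Y = (lam/n) * x x^T + symmetric Gaussian noise.
# All model choices and parameter values below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

n = 200                                   # dimension
lam = 3.0                                 # signal-to-noise ratio of the planted spike
x_star = rng.choice([-1.0, 1.0], size=n)  # planted signal
W = rng.normal(size=(n, n))
W = (W + W.T) / np.sqrt(2 * n)            # symmetric Gaussian noise matrix
Y = (lam / n) * np.outer(x_star, x_star) + W

def grad(x):
    # Gradient of the quadratic fitting loss  L(x) = (1/4) * || Y - (lam/n) x x^T ||_F^2
    return -(lam / n) * (Y - (lam / n) * np.outer(x, x)) @ x

x = rng.normal(size=n)                    # random initialisation
lr, temp = 0.2, 0.01                      # step size and Langevin "temperature"
for _ in range(2000):
    xi = rng.normal(size=n)
    x = x - lr * grad(x) + np.sqrt(2 * lr * temp) * xi  # Langevin update

# Overlap with the planted spike measures how well the dynamics recovered the signal.
overlap = abs(x @ x_star) / (np.linalg.norm(x) * np.linalg.norm(x_star))
print(f"overlap with planted signal: {overlap:.3f}")
```

Setting `temp` to zero recovers plain gradient descent; the statistical-physics analyses mentioned in the abstract characterise the behaviour of exactly this kind of dynamics in the high-dimensional limit.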