Stanford EE

Recent progress in high-dimensional asymptotics of empirical risk minimization

Summary
Basil Saeed (Stanford)
Packard 339
Apr 25
Content

Abstract: Modern machine learning methods are often trained and deployed in high-dimensional settings, where the data dimension d is comparable to, or even exceeds, the sample size n. In such settings, classical “textbook” statistical theory—developed under the assumption that n is much larger than d—fails to accurately describe the behavior of learning methods. As an alternative, high-dimensional asymptotic theory offers a powerful framework for understanding these settings. It enables precise characterization of performance and sheds light on phenomena that are not present in the classical regime. In this talk, we will explore high-dimensional asymptotics for empirical risk minimization. While most existing results focus on convex losses and single-index models, we will present recent progress in extending these to non-convex losses and multi-index models. Based on joint work with Kiana Asgari and Andrea Montanari.
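As a minimal illustration of the proportional regime the abstract describes (not material from the talk), the sketch below fits a ridge-regularized least-squares empirical risk minimizer to data from a planted linear model with d a constant fraction of n. All names, sample sizes, and parameter choices are assumptions for illustration; the point is that in this regime the training risk systematically underestimates the test risk, a gap that classical large-n theory would predict vanishes.

```python
import numpy as np

# Illustrative sketch (not from the talk): empirical risk minimization in the
# proportional high-dimensional regime, where d/n stays constant as n grows.
rng = np.random.default_rng(0)

n, d = 400, 200          # d/n = 0.5: dimension comparable to sample size
sigma = 0.5              # noise standard deviation (illustrative choice)

w_star = rng.normal(size=d) / np.sqrt(d)   # planted single-index direction
X_train = rng.normal(size=(n, d))
y_train = X_train @ w_star + sigma * rng.normal(size=n)

# Ridge-regularized least-squares ERM:
#   w_hat = argmin_w ||y - X w||^2 + lam * ||w||^2
lam = 1.0
w_hat = np.linalg.solve(X_train.T @ X_train + lam * np.eye(d),
                        X_train.T @ y_train)

# Fresh samples from the same model to estimate the test (population) risk.
X_test = rng.normal(size=(4 * n, d))
y_test = X_test @ w_star + sigma * rng.normal(size=4 * n)

train_risk = np.mean((y_train - X_train @ w_hat) ** 2)
test_risk = np.mean((y_test - X_test @ w_hat) ** 2)

# In the proportional regime the train/test gap does not vanish with n.
print(train_risk < test_risk)
```

High-dimensional asymptotic theory of the kind discussed in the talk gives exact limiting formulas for quantities like `train_risk` and `test_risk` as n and d grow with d/n fixed.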

Bio: Basil Saeed is a PhD candidate at Stanford.