EE Student Information

The Department of Electrical Engineering supports Black Lives Matter.


EE Student Information, Spring & Summer Quarters 2019-20: FAQs and Updated EE Course List.

Updates will be posted on this page, as well as emailed to the EE student mail list.

Please see Stanford University Health Alerts for course and travel updates.

As always, use your best judgment and consider your own and others' well-being at all times.

IT Forum presents "Fundamental barriers to estimation in high-dimensions"

Topic: Fundamental barriers to estimation in high-dimensions
Date: Friday, April 3, 2020 - 1:15pm
Venue: Zoom: stanford.zoom.us/j/516499996
Speaker: Michael Celentano, PhD candidate (Stanford)
Abstract / Description:

Modern large-scale statistical models require estimating thousands to millions of parameters. Understanding the tradeoff between statistical optimality and computational tractability in such models remains an outstanding challenge. Under a random design assumption, we establish lower bounds on the statistical estimation error of two broad classes of tractable estimators in several popular statistical models. First, in high-dimensional linear models we show that a large gap often exists between the optimal statistical error and that achieved by least squares with a convex penalty. Examples of such estimators include the Lasso, ridge regression, and MAP estimation with log-concave priors and Gaussian noise. Second, in generalized linear models and low-rank matrix estimation problems, we introduce the class of 'general first order methods,' examples of which include gradient descent, projected gradient descent, and their accelerated versions. We derive lower bounds for general first order methods which are tight up to asymptotically negligible terms. Our results demonstrate a gap to statistical optimality for general first order methods in both sparse phase retrieval and sparse PCA.

This is joint work with Andrea Montanari and Yuchen Wu.
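As a rough illustration of the two estimator classes named in the abstract (not of the lower-bound analysis itself), the sketch below fits a sparse high-dimensional linear model with the Lasso, a convex-penalized least squares estimator, solved by proximal gradient descent (ISTA), which is itself a simple first order method. The problem sizes, noise level, and penalty level are illustrative assumptions, not values from the talk.

```python
import numpy as np

# Minimal sketch: Lasso (convex-penalized least squares) on a sparse
# high-dimensional linear model under a Gaussian random design, solved
# with proximal gradient descent (ISTA). All constants below are
# illustrative choices.

rng = np.random.default_rng(0)
n, p, k = 200, 500, 10                          # samples, dimension, sparsity
X = rng.standard_normal((n, p)) / np.sqrt(n)    # random (Gaussian) design
theta_star = np.zeros(p)
theta_star[:k] = 1.0                            # k-sparse signal
y = X @ theta_star + 0.1 * rng.standard_normal(n)

lam = 0.05                                      # l1 penalty level (assumed)
step = 1.0 / np.linalg.norm(X, 2) ** 2          # 1/L, L = Lipschitz constant of the gradient

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

theta = np.zeros(p)
for _ in range(500):
    grad = X.T @ (X @ theta - y)                # gradient of 0.5 * ||y - X theta||^2
    theta = soft_threshold(theta - step * grad, step * lam)

print("estimation error:", np.linalg.norm(theta - theta_star))
```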


Bio:

Michael Celentano is a fourth-year Ph.D. student in the Department of Statistics at Stanford University, advised by Andrea Montanari. He is interested in developing theory for optimal estimation, exact asymptotics, and inference in high-dimensional statistical models. He is a recipient of the NSF Graduate Research Fellowship.