
Statistics and Probability Seminars

Statistics Department Seminar presents "Two mathematical lessons of deep learning"

Abstract / Description: 

Recent empirical successes of deep learning have exposed significant gaps in our fundamental understanding of learning and optimization mechanisms. Modern best practices for model selection directly contradict the methodologies suggested by classical analyses. Similarly, the efficiency of the SGD-based local methods used to train modern models appears to be at odds with standard intuitions about optimization.

First, I will present the evidence, empirical and mathematical, that necessitates revisiting classical notions such as over-fitting. I will then discuss the emerging understanding of generalization and, in particular, the "double descent" risk curve, which extends the classical U-shaped generalization curve beyond the point of interpolation.

Second, I will discuss why the landscapes of over-parameterized neural networks are essentially never convex, even locally. Yet, they satisfy the local Polyak–Lojasiewicz condition, which allows SGD-type methods to converge to a global minimum.
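The point about the PL condition can be seen in one dimension with the standard textbook example f(x) = x² + 3 sin²(x) (my illustration, not material from the talk): f is non-convex, since f''(x) = 2 + 6 cos(2x) is negative near x = π/2, yet it satisfies a Polyak–Lojasiewicz inequality, so plain gradient descent still reaches the global minimum f(0) = 0.

```python
import math

def f(x):
    return x * x + 3 * math.sin(x) ** 2

def grad(x):
    return 2 * x + 3 * math.sin(2 * x)

# f is non-convex, but every stationary point is the global minimizer x = 0,
# as the PL inequality guarantees; gradient descent with a step below 1/L
# (smoothness constant L <= 8 here) converges to it.
x = 2.5
for _ in range(200):
    x -= 0.1 * grad(x)
```

The same mechanism, a local PL condition rather than convexity, is what lets SGD-type methods find global minima of over-parameterized networks.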

A key piece of the puzzle remains: how do these lessons come together to form a complete mathematical picture of modern DL?

Date and Time: 
Tuesday, October 20, 2020 - 4:30pm

Statistics Department Seminar presents "Backfitting for large-scale crossed random effects regressions"

Abstract / Description: 

Large-scale genomic and electronic commerce data sets often have a crossed random effects structure, arising from genotypes × environments or customers × products. Naive methods of handling such data will produce inferences that do not generalize. Regression models that properly account for crossed random effects can be very expensive to compute. The cost of both generalized least squares and Gibbs sampling can easily grow as N^(3/2) (or worse) for N observations. Papaspiliopoulos, Roberts and Zanella (2020) present a collapsed Gibbs sampler that costs O(N), but under an extremely stringent sampling model. We propose a backfitting algorithm to compute a generalized least squares estimate and prove that it costs O(N) under greatly relaxed though still strict sampling assumptions. Empirically, the backfitting algorithm costs O(N) under further relaxed assumptions. We illustrate the new algorithm on a ratings data set from Stitch Fix.

This is based on joint work with Swarnadip Ghosh and Trevor Hastie of Stanford University.
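The backfitting idea can be sketched in a few lines. The code below is a deliberately stripped-down caricature, not the authors' algorithm: it fits a crossed additive model (think customers × products ratings) by alternately re-estimating row effects given column effects and vice versa. Each sweep touches every observation a constant number of times, which is where the O(N) cost per iteration comes from. All names are mine.

```python
import numpy as np

rng = np.random.default_rng(1)
n_rows, n_cols, N = 50, 60, 800
rows = rng.integers(0, n_rows, N)   # e.g. customer index per rating
cols = rng.integers(0, n_cols, N)   # e.g. product index per rating
a_true = rng.standard_normal(n_rows)
b_true = rng.standard_normal(n_cols)
y = 1.0 + a_true[rows] + b_true[cols] + 0.1 * rng.standard_normal(N)

mu = y.mean()
a = np.zeros(n_rows)                # row (customer) effects
b = np.zeros(n_cols)                # column (product) effects
for _ in range(100):
    # Re-estimate row effects as mean residuals, then column effects; each
    # half-sweep is one pass over the N observations via bincount.
    resid = y - mu - b[cols]
    counts = np.bincount(rows, minlength=n_rows)
    a = np.bincount(rows, weights=resid, minlength=n_rows) / np.maximum(counts, 1)
    resid = y - mu - a[rows]
    counts = np.bincount(cols, minlength=n_cols)
    b = np.bincount(cols, weights=resid, minlength=n_cols) / np.maximum(counts, 1)

fitted_mse = np.mean((y - (mu + a[rows] + b[cols])) ** 2)
```

The real algorithm targets a generalized least squares estimate under a random effects model; this sketch only conveys the alternating structure and the per-sweep cost.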

Date and Time: 
Tuesday, October 13, 2020 - 4:30pm

Statistics Department Seminar presents "Berry–Esseen bounds for Chernoff-type nonstandard asymptotics in isotonic regression"

Abstract / Description: 

A Chernoff-type distribution is a non-normal distribution defined by the slope at zero of the greatest convex minorant of a two-sided Brownian motion with a polynomial drift. While a Chernoff-type distribution appears as the distributional limit in many nonregular estimation problems, the accuracy of Chernoff-type approximations has been largely unknown. In this talk, I will discuss Berry–Esseen bounds for Chernoff-type limit distributions in the canonical nonregular statistical estimation problem of isotonic (or monotone) regression. The derived Berry–Esseen bounds match those of the oracle local average estimator with optimal bandwidth in each scenario of possibly different Chernoff-type asymptotics, up to multiplicative logarithmic factors. Our method of proof differs from standard techniques on Berry–Esseen bounds, and relies on new localization techniques in isotonic regression and an anti-concentration inequality for the supremum of a Brownian motion with a Lipschitz drift.

This talk is based on joint work with Qiyang Han.
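For context, the isotonic (monotone) least-squares estimator at the center of the talk is computable by the classical Pool Adjacent Violators Algorithm (PAVA). The sketch below is a standard textbook implementation, included only to make the estimator concrete; the function name is mine.

```python
def isotonic_regression(y):
    """Least-squares nondecreasing fit via Pool Adjacent Violators.

    The solution is piecewise constant: whenever monotonicity is violated,
    adjacent blocks are pooled and replaced by their common mean.
    """
    blocks = []  # each block: [total, count, mean]
    for v in y:
        blocks.append([float(v), 1, float(v)])
        # Merge backwards while the ordering of block means is violated.
        while len(blocks) > 1 and blocks[-2][2] > blocks[-1][2]:
            total, count, _ = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
            blocks[-1][2] = blocks[-1][0] / blocks[-1][1]
    fit = []
    for _, count, mean in blocks:
        fit.extend([mean] * count)
    return fit
```

For example, `isotonic_regression([1, 3, 2, 4])` pools the violating pair (3, 2) into their mean 2.5. The nonstandard Chernoff-type limits discussed in the talk describe the pointwise fluctuations of exactly this kind of estimator.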

Date and Time: 
Tuesday, October 6, 2020 - 4:30pm

