Statistics Department Seminar presents "Two mathematical lessons of deep learning"

Topic: Two mathematical lessons of deep learning
Date: Tuesday, October 20, 2020, 4:30 pm
Speaker: Mikhail Belkin (Halıcıoğlu Data Science Institute)

Abstract / Description:

Recent empirical successes of deep learning have exposed significant gaps in our fundamental understanding of learning and optimization mechanisms. Modern best practices for model selection directly contradict the methodologies suggested by classical analyses. Similarly, the efficiency of the SGD-based local methods used to train modern models appears at odds with standard intuitions about optimization.

First, I will present evidence, both empirical and mathematical, that necessitates revisiting classical notions such as over-fitting. I will then discuss the emerging understanding of generalization and, in particular, the "double descent" risk curve, which extends the classical U-shaped generalization curve beyond the point of interpolation.
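
A rough sketch of the "double descent" picture just mentioned, with notation that is ours rather than the speaker's: write n for the number of training samples, p for model capacity (e.g., parameter count), and R(p) for the test risk of the fitted model. Schematically,

\[
R(p) \;\sim\;
\begin{cases}
\text{classical U-shape (bias--variance trade-off)}, & p \ll n,\\
\text{peak near the interpolation threshold}, & p \approx n,\\
\text{second descent: risk decreases again}, & p \gg n,
\end{cases}
\]

so that, past the point where the model can interpolate the training data exactly, increasing capacity further tends to reduce rather than increase test risk.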

Second, I will discuss why the landscapes of over-parameterized neural networks are essentially never convex, even locally. Yet they satisfy a local Polyak–Łojasiewicz (PL) condition, which allows SGD-type methods to converge to a global minimum.
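
For reference, here is the standard form of the condition being referred to; the talk may use a local or path-wise variant. A differentiable loss L with infimum L^* satisfies the Polyak–Łojasiewicz (PL) inequality with constant \mu > 0 on a region S if

\[
\tfrac{1}{2}\,\bigl\|\nabla L(w)\bigr\|^{2} \;\ge\; \mu\,\bigl(L(w) - L^{*}\bigr)
\qquad \text{for all } w \in S.
\]

Combined with smoothness of L, this inequality yields linear convergence of gradient descent to a global minimizer without requiring convexity, and SGD enjoys comparable guarantees in the interpolation regime, which is why the PL condition is a natural tool for over-parameterized landscapes.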

A key piece of the puzzle remains: how do these lessons come together to form a complete mathematical picture of modern deep learning?