EE Student Information

EE Student Information, Spring Quarter 2019-20: FAQs and Updated EE Course List.

Updates will be posted on this page, as well as emailed to the EE student mail list.

Please see Stanford University Health Alerts for course and travel updates.

As always, use your best judgment and consider your own and others' well-being at all times.

Information Systems Lab (ISL) Colloquium

ISL Colloquium presents "Federated Learning at Google and Beyond"

Topic: 
Federated Learning at Google and Beyond
Abstract / Description: 

Federated Learning enables mobile devices to collaboratively learn a shared prediction model or analytics while keeping all the training data on device, decoupling the ability to do machine learning from the need to store the data in the cloud. It embodies the principles of focused collection and data minimization, and can mitigate many of the systemic privacy risks and costs of traditional, centralized machine learning. In this talk, I will discuss: (1) how federated learning differs from more traditional distributed machine learning paradigms, focusing on the main defining characteristics and challenges of the federated learning setting; (2) practical algorithms for federated learning that address the unique challenges of this setting; (3) extensions to federated learning, including differential privacy, secure aggregation, and compression of model updates; and (4) a range of valuable research directions that could have significant real-world impact.
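The aggregation loop at the heart of the setting described above (commonly called federated averaging, or FedAvg) can be sketched in a few lines. This is an illustrative toy on synthetic linear-regression clients, not Google's production system: each client runs a few local gradient steps on its private data, and the server only ever sees model parameters, which it averages weighted by client data size.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Synthetic, slightly non-IID clients; their data never leaves this list.
clients = []
for k in range(5):
    n = rng.integers(20, 60)
    X = rng.normal(k * 0.1, 1.0, size=(n, 2))   # per-client distribution shift
    y = X @ true_w + 0.1 * rng.normal(size=n)
    clients.append((X, y))

def local_update(w, X, y, lr=0.05, epochs=5):
    """A few epochs of plain gradient descent on one client's private data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

w_global = np.zeros(2)
for _ in range(30):
    local_ws, sizes = [], []
    for X, y in clients:
        local_ws.append(local_update(w_global, X, y))
        sizes.append(len(y))
    # The server aggregates only model updates, weighted by data size.
    w_global = np.average(local_ws, axis=0, weights=sizes)

print(w_global)  # close to true_w = [2.0, -1.0]
```

Despite the heterogeneous clients, the averaged model recovers the shared parameters without any raw data reaching the server.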

Date and Time: 
Friday, February 28, 2020 - 1:15pm
Venue: 
Packard 202

RL forum presents "Temporal Abstraction in Reinforcement Learning with the Successor Representation"

Topic: 
Temporal Abstraction in Reinforcement Learning with the Successor Representation
Abstract / Description: 

Reasoning at multiple levels of temporal abstraction is one of the key abilities for artificial intelligence. In reinforcement learning, this is often instantiated with the options framework. Options allow agents to make predictions and to operate at different levels of abstraction within an environment. Nevertheless, when a reasonable set of options is not known beforehand, there is no definitive answer to which options one should consider. Recently, a new paradigm for option discovery has been introduced, based on the successor representation (SR), which defines state generalization in terms of how similar successor states are. In this talk, I'll discuss the existing methods from this paradigm, providing a big-picture look at how the SR can be used in the options framework. I'll present methods for discovering "bottleneck" options, as well as options that improve an agent's exploration capabilities. I'll also discuss the option keyboard, which uses the SR to extend a finite set of options to a combinatorially large counterpart without additional learning.
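As a concrete illustration of the successor representation itself (not of the option-discovery methods in the talk), here is a minimal TD-learning sketch on a 5-state ring under a random policy: M[s, s'] estimates the expected discounted number of visits to s' starting from s, and for a fixed policy with transition matrix P it converges to (I - gamma*P)^-1.

```python
import numpy as np

# Successor representation learned by TD on a 5-state ring, random policy.
n_states, gamma, alpha = 5, 0.9, 0.05
rng = np.random.default_rng(0)
M = np.zeros((n_states, n_states))  # M[s, s'] ~ discounted visit counts

s = 0
for _ in range(50_000):
    s_next = (s + rng.choice([-1, 1])) % n_states  # random walk on the ring
    one_hot = np.eye(n_states)[s]
    # TD update: the SR obeys M(s) = 1(s) + gamma * E[M(s')].
    M[s] += alpha * (one_hot + gamma * M[s_next] - M[s])
    s = s_next

# Ground truth for the random-walk transition matrix P.
P = np.zeros((n_states, n_states))
for i in range(n_states):
    P[i, (i - 1) % n_states] = P[i, (i + 1) % n_states] = 0.5
M_true = np.linalg.inv(np.eye(n_states) - gamma * P)
print(np.max(np.abs(M - M_true)))  # small residual TD error
```

Option-discovery methods in this paradigm then operate on M (e.g., on its eigenvectors) rather than on raw states.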

Date and Time: 
Tuesday, February 25, 2020 - 2:00pm
Venue: 
Packard 202

ISL Colloquium presents "Fully Convolutional Pixelwise Context-Adaptive Denoiser"

Topic: 
Fully Convolutional Pixelwise Context-Adaptive Denoiser
Abstract / Description: 

Denoising is a classical problem in signal processing and information theory, and a variety of methods have been applied to it over several decades. Recently, supervised neural-network-based methods have achieved impressive denoising performance, significantly surpassing classical approaches such as prior- or optimization-based denoisers. However, these methods have two drawbacks: they are not adaptive, i.e., the neural network cannot correct itself when there is a distributional mismatch between training and test data, and they require clean source data and an exact noise model for training, which is not always available in practical scenarios. In this talk, I will introduce a framework that tackles both drawbacks jointly, based on an unbiased estimate of the loss of a particular class of pixelwise context-adaptive denoisers. Using this framework with neural networks to learn the denoisers, I show that the resulting image denoiser can adapt to mismatched distributions based solely on the given noisy images, and achieves state-of-the-art performance on several benchmark datasets. Moreover, combined with standard noise transform/estimation techniques, I will show that our denoiser can be trained completely blindly, using only noisy images (without an exact noise model), and yet remain very effective for denoising more sophisticated, source-dependent real-world noise, e.g., Poisson-Gaussian noise.

This is a joint work with my students, Sungmin Cha and Jaeseok Byun at SKKU.

Date and Time: 
Friday, February 21, 2020 - 1:15pm
Venue: 
Packard 202

ISL & IT-Forum present "Approaching Capacity at Short Blocklengths"

Topic: 
Approaching Capacity at Short Blocklengths
Abstract / Description: 

This talk explores a family of recent results directed at approaching capacity at short blocklengths, on the order of 50-500 channel transmissions. Convolutional codes, when used with an optimized CRC and list decoding, outperform polar codes and LDPC codes and approach the random-coding union bound with low complexity. This perspective rehabilitates "catastrophic" convolutional codes, which at finite blocklengths are more properly understood as clever expurgation rather than any sort of catastrophe. The approach also provides low-complexity maximum-likelihood decoding of high-rate BCH codes. Variable-length coding, i.e., incremental redundancy controlled with simple ACK/NACK feedback, allows capacity to be closely approached by practical codes in fewer than 500 channel uses. The talk reviews Polyanskiy's information-theoretic results on ACK/NACK feedback, presents new results extending Horstein's classic approach for full feedback, and shows how to optimize the number and length of incremental-redundancy transmissions (and feedback transmissions) for a variable-length code with feedback (i.e., a type-II hybrid ARQ). The talk also shows how to avoid the overhead of a CRC entirely in a hybrid-ARQ setting by directly computing the reliability of convolutional codeword decisions. Finally, attendees will learn about a novel communications architecture that allows the use of incremental redundancy even without feedback.

Date and Time: 
Friday, January 24, 2020 - 1:15pm
Venue: 
Packard 202

ISL Colloquium presents "Codification Design in Compressive Imaging"

Topic: 
Codification Design in Compressive Imaging
Abstract / Description: 

Compressive imaging enables faster acquisition by capturing coded projections of a scene. Codification elements used in compressive imaging systems include lithographic masks, gratings, and micro-polarizers, which sense spatial, spectral, and temporal data. Codification plays a key role in compressive imaging because it determines the number of projections needed for correct reconstruction. In general, random coding patterns are sufficient for accurate reconstruction; still, more recent studies have shown that code design not only yields improved image reconstructions but can also reduce the number of required projections. This talk covers different tools for codification design in compressive imaging, such as the restricted isometry property and geometric and deep-learning approaches. Applications in compressive spectral video, compressive X-ray computed tomography, and seismic acquisition will also be discussed.
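A minimal baseline helps fix ideas: the sketch below uses an undesigned random Gaussian coding matrix and ISTA (iterative soft-thresholding) to recover a sparse signal from far fewer coded projections than signal samples. The designed codes discussed in the talk aim to beat exactly this random baseline.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 100, 40, 4  # signal length, measurements, sparsity

# Sparse "scene" and a random (undesigned) coding matrix.
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k) + 3.0
A = rng.normal(0, 1 / np.sqrt(m), size=(m, n))
y = A @ x  # coded projections: only m << n numbers are acquired

# ISTA: iterative soft-thresholding for l1-regularized recovery.
lam = 0.05
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
x_hat = np.zeros(n)
for _ in range(2000):
    g = x_hat - (A.T @ (A @ x_hat - y)) / L
    x_hat = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)

print(np.linalg.norm(x_hat - x) / np.linalg.norm(x))  # small relative error
```

With 40 random projections of a 4-sparse length-100 signal, the reconstruction is accurate; designed codes can shrink that measurement budget further.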


The Information Systems Laboratory Colloquium (ISLC) is typically held in Packard 101 every Thursday at 4:30 pm during the academic year. Coffee and refreshments are served at 4:00 pm in the second-floor kitchen of the Packard Building.

The Colloquium is organized by graduate students Joachim Neu, Tavor Baharav and Kabir Chandrasekher. To suggest speakers, please contact any of the students.

Date and Time: 
Thursday, March 5, 2020 - 4:30pm
Venue: 
Packard 101

ISL Colloquium presents "Empirical Risk Minimization in High-dimensions: Asymptotics, Optimality and Double Descent"

Topic: 
Empirical Risk Minimization in High-dimensions: Asymptotics, Optimality and Double Descent
Abstract / Description: 

At the heart of contemporary statistical signal processing problems, as well as modern machine-learning practices, lie high-dimensional inference tasks in which the number of unknown parameters is of the same order as (and often larger than) the number of observations.

In this talk, I describe a framework based on Gaussian-process inequalities to sharply characterize the statistical performance of convex empirical risk minimization in high dimensions. Focusing on the simple, yet highly versatile, model of binary linear classification, I will demonstrate that sharp results, albeit challenging to derive, are advantageous over loose order-wise bounds. For instance, they lead to precise answers to the following questions: When are training data linearly separable? Is least squares bad for binary classification? What is the best convex loss function? Is double descent observed in linear models, and how do its features (such as the transition threshold and global minima) depend on the training data and on the learning procedure?

Many of the ideas and technical tools of our work originate from the study of sharp phase-transitions in compressed sensing. Throughout the talk, I will highlight how our results relate to and advance this literature.



Date and Time: 
Thursday, February 27, 2020 - 4:30pm
Venue: 
Packard 101

ISL Colloquium presents "Network Systems, Kuramoto Oscillators, and Synchronous Power Flow"

Topic: 
Network Systems, Kuramoto Oscillators, and Synchronous Power Flow
Abstract / Description: 

Network systems are mathematical models for the study of cooperation, propagation, synchronization and other dynamical phenomena that arise among interconnected agents. Network systems are widespread in science and technology as fundamental modeling tools.

This talk will review established and emerging frameworks for the modeling, analysis, and design of network systems. I will then focus on recent developments in the analysis of security and transmission capacity in power grids. I will review the Kuramoto model of coupled oscillators and present recent results on its synchronization behavior. I will also review recent results on multi-stability and multistable synchronous power flows.
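The Kuramoto model mentioned above is easy to simulate: each oscillator obeys dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i), and the order parameter r = |mean(exp(i*theta))| measures phase coherence. A minimal Euler-integration sketch, with illustrative parameters only, showing the transition from incoherence to synchronization as the coupling K grows:

```python
import numpy as np

def simulate_kuramoto(K, N=50, T=20.0, dt=0.01, seed=0):
    """Euler-integrate dtheta_i/dt = omega_i + (K/N) sum_j sin(theta_j - theta_i)."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0, 0.5, N)           # heterogeneous natural frequencies
    theta = rng.uniform(0, 2 * np.pi, N)    # random initial phases
    for _ in range(int(T / dt)):
        # Mean-field form: coupling acts through the complex order parameter.
        z = np.mean(np.exp(1j * theta))
        theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
    # r in [0, 1]: 0 = incoherent phases, 1 = fully phase-locked.
    return np.abs(np.mean(np.exp(1j * theta)))

r_weak = simulate_kuramoto(K=0.1)
r_strong = simulate_kuramoto(K=4.0)
print(r_weak, r_strong)  # weak coupling stays incoherent; strong coupling locks
```

Below the critical coupling the oscillators drift at their own frequencies; above it a phase-locked cluster forms, the same mechanism underlying synchronous power flow in grid models.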



Date and Time: 
Thursday, February 20, 2020 - 4:30pm
Venue: 
Packard 101

ISL Colloquium presents "Recurrent Switching Linear Dynamical Systems for Neural and Behavioral Analysis"

Topic: 
Recurrent Switching Linear Dynamical Systems for Neural and Behavioral Analysis
Abstract / Description: 

The trend in neural recording capabilities is clear: we can record orders of magnitude more neurons now than we could only a few years ago, and technological advances do not seem to be slowing. Coupled with rich behavioral measurements, genetic sequencing, and connectomics, these datasets offer unprecedented opportunities to learn how neural circuits function. But they also pose serious modeling and algorithmic challenges. We need flexible yet interpretable probabilistic models to gain insight from these heterogeneous data, and algorithms to efficiently and reliably fit them. I will present some recent work on recurrent switching linear dynamical systems (rSLDS): models that couple discrete and continuous latent states to capture nonlinear processes. I will discuss some approximate Bayesian inference algorithms we've developed to fit these models and infer their latent states, and I will show how these methods can help us gain insight into complex spatiotemporal datasets like those we study in neuroscience.



Date and Time: 
Thursday, February 13, 2020 - 4:30pm
Venue: 
Packard 101

ISL Colloquium presents "How to trap a gradient flow"

Topic: 
How to trap a gradient flow
Abstract / Description: 

I will discuss a new strategy for finding stationary points of non-convex functions in low-dimensional spaces. In particular, we resolve an open problem posed by Stephen A. Vavasis in 1993 on the complexity of this problem in two dimensions.

Joint work with Dan Mikulincer.



Date and Time: 
Thursday, February 6, 2020 - 4:30pm
Venue: 
Packard 101

ISL Colloquium presents "From Feedforward-Designed Convolutional Neural Networks (FF-CNNs) to Successive Subspace Learning (SSL)"

Topic: 
From Feedforward-Designed Convolutional Neural Networks (FF-CNNs) to Successive Subspace Learning (SSL)
Abstract / Description: 

Given a convolutional neural network (CNN) architecture, its network parameters are typically determined by backpropagation (BP). Despite a large amount of theoretical investigation, the underlying mechanism remains a black box. In this talk, I will first describe a new interpretable feedforward (FF) design, using LeNet-5 as an example. The FF-designed CNN is a data-centric approach that derives network parameters from training data statistics, layer by layer, in a one-pass feedforward manner. To build the convolutional layers, we develop a new signal transform, the Saab (subspace approximation with adjusted bias) transform, in which the bias in the filter weights is chosen to annihilate the nonlinearity of the activation function. To build the fully connected (FC) layers, we adopt a label-guided linear least-squares regression (LSR) method. To generalize the FF design idea further, we present the notion of "successive subspace learning (SSL)" and a couple of concrete methods for image and point-cloud classification. Experimental results demonstrate the competitive performance of the SSL-based systems, and similarities and differences between SSL and deep learning (DL) are discussed.
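In spirit, one Saab stage is a PCA with a separate constant "DC" kernel plus a bias chosen so that all filter responses stay nonnegative, which makes a subsequent ReLU a no-op. The sketch below is a simplified reading of that idea (a single scalar bias, random data, no spatial patch extraction), not the authors' reference implementation:

```python
import numpy as np

def saab_fit(patches, n_ac):
    """One Saab stage (sketch): a constant DC kernel plus PCA-derived AC
    kernels, with a bias large enough to keep every response nonnegative."""
    d = patches.shape[1]
    dc = np.ones((1, d)) / np.sqrt(d)          # constant "DC" kernel
    ac_input = patches - patches @ dc.T @ dc   # remove the DC component
    # AC kernels: top principal directions of the DC-removed patches.
    _, _, Vt = np.linalg.svd(ac_input - ac_input.mean(0), full_matrices=False)
    kernels = np.vstack([dc, Vt[:n_ac]])
    resp = patches @ kernels.T
    bias = -resp.min() + 1e-6                  # shift so every response >= 0
    return kernels, bias

rng = np.random.default_rng(0)
patches = rng.normal(size=(1000, 16))          # e.g. flattened 4x4 patches
kernels, bias = saab_fit(patches, n_ac=7)
resp = patches @ kernels.T + bias
print(resp.min() >= 0)                         # a ReLU would change nothing
```

Because every response is nonnegative by construction, the stage needs no backpropagation: the "activation" is absorbed into the bias, and parameters come entirely from one pass of data statistics.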




Date and Time: 
Thursday, January 30, 2020 - 4:30pm
Venue: 
Packard 101
