
SCIEN Talk

SCIEN Talk: Computational microscopy of dynamic order across biological scales

Topic: 
Computational microscopy of dynamic order across biological scales
Abstract / Description: 

Living systems are characterized by emergent behavior of ordered components. Imaging technologies that reveal dynamic arrangement of organelles in a cell and of cells in a tissue are needed to understand the emergent behavior of living systems. I will present an overview of challenges in imaging dynamic order at the scales of cells and tissue, and discuss advances in computational label-free microscopy to overcome these challenges.

 

Date and Time: 
Wednesday, October 17, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: How to train neural networks on LiDAR point clouds

Topic: 
How to train neural networks on LiDAR point clouds
Abstract / Description: 

Accurate LiDAR classification and segmentation is required for developing critical ADAS and autonomous-vehicle components. In particular, it is required for high-definition mapping and for developing perception and path/motion planning algorithms. This talk will cover best practices for accurately annotating and benchmarking your AV/ADAS models against LiDAR point-cloud ground-truth training data.

 

Date and Time: 
Wednesday, October 10, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN & EE 292E: The challenge of large-scale brain imaging

Topic: 
The challenge of large-scale brain imaging
Abstract / Description: 

Advanced optical microscopy techniques have enabled the recording and stimulation of large populations of neurons deep within living, intact animal brains. I will present a broad overview of these techniques, and discuss challenges that still remain in performing large-scale imaging with high spatio-temporal resolution, along with various strategies that are being adopted to address these challenges.

Date and Time: 
Wednesday, October 3, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN Talk, eWear seminar: 'Immersive Technology and AI' with focus on mobile AR research

Topic: 
'Immersive Technology and AI' with focus on mobile AR research
Abstract / Description: 

Talk Title: Saliency in VR: How Do People Explore Virtual Environments, presented by Vincent Sitzmann

Understanding how people explore immersive virtual environments is crucial for many applications, such as designing virtual reality (VR) content, developing new compression algorithms, or learning computational models of saliency or visual attention. Whereas a body of recent work has focused on modeling saliency in desktop viewing conditions, VR is very different from these conditions in that viewing behavior is governed by stereoscopic vision and by the complex interaction of head orientation, gaze, and other kinematic constraints. To further our understanding of viewing behavior and saliency in VR, we capture and analyze gaze and head orientation data of 169 users exploring stereoscopic, static omni-directional panoramas, for a total of 1980 head and gaze trajectories for three different viewing conditions. We provide a thorough analysis of our data, which leads to several important insights, such as the existence of a particular fixation bias, which we then use to adapt existing saliency predictors to immersive VR conditions. In addition, we explore other applications of our data and analysis, including automatic alignment of VR video cuts, panorama thumbnails, panorama video synopsis, and saliency-based compression.

Talk Title: "Immersive Technology and AI" with focus on mobile AR research

Abstract: not available

 

Date and Time: 
Thursday, May 31, 2018 - 3:30pm
Venue: 
Spilker 232

SCIEN & EE 292E: Mobile VR for vision testing and treatment

Topic: 
Mobile VR for vision testing and treatment
Abstract / Description: 

Consumer-level HMDs are adequate for many medical applications. Vivid Vision (VV) takes advantage of their low cost, light weight, and large VR gaming code base to make vision tests and treatments. The company's software is built using the Unity engine, allowing it to run on many hardware platforms. New headsets become available every six months or less, which creates interesting challenges in the medical device space. VV's flagship product is the commercially available Vivid Vision System, used by more than 120 clinics to test and treat binocular dysfunctions such as convergence difficulties, amblyopia, strabismus, and stereo blindness. VV has recently developed a new, VR-based visual field analyzer.

Date and Time: 
Wednesday, June 6, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN & EE 292E: Emerging LIDAR concepts and sensor technologies for autonomous vehicles

Topic: 
Emerging LIDAR concepts and sensor technologies for autonomous vehicles
Abstract / Description: 

Sensor technologies such as radar, camera, and LIDAR have become the key enablers for achieving higher levels of autonomous control in vehicles, from fleets to commercial models. There are, however, still questions remaining: to what extent will radar and camera technologies continue to improve, and which LIDAR concepts will be the most successful? This presentation will provide an overview of the tradeoffs for LIDAR vs. competing sensor technologies (camera and radar); this discussion will reinforce the need for sensor fusion. We will also discuss the types of improvements that are necessary for each sensor technology. The presentation will summarize and compare various LIDAR designs -- mechanical, flash, MEMS-mirror based, optical phased array, and FMCW (frequency modulated continuous wave) -- and then discuss each LIDAR concept's future outlook. Finally, there will be a quick review of guidelines for selecting photonic components such as photodetectors, light sources, and MEMS mirrors.

Date and Time: 
Wednesday, May 30, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN & EE 292E: LiDAR Technology for Autonomous Vehicles

Topic: 
LiDAR Technology for Autonomous Vehicles
Abstract / Description: 

LiDAR is a key sensor for autonomous vehicles that enables them to understand their surroundings in three dimensions. I will discuss the evolution of LiDAR and describe various LiDAR technologies currently being developed. These include rotating sensors, MEMS and optical phased array scanning devices, flash detector arrays, and single-photon avalanche detectors. Requirements for autonomous vehicles are very challenging, and the advantages and disadvantages of each technology will be discussed. The architecture of LiDAR also affects how it fits into the overall vehicle architecture. Image fusion with other sensors, including radar, cameras, and ultrasound, will be part of the overall solution. Other LiDAR applications, including non-automotive transportation, mining, precision agriculture, UAVs, mapping, surveying, and security, will also be described.

Date and Time: 
Wednesday, May 23, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN & EE 292E: Pushing the Limits of Fluorescence Microscopy with adaptive imaging and machine learning

Topic: 
Pushing the Limits of Fluorescence Microscopy with adaptive imaging and machine learning
Abstract / Description: 

Fluorescence microscopy lets biologists see and understand the intricate machinery at the heart of living systems and has led to numerous discoveries. Any technological progress towards improving image quality would extend the range of possible observations and would consequently open up the path to new findings. I will show how modern machine learning and smart robotic microscopes can push the boundaries of observability. One fundamental obstacle in microscopy takes the form of a trade-off between imaging speed, spatial resolution, light exposure, and imaging depth. We have shown that deep learning can circumvent these physical limitations: microscopy images can be restored even if 60-fold fewer photons are used during acquisition, isotropic resolution can be achieved even with a 10-fold under-sampling along the axial direction, and diffraction-limited structures can be resolved at 20-times higher frame rates compared to state-of-the-art methods. Moreover, I will demonstrate how smart microscopy techniques can achieve the full optical resolution of light-sheet microscopes — instruments capable of capturing the entire developmental arc of an embryo from a single cell to a fully formed motile organism. Our instrument improves spatial resolution and signal strength two- to five-fold, recovers cellular and sub-cellular structures in many regions otherwise not resolved, adapts to spatiotemporal dynamics of genetically encoded fluorescent markers, and robustly optimises imaging performance during large-scale morphogenetic changes in living organisms.

Date and Time: 
Wednesday, May 16, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN & EE 292E: Advances in automotive image sensors

Topic: 
Advances in automotive image sensors
Abstract / Description: 

In this talk I will present recent advances in 2D and 3D image sensors for automotive applications such as rear-view cameras, surround-view cameras, ADAS cameras, and in-cabin driver-monitoring cameras. This includes developments in high-dynamic-range image capture, LED flicker mitigation, high-frame-rate capture, global shutter, near-infrared sensitivity, and range imaging. I will also describe sensor developments for short-range and long-range LIDAR systems.

Date and Time: 
Wednesday, May 9, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN & EE 292E: 3D single-molecule super-resolution microscopy using a tilted light sheet

Topic: 
3D single-molecule super-resolution microscopy using a tilted light sheet
Abstract / Description: 

To obtain a complete picture of subcellular structures, cells must be imaged with high resolution in all three dimensions (3D). In this talk, I will present tilted light sheet microscopy with 3D point spread functions (TILT3D), an imaging platform that combines a novel, tilted light sheet illumination strategy with engineered long-axial-range point spread functions (PSFs) for low-background 3D localization of single molecules as well as 3D super-resolution imaging in thick cells. Here the axial positions of the single molecules are encoded in the shape of the PSF rather than in the position or thickness of the light sheet. TILT3D is built upon a standard inverted microscope and has minimal custom parts. The result is simple and flexible 3D super-resolution imaging with tens-of-nanometer localization precision throughout thick mammalian cells. We validated TILT3D for 3D super-resolution imaging in mammalian cells by imaging mitochondria and the full nuclear lamina, using the double-helix PSF for single-molecule detection and the recently developed Tetrapod PSFs for fiducial bead tracking and live axial drift correction. We think that in the future TILT3D will become an important tool not only for 3D super-resolution imaging, but also for live whole-cell single-particle and single-molecule tracking.

Date and Time: 
Wednesday, May 2, 2018 - 4:30pm
Venue: 
Packard 101
