EE Student Information

The Department of Electrical Engineering supports Black Lives Matter.


EE Student Information, Spring Quarter through Academic Year 2020-2021: FAQs and Updated EE Course List.

Updates will be posted on this page, as well as emailed to the EE student mail list.

Please see Stanford University Health Alerts for course and travel updates.

As always, use your best judgment and consider your own and others' well-being at all times.

SCIEN Talk

SCIEN Talk: Accelerated Computing for Light Field and Holographic Displays

Topic: 
Accelerated Computing for Light Field and Holographic Displays
Abstract / Description: 

In this talk, I will present two papers recently published at SIGGRAPH Asia 2017. In the first, we present a 4D light field sampling and rendering system for light field displays that supports both foveation and accommodation, reducing rendering cost while maintaining perceptual quality and comfort. In the second, we present a light-field-based computer-generated holography (CGH) rendering pipeline that reproduces high-definition 3D scenes with continuous depth and supports intra-pupil view-dependent occlusion using computer-generated holograms. Our rendering and Fresnel integral accurately account for diffraction and support various types of reference illumination for holograms.
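The Fresnel integral named above is the numerical core of most CGH rendering pipelines. As a rough illustration only (not the authors' implementation, and with arbitrary parameter values), here is a minimal transfer-function Fresnel propagation step in NumPy:

```python
import numpy as np

def fresnel_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex optical field by `distance` using the
    transfer-function (Fresnel-approximation) method: FFT, multiply by
    the Fresnel transfer function, inverse FFT."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    k = 2 * np.pi / wavelength
    # H = exp(i k z) * exp(-i pi lambda z (fx^2 + fy^2))
    H = np.exp(1j * k * distance) * \
        np.exp(-1j * np.pi * wavelength * distance * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: propagate a square aperture 10 mm at 532 nm, 8 um pixel pitch
field = np.zeros((256, 256), dtype=complex)
field[120:136, 120:136] = 1.0
out = fresnel_propagate(field, 532e-9, 8e-6, 10e-3)
print(out.shape)  # (256, 256)
```

Because the transfer function has unit magnitude, the propagated field conserves energy, which is a quick sanity check on any such implementation.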

Date and Time: 
Wednesday, February 14, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Street View 2018 - The Newest Generation of Mapping Hardware

Topic: 
Street View 2018 - The Newest Generation of Mapping Hardware
Abstract / Description: 

A brief overview of Street View will be presented, from its inception 10 years ago until now. Street-level imagery was Street View's prime objective in the past; it has since grown into a state-of-the-art mapping platform. Challenges and solutions in the design and fabrication of the imaging system, and the optimization of hardware to align with specific software post-processing, will be discussed. Real-world challenges of fielding hardware in 80+ countries will also be addressed.

Date and Time: 
Wednesday, February 7, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Learning where to look in 360 environments

Topic: 
Learning where to look in 360 environments
Abstract / Description: 

Many vision tasks require not just categorizing a well-composed human-taken photo, but also intelligently deciding "where to look" in order to get a meaningful observation in the first place. We explore how an agent can anticipate the visual effects of its actions, and develop policies for learning to look around actively, both for the sake of a specific recognition task and for generic exploratory behavior. In addition, we examine how a system can learn from unlabeled video to mimic human videographer tendencies, automatically deciding where to look in unedited 360-degree panoramas. Finally, to facilitate 360-degree video processing, we introduce spherical convolution, which allows off-the-shelf deep networks and object detectors to be applied to 360-degree imagery.
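Spherical convolution, as named in the abstract, adapts planar filters to the latitude-dependent stretching of the equirectangular projection. A toy sketch of the underlying idea (not the paper's method, and with illustrative names): widen a horizontal kernel toward the poles by the 1/cos(latitude) stretch factor:

```python
import numpy as np

def equirect_row_blur(pano, base_width=3):
    """Illustrative row-adaptive horizontal box blur on an equirectangular
    panorama: the kernel widens toward the poles to compensate for the
    projection's 1/cos(latitude) horizontal stretching."""
    h, w = pano.shape
    out = np.empty_like(pano, dtype=float)
    for row in range(h):
        lat = (row + 0.5) / h * np.pi - np.pi / 2   # latitude in (-pi/2, pi/2)
        width = int(round(base_width / max(np.cos(lat), 1e-3)))
        width = min(width | 1, w)                   # force odd width, cap at image width
        kernel = np.ones(width) / width
        out[row] = np.convolve(pano[row], kernel, mode="same")
    return out

pano = np.random.rand(32, 64)
blurred = equirect_row_blur(pano)
print(blurred.shape)  # (32, 64)
```

The actual technique transfers learned planar-CNN filters to per-row spherical kernels; this sketch only shows why the kernel must vary with latitude.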

Date and Time: 
Wednesday, January 24, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Driverless Anything and the Role of LiDAR

Topic: 
Driverless Anything and the Role of LiDAR
Abstract / Description: 

LiDAR, or light detection and ranging, is a versatile light-based remote sensing technology that has received a great deal of attention in recent years. It has shown up in a number of media venues and has even led to public debate about the engineering choices of a well-known electric car company, Tesla Motors. During this talk the speaker will provide some background on LiDAR and discuss why it is a key link in the future autonomous vehicle ecosystem, as well as its strong connection to power electronics technologies.
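As background for the talk, the core arithmetic of pulsed (time-of-flight) LiDAR ranging is simply the round-trip travel time of light:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_time_s):
    """Range from a time-of-flight LiDAR return: the pulse travels to the
    target and back, so the one-way distance is c * t / 2."""
    return C * round_trip_time_s / 2

# A return arriving about 667 ns after emission corresponds to roughly 100 m.
print(round(tof_range(667e-9), 1))  # 100.0
```

The picosecond-scale timing precision this implies is one reason LiDAR design connects so strongly to high-speed electronics.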

Date and Time: 
Wednesday, January 17, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Advancing Healthcare with AI and VR

Topic: 
Advancing Healthcare with AI and VR
Abstract / Description: 

Quality, cost, and accessibility form an iron triangle that has prevented healthcare from achieving accelerated advancement in the last few decades: improving any one of the three metrics may lead to degradation of the other two. However, thanks to recent breakthroughs in artificial intelligence (AI) and virtual reality (VR), this iron triangle can finally be shattered. In this talk, I will share the experience of developing DeepQ, an AI platform for AI-assisted diagnosis and VR-facilitated surgery. I will present three healthcare initiatives we have undertaken since 2012 (Healthbox, Tricorder, and VR surgery) and explain how AI and VR play pivotal roles in improving diagnostic accuracy and treatment effectiveness. More specifically, I will describe how we have dealt not only with big-data analytics but also with small-data learning, which is typical in the medical domain. The talk concludes with roadmaps and a list of open research issues in signal processing and AI to achieve precision medicine and surgery.

Date and Time: 
Wednesday, January 10, 2018 - 4:30pm
Venue: 
Packard 101

SCIEN & EE 292E: Compressed Ultrafast Photography and Microscopy: Redefining the Limit of Passive Ultrafast Imaging

Topic: 
Compressed Ultrafast Photography and Microscopy: Redefining the Limit of Passive Ultrafast Imaging
Abstract / Description: 

High-speed imaging is an indispensable technology for blur-free observation of fast transient dynamics in virtually all areas, including science, industry, defense, energy, and medicine. Unfortunately, the frame rates of conventional cameras are significantly constrained by their data transfer bandwidth and onboard storage. We demonstrate a two-dimensional dynamic imaging technique, compressed ultrafast photography (CUP), which can capture non-repetitive time-evolving events at up to 100 billion frames per second. Compared with existing ultrafast imaging techniques, CUP has the prominent advantage of measuring an x, y, t (x, y, spatial coordinates; t, time) scene with a single camera snapshot, thereby allowing observation of transient events occurring on time scales down to tens of picoseconds. Thanks to CUP, for the first time humans can see light pulses on the fly. Because this technology advances the imaging frame rate by orders of magnitude, it opens a new regime and new lines of inquiry.

In this talk, I will discuss our recent effort to develop a second-generation CUP system and demonstrate its applications at scales from macroscopic to microscopic. For the first time, we imaged photonic Mach cones, capturing a "sonic boom" of light in action. Moreover, by adapting CUP for microscopy, we enabled two-dimensional fluorescence lifetime imaging at an unprecedented speed. The advantage of CUP recording is that even visually simple systems can be scientifically interesting when they are captured at such a high speed. Given CUP's capability, we expect it to find widespread applications in both fundamental and applied sciences, including biomedical research.
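CUP's single-snapshot measurement can be sketched as a toy forward model: encode each temporal frame with a pseudo-random mask, shear it by one pixel per time step (standing in for the streak camera's temporal deflection), and sum everything into one 2-D image. This is an illustrative simplification, not the published system:

```python
import numpy as np

def cup_forward(scene, mask):
    """Toy CUP forward model: mask each frame of a (time, y, x) scene,
    shear by one row per time step, and integrate into a single snapshot."""
    t, h, w = scene.shape
    snapshot = np.zeros((h + t - 1, w))
    for k in range(t):
        snapshot[k:k + h] += scene[k] * mask
    return snapshot

rng = np.random.default_rng(0)
scene = rng.random((8, 16, 16))                    # (time, y, x) dynamic scene
mask = (rng.random((16, 16)) > 0.5).astype(float)  # pseudo-random encoding mask
y = cup_forward(scene, mask)
print(y.shape)  # (23, 16): height grows by t - 1 rows from the shear
```

Reconstruction then inverts this underdetermined model with a sparsity prior (e.g., total-variation-regularized least squares), which is what makes the compressed-sensing recovery of the full x, y, t cube possible.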

Date and Time: 
Wednesday, December 6, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Next Generation Wearable AR Display Technologies

Topic: 
Next Generation Wearable AR Display Technologies
Abstract / Description: 

Wearable AR/VR displays have a long history, and earlier efforts failed due to various limitations. Advances in sensors, optical technologies, and computing have renewed interest in this area. Most people are convinced AR will be very big; a key question is whether AR glasses can become the new computing platform and replace smartphones. I'll discuss some of the challenges ahead. We have been working on various wearable display architectures, and I'll discuss our efforts on MEMS scanned-beam displays, head-mounted projectors and smart telepresence screens, and holographic near-eye displays.

Date and Time: 
Wednesday, November 29, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Near-Eye Varifocal Augmented Reality Displays

Topic: 
Near-Eye Varifocal Augmented Reality Displays
Abstract / Description: 

With the goal of registering dynamic synthetic imagery onto the real world, Ivan Sutherland envisioned a fundamental idea: combining digital displays with conventional optical components in a wearable fashion. Since then, advancements in display engineering and a broader understanding in vision science have led us to computational displays for virtual reality and augmented reality applications. Today, such displays promise a more realistic and comfortable experience through techniques such as light field displays, holographic displays, always-in-focus displays, multiplane displays, and varifocal displays. In this talk, as an Nvidian, I will present our new optical layouts for see-through computational near-eye displays that are simple, compact, and varifocal, and that provide a wide field of view with clear peripheral vision and a large eyebox. Key to our efforts so far are novel see-through rear-projection holographic screens and deformable mirror membranes. We establish fundamental trade-offs among resolution, field of view, and the form factor of our designs, opening an intriguing avenue for future work on accommodation-supporting augmented reality displays.

Date and Time: 
Wednesday, November 15, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN & EE292E seminar: Interactive 3D Digital Humans

Topic: 
Interactive 3D Digital Humans
Abstract / Description: 

This talk will cover recent methods for recording and displaying interactive life-sized digital humans using the ICT Light Stage, natural language interfaces, and automultiscopic 3D displays. We will then discuss the first full application of this technology to preserve the experience of in-person interactions with Holocaust survivors.

More Information: http://gl.ict.usc.edu/Research/TimeOffsetConversations/


The SCIEN Colloquia are open to the public. The talks are also videotaped and posted the following week on talks.stanford.edu.

There will be a reception following the presentation.

Date and Time: 
Wednesday, November 8, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Mapping molecular orientation using polarized light microscopy

Topic: 
Mapping molecular orientation using polarized light microscopy
Abstract / Description: 

Polarization is a basic property of light, but the human eye is not sensitive to it. Therefore, we don't have an intuitive understanding of polarization or of the optical phenomena based on it. They either elude us, like the polarization of the blue sky or the rainbow, or they puzzle us, like the effect of Polaroid sunglasses. Meanwhile, polarized light plays an important role in nature and can be used to manipulate and analyze molecular order in materials, including living cells, tissues, and whole organisms, by observation with the polarized light microscope.

In this seminar, Rudolf Oldenbourg will first illustrate the nature of polarized light and its interaction with aligned materials using hands-on demonstrations. He will then introduce a modern version of the polarized light microscope, the LC-PolScope, created at the MBL. Enhanced by liquid crystal devices, electronic imaging, and digital image processing techniques, the LC-PolScope reveals and measures the orientation of molecules in every resolved specimen point at once. In recent years, his lab expanded the LC-PolScope technique to include the measurement of polarized fluorescence of GFP and other fluorescent molecules, and applied it to record the remarkable choreography of septin proteins during cell division, in cells ranging from yeast to mammalian cells.

Talon Chandler will then discuss extending polarized light techniques to multi-view microscopes, including light sheet and light field microscopes. In contrast to traditional, single-view microscopy, the recording of specimen images along two or more viewing directions allows us to unambiguously measure the three dimensional orientation of molecules and their aggregates. Chandler will discuss ongoing work on optimizing the design and reconstruction algorithms for multi-view polarized light microscopy.
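The orientation measurement described above can be illustrated with a toy fit: under varying polarizer settings, a birefringent sample's intensity modulates as I(θ) = a + b·cos(2(θ − φ)), and the molecular azimuth φ falls out of a linear least-squares fit. This is a simplified stand-in for the LC-PolScope's actual calibration and processing:

```python
import numpy as np

def fit_orientation(angles, intensities):
    """Recover the azimuth phi from intensities measured at several
    polarizer angles, assuming I(theta) = a + b*cos(2*(theta - phi)).
    Expanding the cosine gives a linear model in cos(2*theta), sin(2*theta)."""
    A = np.column_stack([np.ones_like(angles),
                         np.cos(2 * angles),
                         np.sin(2 * angles)])
    a, c, s = np.linalg.lstsq(A, intensities, rcond=None)[0]
    return 0.5 * np.arctan2(s, c)   # azimuth phi

# Noiseless synthetic sample with azimuth 0.7 rad, sampled at 6 angles
theta = np.linspace(0, np.pi, 6, endpoint=False)
phi_true = 0.7
I = 1.0 + 0.4 * np.cos(2 * (theta - phi_true))
print(round(fit_orientation(theta, I), 3))  # 0.7
```

The 2θ dependence is why orientation is only recoverable modulo 180 degrees from a single view, which motivates the multi-view extension Chandler describes.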



Date and Time: 
Wednesday, November 1, 2017 - 4:30pm
Venue: 
Packard 101
