SCIEN Talk

SCIEN and EE292E present "Fluorescence Guided Precision Surgery™ – Illuminating Tumors and Nerves"

Topic: 
Fluorescence Guided Precision Surgery™ – Illuminating Tumors and Nerves
Abstract / Description: 

Although treatment algorithms vary, surgery is the primary treatment modality for most solid cancers. In oncologic tumor resection, the preferred outcome is complete cancer removal, as residual tumor left behind is considered treatment failure. However, complete tumor removal must be balanced with functional preservation and minimizing patient morbidity, including the prevention of inadvertent nerve injury. The inability of surgeons to visually distinguish between tumor and normal tissue, including nerves, leads to residual cancer cells being left behind at the edges of resection, i.e. positive surgical margins (PSM). PSM rates can be as high as 20-40% in breast cancer lumpectomy, 21% for radical prostatectomy, and 13% for head and neck squamous cell carcinoma (HNSCC). Similarly, with white light reflectance alone, the current standard of care in operating rooms, nerve dysfunction following surgery has been reported at rates of roughly 2-40%, ranging from immediate post-operative to long-term dysfunction.

Molecular imaging with fluorescence provides enhanced visual definition between diseased and normal tissue and has been shown to decrease PSM in both animal models and patients. Molecular imaging with fluorescence can also provide enhanced visualization of important structures such as nerves, improving preservation and minimizing inadvertent injury. Our laboratory has extensive experience in the development of both nerve and tumor injectable markers for surgical visualization. In this presentation we will discuss the development of combined nerve and tumor markers to improve intraoperative visualization – a.k.a. Precision Surgery™.


The SCIEN Colloquia are now offered via Zoom. Please register by going to https://stanford.zoom.us/meeting/register/tJctd-utrT4rHtT5OO34glASg1vol-PCGuXR

Date and Time: 
Wednesday, July 8, 2020 - 4:30pm
Venue: 
Zoom

SCIEN and EE292E present "Next-generation technologies to enable high-performance, low-cost lidar"

Topic: 
Next-generation technologies to enable high-performance, low-cost lidar
Abstract / Description: 

Lidar systems are used in diverse applications, including autonomous driving and industrial automation. Despite the challenging requirements for these systems, most lidars on the market use legacy technologies such as scanning mechanisms, long-coherence-length or edge-emitting lasers, and avalanche photodiodes, and consequently offer limited performance and robustness at a relatively high price point. Other systems propose to use esoteric technologies that face an uphill struggle toward manufacturability. This talk will describe two complementary technologies that leverage years of progress in adjacent industries. Combined with novel signal processing algorithms, they deliver a high-resolution, long-range, low-cost solid-state flash lidar system, breaking the performance envelope and resulting in a camera-like system. We will discuss design trade-offs, the performance roadmap, and remaining challenges.

Please register by going to https://stanford.zoom.us/meeting/register/tJAtcO2gqz0iGN0p94z4JU_I9-EQuQ3lBi7N

The talks are also videotaped and posted the following week on talks.stanford.edu for Stanford students, staff, faculty and SCIEN Affiliate Member companies.

If you wish to receive announcements about future talks, please add yourself to the SCIEN distribution list.

Date and Time: 
Wednesday, July 1, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN presents "Codification Design in Compressive Imaging"

Topic: 
Codification Design in Compressive Imaging
Abstract / Description: 

Compressive imaging enables faster acquisitions by capturing coded projections of the scene. Codification elements used in compressive imaging systems include lithographic masks, gratings, and micro-polarizers, which sense spatial, spectral, and temporal data. Codification plays a key role in compressive imaging, as it determines the number of projections needed for correct reconstruction. In general, random coding patterns are sufficient for accurate reconstruction. Still, more recent studies have shown that code design not only yields improved image reconstructions but can also reduce the number of required projections. This talk covers different tools for codification design in compressive imaging, such as the restricted isometry property and geometric and deep learning approaches. Applications in compressive spectral video, compressive X-ray computed tomography, and seismic acquisition will also be discussed.
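
To make the coded-projection idea concrete, here is a minimal, self-contained sketch (not from the talk): a random ±1 code compresses a sparse signal into far fewer measurements than its length, and a plain iterative soft-thresholding (ISTA) loop recovers it. All sizes and parameter values below are illustrative assumptions.

import numpy as np

# Minimal compressive-sensing sketch (illustrative only): m random coded
# projections of an n-sample, k-sparse signal, recovered with plain ISTA.
rng = np.random.default_rng(0)
n, m, k = 256, 64, 8                                 # signal length, projections, sparsity

x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)   # random +/-1 coding pattern
y = Phi @ x_true                                     # m coded projections, m << n

# ISTA for min_x 0.5 * ||y - Phi x||^2 + lam * ||x||_1
lam = 0.01
step = 1.0 / np.linalg.norm(Phi, 2) ** 2             # 1 / Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):
    z = x - step * (Phi.T @ (Phi @ x - y))           # gradient step on the data term
    x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)   # soft-threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

Designed (rather than random) codes enter exactly here: a better-conditioned Phi lowers the number of projections m needed for the same reconstruction error.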

Date and Time: 
Wednesday, June 3, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN presents "Mojo Lens, the First True Smart Contact Lens"

Topic: 
Mojo Lens, the First True Smart Contact Lens
Abstract / Description: 

After working in stealth for over 4 years, Mojo Vision recently unveiled the first working prototype of Mojo Lens, a smart contact lens designed to deliver augmented reality content wherever you look. This talk will provide an overview of the company, its vision of "Invisible Computing", the science behind the world's first contact lens display, and a first-hand account of what it's like to actually wear Mojo Lens.

Date and Time: 
Wednesday, May 27, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN presents "Learned Image Synthesis for Computational Displays"

Topic: 
Learned Image Synthesis for Computational Displays
Abstract / Description: 

Addressing vergence-accommodation conflict in head-mounted displays (HMDs) requires resolving two interrelated problems. First, the hardware must support viewing sharp imagery over the full accommodation range of the user. Second, HMDs should accurately reproduce retinal defocus blur to correctly drive accommodation. A multitude of accommodation-supporting HMDs have been proposed, with three architectures receiving particular attention: varifocal, multifocal, and light field displays. These designs all extend depth of focus, but rely on computationally expensive rendering and optimization algorithms to reproduce accurate defocus blur (often limiting content complexity and interactive applications). To date, no unified framework has been proposed to support driving these emerging HMDs using commodity content. In this talk, we will present DeepFocus, a generic, end-to-end convolutional neural network designed to efficiently solve the full range of computational tasks for accommodation-supporting HMDs. This network is demonstrated to accurately synthesize defocus blur, focal stacks, multilayer decompositions, and multiview imagery using only commonly available RGB-D images, enabling real-time, near-correct depictions of retinal blur with a broad set of accommodation-supporting HMDs.
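
As a rough illustration of the underlying task (and emphatically not the DeepFocus network itself), the snippet below computes a per-pixel circle of confusion from an RGB-D input using the standard thin-lens formula and renders it with a cheap layered Gaussian blur; all optical parameters are made-up assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter

# Illustrative only: thin-lens circle of confusion per pixel from an RGB-D
# input, rendered with a layered Gaussian blur. Networks like DeepFocus
# learn a far more accurate (occlusion-aware) version of this mapping.
def synthesize_defocus(rgb, depth_m, focus_m=1.0, focal_mm=20.0,
                       aperture_mm=5.0, px_per_mm=150.0, n_layers=8):
    f = focal_mm / 1000.0                    # focal length (m)
    A = aperture_mm / 1000.0                 # aperture diameter (m)
    # Circle-of-confusion diameter (m) for each pixel's depth.
    coc_m = A * (f / (focus_m - f)) * np.abs(depth_m - focus_m) / depth_m
    coc_px = coc_m * 1000.0 * px_per_mm      # diameter in pixels on the sensor
    # Blur the image at a few representative CoC levels, then pick, per
    # pixel, the blurred layer whose level is closest to that pixel's CoC.
    levels = np.linspace(0.0, float(coc_px.max()), n_layers)
    stack = np.stack([gaussian_filter(rgb, sigma=(s / 2.0, s / 2.0, 0.0))
                      for s in levels])      # (n_layers, H, W, 3)
    idx = np.abs(coc_px[..., None] - levels).argmin(axis=-1)    # (H, W)
    return np.take_along_axis(stack, idx[None, ..., None], axis=0)[0]

A production renderer must also handle occlusion boundaries and partial visibility, which is where a learned approach pays off.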

Date and Time: 
Wednesday, May 20, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN presents "Practical 2D to 3D Image Conversion"

Topic: 
Practical 2D to 3D Image Conversion
Abstract / Description: 

We will discuss techniques for converting 2D images to 3D meshes on mobile devices. This includes methods to efficiently compute both dense and sparse depth maps, converting depth maps into 3D meshes, mesh inpainting, and post-processing. We focus on different CNN designs to solve each step in the processing pipeline and examine common failure modes. Finally, we will look at practical deployment of image processing algorithms and CNNs on mobile apps, and how Lucid uses cloud processing to balance processing power with latency.
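
As a minimal sketch of the geometric step in such a pipeline (a simplification, not Lucid's implementation), the snippet below back-projects a depth map into a 3D point cloud using assumed pinhole intrinsics and connects neighboring pixels into a triangle mesh; depth estimation, inpainting, and post-processing are omitted.

import numpy as np

# Illustrative only: depth map -> 3D triangle mesh via pinhole back-projection.
def depth_to_mesh(depth, fx, fy, cx, cy):
    """depth: (H, W) array of metric depths; fx, fy, cx, cy: pinhole intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx                # back-project each pixel
    y = (v - cy) * depth / fy
    vertices = np.stack([x, y, depth], axis=-1).reshape(-1, 3)

    # Split every pixel quad of the regular grid into two triangles.
    idx = np.arange(h * w).reshape(h, w)
    tl, tr = idx[:-1, :-1].ravel(), idx[:-1, 1:].ravel()
    bl, br = idx[1:, :-1].ravel(), idx[1:, 1:].ravel()
    faces = np.concatenate([np.stack([tl, bl, tr], axis=1),
                            np.stack([tr, bl, br], axis=1)])
    return vertices, faces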

Date and Time: 
Wednesday, May 13, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN Seminar presents "Bio-inspired depth sensing using computational optics"

Topic: 
Bio-inspired depth sensing using computational optics
Abstract / Description: 

Jumping spiders rely on accurate depth perception for predation and navigation. They accomplish depth perception, despite their tiny brains, by using specialized optics. Each principal eye includes a multitiered retina that simultaneously receives multiple images with different amounts of defocus, and distance is decoded from these images with seemingly little computation. In this talk, I will introduce two depth sensors that are inspired by jumping spiders. They use computational optics and build upon previous depth-from-defocus algorithms in computer vision. Both sensors operate without active illumination, and they are both monocular and computationally efficient.
The first sensor synchronizes an oscillating deformable lens with a photosensor. It produces depth and confidence maps at more than 100 frames per second and has the advantage of being able to extend its working range through optical accommodation. The second sensor uses a custom-designed metalens, which is an ultra-thin device with 2D nano-structures that modulate traversing light. The metalens splits incoming light and simultaneously forms two differently-defocused images on a planar photosensor, allowing the efficient computation of depth and confidence from a single snapshot in time.
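
For intuition only (a textbook thin-lens sketch, not the algorithm of either sensor), the code below shows how two differently defocused measurements of the same point determine its distance; every number here is made up.

import numpy as np

# Illustrative thin-lens model of depth from defocus: two images formed at
# slightly different (effective) sensor distances yield two blur diameters,
# and the pair pins down the object distance.
def blur_diameter(d, f, A, s):
    """Blur-circle diameter for an object at distance d, focal length f,
    aperture diameter A, sensor at distance s (all in meters)."""
    i = f * d / (d - f)                      # thin-lens image distance
    return A * abs(s / i - 1.0)

def depth_from_two_blurs(c1, c2, f, A, s1, s2, d_grid):
    """Brute-force the object distance whose predicted blur pair best
    matches the measured pair (c1, c2)."""
    err = [(blur_diameter(d, f, A, s1) - c1) ** 2 +
           (blur_diameter(d, f, A, s2) - c2) ** 2 for d in d_grid]
    return d_grid[int(np.argmin(err))]

# Made-up example: 10 mm lens, 4 mm aperture, object at 0.8 m.
f, A, s1, s2 = 0.010, 0.004, 0.0102, 0.0105
d_true = 0.8
c1, c2 = blur_diameter(d_true, f, A, s1), blur_diameter(d_true, f, A, s2)
d_grid = np.linspace(0.1, 5.0, 5000)
print(depth_from_two_blurs(c1, c2, f, A, s1, s2, d_grid))   # ~0.8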

Date and Time: 
Wednesday, May 6, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN Seminar presents "Insight into the inner workings of Intel’s Stereo and Lidar Depth Cameras"

Topic: 
Insight into the inner workings of Intel’s Stereo and Lidar Depth Cameras
Abstract / Description: 

This talk will provide an overview of the technology and capabilities of Intel's RealSense Stereo and Lidar Depth Cameras, and will then progress to describe new features, such as high-speed capture, multi-camera enhancements, optical filtering, and near-range high-resolution depth imaging. Finally, we will introduce a new fast on-chip calibration method that can be used to improve the performance of a stereo camera and help mitigate some common stereo artifacts.
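
As background for the calibration discussion, the snippet below is the basic triangulation relation any calibrated stereo depth camera relies on, with made-up numbers rather than RealSense internals; it shows how a small disparity error grows into a large range error at distance, which is what recalibration mitigates.

# Illustrative only (not RealSense internals): depth from a calibrated
# stereo pair is Z = focal_px * baseline / disparity_px.
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    if disparity_px <= 0:
        return float("inf")                  # no match / point at infinity
    return focal_px * baseline_m / disparity_px

# Made-up example: f = 640 px, baseline = 50 mm. A point at 4 m produces
# 8 px of disparity; a 1.3 px calibration error shifts the estimate by
# over half a meter, which is why on-chip recalibration matters at range.
f_px, baseline = 640.0, 0.050
d_px = f_px * baseline / 4.0                 # 8 px
print(disparity_to_depth(d_px, f_px, baseline))         # 4.00 m
print(disparity_to_depth(d_px + 1.3, f_px, baseline))   # ~3.44 m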

Date and Time: 
Wednesday, April 29, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN Seminar presents "The Extreme Science of Building High-Performing Compact Cameras for Space Applications"

Topic: 
The Extreme Science of Building High-Performing Compact Cameras for Space Applications
Abstract / Description: 

A thickening flock of earth-observing satellites blankets the planet. Over 700 were launched during the past 10 years, and more than 2,200 additional ones are scheduled to go up within the next 10 years. On top of that, satellite platforms and instruments are being miniaturized year on year to improve cost-efficiency, with the same expectations of high spatial and spectral resolution. But what does it take to build imaging systems that are high-performing in the harsh environment of space, yet cost-efficient and compact at the same time? This talk will touch upon the technical issues associated with the design, fabrication, and characterisation of such extremely high-performing but still compact and cost-efficient space cameras, taking as an example the imager that Pixxel has built as part of its planned earth-imaging satellite constellation.

Date and Time: 
Wednesday, April 22, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN Seminar presents "The Role of Fundamental Limits in 3D Imaging Systems: From Looking around Corners to Fast 3D Cameras"

Topic: 
The Role of Fundamental Limits in 3D Imaging Systems: From Looking around Corners to Fast 3D Cameras
Abstract / Description: 

Knowledge of limits is a precious commodity in computational imaging: by knowing that our imaging device already operates at the physical limit (e.g. of resolution), we can avoid unnecessary investments in better hardware, such as faster detectors, better optics, or cameras with higher pixel resolution. Moreover, limits often appear as uncertainty products, making it possible to bargain with nature for a better measurement by sacrificing less important information.

In this talk, the role of physical and information limits in computational imaging will be discussed using examples from two of my recent projects: 'Synthetic Wavelength Holography' and the 'Single-Shot 3D Movie Camera'.

Synthetic Wavelength Holography is a novel method for imaging hidden objects around corners and through scattering media. While other approaches rely on time-of-flight detectors, which suffer from technical limitations in spatial and temporal resolution, Synthetic Wavelength Holography works at the physical limit of the space-bandwidth product. Full-field measurements of hidden objects around corners or through scatterers, reaching sub-mm resolution, will be presented.
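
For a sense of scale, the standard two-wavelength interferometry relation (a general formula, not a detail taken from the talk) shows how two nearby optical wavelengths act like a single, much longer synthetic wavelength; the wavelength values below are made up.

# Standard two-wavelength interferometry relation (values are made up, not
# from the talk): beating two nearby wavelengths together is equivalent to
# measuring at the much longer synthetic wavelength
#   Lambda = lam1 * lam2 / |lam1 - lam2|
lam1, lam2 = 854.0e-9, 854.1e-9              # optical wavelengths (m)
synthetic = lam1 * lam2 / abs(lam1 - lam2)
print(f"synthetic wavelength = {synthetic * 1e3:.1f} mm")   # ~7.3 mm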

The Single-Shot 3D Movie Camera is a highly precise 3D sensor for the measurement of fast macroscopic live scenes. From each 1 Mpix camera frame, the sensor delivers 300,000 independent 3D points with high resolution. The single-shot ability allows for continuous 3D measurement of fast-moving or deforming objects, resulting in a continuous 3D movie. Like a hologram, each movie frame encompasses the full 3D information about the object surface, and the observation perspective can be varied while watching the 3D movie.

Date and Time: 
Wednesday, April 15, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)
