EE Student Information

The Department of Electrical Engineering supports Black Lives Matter.


EE Student Information, Spring Quarter through Academic Year 2020-2021: FAQs and Updated EE Course List.

Updates will be posted on this page, as well as emailed to the EE student mail list.

Please see Stanford University Health Alerts for course and travel updates.

As always, use your best judgment and consider your own and others' well-being at all times.

SCIEN Talk

OSA Color Technical Group presents "Modeling the Initial Steps of Human Vision"

Topic: 
Modeling the Initial Steps of Human Vision
Abstract / Description: 

Vision guides thought and action. To do so usefully it must inform us about critical features of the world around us. What we can learn about the world is limited by the initial stages of visual processing. Physicists, biologists and psychologists have created quantitative models of these stages, and these models enable us to quantify the encoded information. We have integrated these models as image computable software: the Image Systems Engineering Toolbox for Biology (ISETBio). The software is an extensible set of open-source modules that model the three-dimensional scene spectral radiance, retinal image formation (physiological optics), spatial sampling by the cone photoreceptor mosaic, fixational eye movements, and phototransduction. This webinar, hosted by the OSA Color Technical Group, will provide an overview of the ISETBio modules as well as examples of how to use the software to understand and model human visual performance.

Hosted By: Color Technical Group

Date and Time: 
Tuesday, July 21, 2020 - 9:00am

SCIEN and EE292E present "ThinVR: A VR display approach providing wide FOV in a compact form factor"

Topic: 
ThinVR: A VR display approach providing wide FOV in a compact form factor
Abstract / Description: 

This talk describes ThinVR: an approach to building near-eye VR displays that simultaneously provide a very wide (180-degree horizontal) FOV and a compact form factor. The key is to replace traditional large optics with curved microlens arrays and to place the optics in front of curved displays. We had to design custom heterogeneous optics to make this approach work, because many lenslets are viewed off the central axis. To ensure an adequate eyebox and to minimize pupil-swim distortions, we designed and built a custom optimizer to produce an acceptable heterogeneous lenslet array. We demonstrate that this approach works through prototypes with both static and dynamic displays. To our knowledge, this is the first work to both convincingly demonstrate and analyze the potential for curved, heterogeneous microlens arrays to enable compact, wide-FOV near-eye VR displays.

Date and Time: 
Wednesday, July 15, 2020 - 4:30pm
Venue: 
Zoom - register for link + password

SCIEN and EE292E present "Fluorescence Guided Precision Surgery™ – Illuminating Tumors and Nerves"

Topic: 
Fluorescence Guided Precision Surgery™ – Illuminating Tumors and Nerves
Abstract / Description: 

Although treatment algorithms vary, surgery is the primary treatment modality for most solid cancers. In oncologic tumor resection, the preferred outcome is complete cancer removal, as residual tumor left behind is considered treatment failure. However, complete tumor removal must be balanced with functional preservation and minimizing patient morbidity, including prevention of inadvertent nerve injury. The inability of surgeons to visually distinguish between tumor and normal tissue, including nerves, leads to residual cancer cells being left behind at the edges of resection, i.e. positive surgical margins (PSM). PSM rates can be as high as 20-40% in breast cancer lumpectomy, 21% for radical prostatectomy, and 13% for HNSCC. Similarly, using white-light reflectance alone, which is the current standard of care in operating rooms, nerve dysfunction following surgery has been reported to be as high as ~2-40%, ranging from immediate post-op to long-term dysfunction.

Molecular imaging with fluorescence provides enhanced visual definition between diseased and normal tissue and has been shown to decrease PSM in both animal models and patients. It can also provide enhanced visualization of important structures such as nerves to improve preservation and minimize inadvertent injury. Our laboratory has extensive experience in the development of both nerve and tumor injectable markers for surgical visualization. In this presentation we will discuss the development of nerve and tumor marker combinations to improve intraoperative visualization – aka Precision Surgery™.


The SCIEN Colloquia are now offered via Zoom. Please register by going to https://stanford.zoom.us/meeting/register/tJctd-utrT4rHtT5OO34glASg1vol-PCGuXR

Date and Time: 
Wednesday, July 8, 2020 - 4:30pm
Venue: 
Zoom

SCIEN and EE292E present "Next-generation technologies to enable high-performance, low-cost lidar"

Topic: 
Next-generation technologies to enable high-performance, low-cost lidar
Abstract / Description: 

Lidar systems are used in diverse applications, including autonomous driving and industrial automation. Despite the challenging requirements for these systems, most lidars on the market utilize legacy technologies such as scanning mechanisms, long-coherence-length or edge-emitting lasers, and avalanche photodiodes, and consequently offer limited performance and robustness at a relatively high price point. Other systems propose to utilize esoteric technologies that face an uphill struggle toward manufacturability. This talk will describe two complementary technologies that leverage years of progress in adjacent industries. When combined with novel signal processing algorithms, these deliver a high-resolution, long-range, low-cost solid-state flash lidar system, breaking the performance envelope and resulting in a camera-like system. We will discuss design trade-offs, the performance roadmap, and remaining challenges.

Please register by going to https://stanford.zoom.us/meeting/register/tJAtcO2gqz0iGN0p94z4JU_I9-EQuQ3lBi7N

The talks are also recorded and posted the following week on talks.stanford.edu for Stanford students, staff, faculty, and SCIEN Affiliate Member companies.

If you wish to receive announcements about future talks, please add yourself to the SCIEN distribution list.

Date and Time: 
Wednesday, July 1, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN presents "Codification Design in Compressive Imaging"

Topic: 
Codification Design in Compressive Imaging
Abstract / Description: 

Compressive imaging enables faster acquisitions by capturing coded projections of the scene. Codification elements used in compressive imaging systems include lithographic masks, gratings, and micro-polarizers, which sense spatial, spectral, and temporal data. Codification plays a key role in compressive imaging, as it determines the number of projections needed for correct reconstruction. In general, random coding patterns are sufficient for accurate reconstruction. Still, more recent studies have shown that code design not only yields improved image reconstructions, but can also reduce the number of required projections. This talk covers different tools for codification design in compressive imaging, such as the restricted isometry property and geometric and deep learning approaches. Applications in compressive spectral video, compressive X-ray computed tomography, and seismic acquisition will also be discussed.
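As a toy illustration of coded projections and sparse recovery (a sketch, not code from the talk), the example below senses a synthetic sparse signal with a random Gaussian code and reconstructs it with ISTA, a basic iterative soft-thresholding solver; the signal sizes, sparsity level, and regularization weight are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 32, 3          # signal length, measurements, sparsity (assumed)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random coding pattern
y = Phi @ x                                      # coded projections (m < n)

# ISTA: solve min_z ||y - Phi z||^2 / 2 + lam * ||z||_1 by gradient steps
# followed by soft-thresholding.
lam = 0.01
step = 1.0 / np.linalg.norm(Phi, 2) ** 2         # 1 / largest singular value^2
z = np.zeros(n)
for _ in range(2000):
    z = z - step * (Phi.T @ (Phi @ z - y))       # gradient step on the data term
    z = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # shrinkage

rel_err = np.linalg.norm(z - x) / np.linalg.norm(x)
print(round(rel_err, 3))     # small relative error: recovery from m < n samples
```

The point of the sketch is the one the abstract makes: a random code already enables recovery from fewer measurements than unknowns; designed codes can tighten this further.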

Date and Time: 
Wednesday, June 3, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN presents "Mojo Lens, the First True Smart Contact Lens"

Topic: 
Mojo Lens, the First True Smart Contact Lens
Abstract / Description: 

After working in stealth for over 4 years, Mojo Vision recently unveiled the first working prototype of Mojo Lens, a smart contact lens designed to deliver augmented reality content wherever you look. This talk will provide an overview of the company, its vision of "Invisible Computing", the science behind the world's first contact lens display, and a first-hand account of what it's like to actually wear Mojo Lens.

Date and Time: 
Wednesday, May 27, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN presents "Learned Image Synthesis for Computational Displays"

Topic: 
Learned Image Synthesis for Computational Displays
Abstract / Description: 

Addressing vergence-accommodation conflict in head-mounted displays (HMDs) requires resolving two interrelated problems. First, the hardware must support viewing sharp imagery over the full accommodation range of the user. Second, HMDs should accurately reproduce retinal defocus blur to correctly drive accommodation. A multitude of accommodation-supporting HMDs have been proposed, with three architectures receiving particular attention: varifocal, multifocal, and light field displays. These designs all extend depth of focus, but rely on computationally expensive rendering and optimization algorithms to reproduce accurate defocus blur (often limiting content complexity and interactive applications). To date, no unified framework has been proposed to support driving these emerging HMDs using commodity content. In this talk, we will present DeepFocus, a generic, end-to-end convolutional neural network designed to efficiently solve the full range of computational tasks for accommodation-supporting HMDs. This network is demonstrated to accurately synthesize defocus blur, focal stacks, multilayer decompositions, and multiview imagery using only commonly available RGB-D images, enabling real-time, near-correct depictions of retinal blur with a broad set of accommodation-supporting HMDs.

Date and Time: 
Wednesday, May 20, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN presents Practical 2D to 3D Image Conversion

Topic: 
Practical 2D to 3D Image Conversion
Abstract / Description: 

We will discuss techniques for converting 2D images to 3D meshes on mobile devices, including methods to efficiently compute dense and sparse depth maps, convert depth maps into 3D meshes, perform mesh inpainting, and post-process the results. We focus on different CNN designs that solve each step in the processing pipeline and examine common failure modes. Finally, we will look at the practical deployment of image processing algorithms and CNNs in mobile apps, and how Lucid uses cloud processing to balance processing power with latency.
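One of the pipeline steps above, depth-map-to-mesh conversion, can be sketched in a few lines: back-project each pixel through pinhole intrinsics and triangulate the pixel grid into two triangles per 2x2 quad. The intrinsics here are illustrative assumptions, not parameters of Lucid's pipeline:

```python
import numpy as np

def depth_to_mesh(depth, fx, fy, cx, cy):
    """Convert a dense depth map to a triangle mesh (one vertex per pixel)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Back-project pixels to camera-space 3D points via the pinhole model.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    verts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    # Two triangles per quad of neighbouring pixels.
    idx = np.arange(h * w).reshape(h, w)
    tl, tr = idx[:-1, :-1].ravel(), idx[:-1, 1:].ravel()
    bl, br = idx[1:, :-1].ravel(), idx[1:, 1:].ravel()
    faces = np.concatenate([np.stack([tl, bl, tr], 1),
                            np.stack([tr, bl, br], 1)])
    return verts, faces

depth = np.ones((4, 4))                 # toy depth map: flat plane at 1 m
verts, faces = depth_to_mesh(depth, 2.0, 2.0, 1.5, 1.5)
print(verts.shape, faces.shape)         # (16, 3) (18, 3)
```

A production pipeline would additionally drop or split triangles that span depth discontinuities, which is where the inpainting and post-processing steps mentioned above come in.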

Date and Time: 
Wednesday, May 13, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN Seminar presents "Bio-inspired depth sensing using computational optics"

Topic: 
Bio-inspired depth sensing using computational optics
Abstract / Description: 

Jumping spiders rely on accurate depth perception for predation and navigation. They accomplish depth perception, despite their tiny brains, by using specialized optics. Each principal eye includes a multitiered retina that simultaneously receives multiple images with different amounts of defocus, and distance is decoded from these images with seemingly little computation. In this talk, I will introduce two depth sensors that are inspired by jumping spiders. They use computational optics and build upon previous depth-from-defocus algorithms in computer vision. Both sensors operate without active illumination, and they are both monocular and computationally efficient.
The first sensor synchronizes an oscillating deformable lens with a photosensor. It produces depth and confidence maps at more than 100 frames per second and has the advantage of being able to extend its working range through optical accommodation. The second sensor uses a custom-designed metalens, which is an ultra-thin device with 2D nano-structures that modulate traversing light. The metalens splits incoming light and simultaneously forms two differently-defocused images on a planar photosensor, allowing the efficient computation of depth and confidence from a single snapshot in time.
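The decoding idea, recovering distance from two differently defocused measurements, can be sketched with a simple thin-lens model: each candidate depth predicts a pair of blur radii, and the observed pair is matched against that table. The focal length, aperture, and focus distances below are illustrative assumptions, not parameters of either sensor:

```python
import numpy as np

def blur_radius(d, f, A, s):
    """Thin-lens blur circle radius at the sensor for a point at depth d,
    with focal length f, aperture diameter A, sensor distance s (metres)."""
    return (A / 2.0) * s * np.abs(1.0 / f - 1.0 / d - 1.0 / s)

f, A = 0.05, 0.02                      # 50 mm lens, 20 mm aperture (assumed)
s1 = 1.0 / (1.0 / f - 1.0 / 1.0)       # sensor plane focused at 1.0 m
s2 = 1.0 / (1.0 / f - 1.0 / 2.0)       # sensor plane focused at 2.0 m

depths = np.linspace(0.6, 3.0, 2000)   # candidate depths (lookup table)
r1 = blur_radius(depths, f, A, s1)
r2 = blur_radius(depths, f, A, s2)

def decode_depth(obs1, obs2):
    # Pick the depth whose predicted blur pair best matches the observations;
    # the pair disambiguates near/far, which a single blur radius cannot.
    return depths[np.argmin((r1 - obs1) ** 2 + (r2 - obs2) ** 2)]

true_d = 1.5
est = decode_depth(blur_radius(true_d, f, A, s1),
                   blur_radius(true_d, f, A, s2))
print(round(est, 2))                   # 1.5
```

In the actual sensors the two defocused images come from the oscillating lens or the metalens split, and blur is estimated locally per pixel, but the decoding step is similarly lightweight.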

Date and Time: 
Wednesday, May 6, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)

SCIEN Seminar presents "Insight into the inner workings of Intel’s Stereo and Lidar Depth Cameras"

Topic: 
Insight into the inner workings of Intel’s Stereo and Lidar Depth Cameras
Abstract / Description: 

This talk will provide an overview of the technology and capabilities of Intel's RealSense Stereo and Lidar Depth Cameras, and will then progress to describe new features, such as high-speed capture, multi-camera enhancements, optical filtering, and near-range high-resolution depth imaging. Finally, we will introduce a new fast on-chip calibration method that can be used to improve the performance of a stereo camera and help mitigate some common stereo artifacts.
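For context, the core relation that stereo calibration ultimately serves is the triangulation formula Z = f·B/d. A minimal sketch with illustrative numbers (assumptions, not RealSense specifications):

```python
# Stereo triangulation: depth from disparity, Z = f * B / d.
# f_px: focal length in pixels, baseline: metres, d_px: disparity in pixels.
f_px = 640.0      # focal length in pixels (illustrative)
baseline = 0.05   # 50 mm baseline (illustrative)

def depth_from_disparity(d_px):
    if d_px <= 0:
        return float("inf")    # zero disparity: no match / point at infinity
    return f_px * baseline / d_px

print(depth_from_disparity(32.0))   # 1.0 (metres)
```

The formula also shows why calibration matters: errors in f or the baseline scale directly into depth error, and because Z varies as 1/d, small disparity errors grow quadratically with range.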

Date and Time: 
Wednesday, April 29, 2020 - 4:30pm
Venue: 
Zoom (join SCIEN mail list to receive meeting ID)
