SCIEN Talk

SCIEN Talk: Next Generation Wearable AR Display Technologies

Topic: 
Next Generation Wearable AR Display Technologies
Abstract / Description: 

Wearable AR/VR displays have a long history, and earlier efforts failed due to various limitations. Advances in sensors, optical technologies, and computing have renewed interest in this area. Most people are convinced AR will be very big. A key question is whether AR glasses can become the new computing platform and replace smartphones. I'll discuss some of the challenges ahead. We have been working on various wearable display architectures, and I'll discuss our efforts related to MEMS scanned-beam displays, head-mounted projectors and smart telepresence screens, and holographic near-eye displays.

Date and Time: 
Wednesday, November 29, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Near-Eye Varifocal Augmented Reality Displays

Topic: 
Near-Eye Varifocal Augmented Reality Displays
Abstract / Description: 

With the goal of registering dynamic synthetic imagery onto the real world, Ivan Sutherland envisioned a fundamental idea: combining digital displays with conventional optical components in a wearable fashion. Since then, advances in the display engineering domain and a broader understanding in the vision science domain have led us to computational displays for virtual reality and augmented reality applications. Today, such displays promise a more realistic and comfortable experience through techniques such as light field displays, holographic displays, always-in-focus displays, multiplane displays, and varifocal displays. In this talk, as an Nvidian, I will present our new optical layouts for see-through computational near-eye displays that are simple, compact, and varifocal, and that provide a wide field of view with clear peripheral vision and a large eyebox. Key to our efforts so far are novel see-through rear-projection holographic screens and deformable mirror membranes. We establish fundamental trade-offs among the quantitative parameters of resolution, field of view, and the form factor of our designs, opening an intriguing avenue for future work on accommodation-supporting augmented reality displays.
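As a rough illustration of the varifocal principle discussed above (a sketch under textbook assumptions, not the talk's method): a deformable membrane or other varifocal element changes its focal power so the virtual image of the display panel lands at the desired depth, which follows from the Gaussian thin-lens equation. All names and numbers below are hypothetical.

```python
def required_focal_length(display_dist_m, image_dist_m):
    """Focal length (m) that images a panel at display_dist_m into a
    virtual image at image_dist_m, via the Gaussian thin-lens equation
    1/f = 1/s_o + 1/s_i with a virtual image (s_i = -image_dist_m)."""
    return 1.0 / (1.0 / display_dist_m - 1.0 / image_dist_m)

def virtual_image_distance(focal_m, display_dist_m):
    """Inverse mapping: where the virtual image lands for a given focal length."""
    return 1.0 / (1.0 / display_dist_m - 1.0 / focal_m)
```

Sweeping `image_dist_m` frame by frame (e.g., to follow the viewer's gaze depth) is what makes such a display varifocal rather than fixed-focus.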

Date and Time: 
Wednesday, November 15, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN & EE292E seminar: Interactive 3D Digital Humans

Topic: 
Interactive 3D Digital Humans
Abstract / Description: 

This talk will cover recent methods for recording and displaying interactive life-sized digital humans using the ICT Light Stage, natural language interfaces, and automultiscopic 3D displays. We will then discuss the first full application of this technology to preserve the experience of in-person interactions with Holocaust survivors.

More Information: http://gl.ict.usc.edu/Research/TimeOffsetConversations/


The SCIEN Colloquia are open to the public. The talks are also videotaped and posted the following week on talks.stanford.edu.

There will be a reception following the presentation.

Date and Time: 
Wednesday, November 8, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Mapping molecular orientation using polarized light microscopy

Topic: 
Mapping molecular orientation using polarized light microscopy
Abstract / Description: 

Polarization is a basic property of light, but the human eye is not sensitive to it. Therefore, we don't have an intuitive understanding of polarization and of optical phenomena that are based on it. They either elude us, like the polarization of the blue sky or the rainbow, or they puzzle us, like the effect of Polaroid sunglasses. Meanwhile, polarized light plays an important role in nature and can be used to manipulate and analyze molecular order in materials, including living cells, tissues, and whole organisms, by observation with the polarized light microscope.

In this seminar, Rudolf Oldenbourg will first illustrate the nature of polarized light and its interaction with aligned materials using hands-on demonstrations. He will then introduce a modern version of the polarized light microscope, the LC-PolScope, created at the MBL. Enhanced by liquid crystal devices, electronic imaging, and digital image processing techniques, the LC-PolScope reveals and measures the orientation of molecules in every resolved specimen point at once. In recent years, his lab expanded the LC-PolScope technique to include the measurement of polarized fluorescence of GFP and other fluorescent molecules, and applied it to record the remarkable choreography of septin proteins during cell division in organisms ranging from yeast to mammalian cells.
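The LC-PolScope's actual acquisition algorithm is more involved, but the underlying idea, recovering per-pixel polarization orientation from a few intensity images, can be illustrated with the classic four-frame linear-analyzer scheme below. The function names and the 0/45/90/135-degree protocol are illustrative assumptions, not the instrument's design.

```python
import numpy as np

def stokes_from_analyzer(I0, I45, I90, I135):
    """Linear Stokes parameters S0, S1, S2 from four intensity images
    taken through a linear analyzer at 0, 45, 90, and 135 degrees."""
    S0 = 0.5 * (I0 + I45 + I90 + I135)
    S1 = I0 - I90
    S2 = I45 - I135
    return S0, S1, S2

def polarization_from_stokes(S0, S1, S2):
    """Per-pixel polarization azimuth (radians, in [0, pi)) and degree
    of linear polarization."""
    azimuth = (0.5 * np.arctan2(S2, S1)) % np.pi
    dolp = np.sqrt(S1**2 + S2**2) / np.maximum(S0, 1e-12)
    return azimuth, dolp
```

For fully linearly polarized light at angle θ, Malus's law gives the four intensities as cos²(θ − a) for analyzer angle a, and the recovered azimuth is θ with a degree of linear polarization of 1.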

Talon Chandler will then discuss extending polarized light techniques to multi-view microscopes, including light sheet and light field microscopes. In contrast to traditional single-view microscopy, recording specimen images along two or more viewing directions allows us to unambiguously measure the three-dimensional orientation of molecules and their aggregates. Chandler will discuss ongoing work on optimizing the design and reconstruction algorithms for multi-view polarized light microscopy.



Date and Time: 
Wednesday, November 1, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN colloquium: Light Field Retargeting for Integral and Multi-panel Displays

Topic: 
Light Field Retargeting for Integral and Multi-panel Displays
Abstract / Description: 

A light field is a collection of rays emanating from a 3D scene in various directions that, when properly captured, provides a means of projecting depth and parallax cues on 3D displays. However, due to the limited aperture size and the constrained spatial-angular sampling of many light field capture systems (e.g., plenoptic cameras), the displayed light fields provide only a narrow viewing zone in which parallax views can be supported. In addition, the autostereoscopic display device may have a spatio-angular resolution (e.g., an integral display) or an architecture (e.g., a multi-panel display) that does not match the capturing plenoptic system, which requires careful engineering between the capture and display stages. This talk presents an efficient light field retargeting pipeline for integral and multi-panel displays that provides controllable, enhanced parallax content. This is accomplished by slicing the captured light fields according to their depth content, boosting the parallax, and merging these slices with data filling. For integral displays, the synthesized views are simply resampled and reordered to create elemental images that, beneath a lenslet array, collectively create a multi-view rendering. For multi-panel displays, additional processing steps are needed to achieve seamless transitions over different depth panels and viewing angles, where displayed views are synthesized and aligned dynamically according to the position of the viewer. The retargeting technique is simulated and verified experimentally on actual integral and multi-panel displays.
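The resample-and-reorder step for integral displays amounts to interleaving the synthesized views beneath the lenslets. A minimal numpy sketch, assuming a hypothetical 4-D view stack and ignoring the per-lenslet image flip that real lenslet optics may require:

```python
import numpy as np

def views_to_elemental_images(views):
    """views: array of shape (Vy, Vx, H, W) -- a grid of Vy*Vx synthesized
    views, each H x W pixels. Returns an (H*Vy, W*Vx) panel image in which
    the Vy x Vx block beneath lenslet (i, j) holds pixel (i, j) of every view."""
    vy, vx, h, w = views.shape
    # Move the view axes inside the spatial axes, then flatten:
    # out[i*Vy + u, j*Vx + v] = views[u, v, i, j]
    return views.transpose(2, 0, 3, 1).reshape(h * vy, w * vx)
```

Subsampling the result with a stride equal to the view count recovers each original view, which is a quick sanity check that the reordering is correct.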

Date and Time: 
Wednesday, October 25, 2017 - 4:30pm
Venue: 
Packard 101

Holographic Near-Eye Displays for Virtual and Augmented Reality

Topic: 
Holographic Near-Eye Displays for Virtual and Augmented Reality
Abstract / Description: 

Today's near-eye displays are a compromise of field of view, form factor, resolution, supported depth cues, and other factors. There is no clear path to obtain eyeglasses-like displays that reproduce the full fidelity of human vision. Computational displays are a potential solution in which hardware complexity is traded for software complexity, where it is easier to meet many conflicting optical constraints. Among computational displays, digital holography is a particularly attractive solution that may scale to meet all the optical demands of an ideal near-eye display. I will present novel designs for virtual and augmented reality near-eye displays based on phase-only holographic projection. The approach is built on the principles of Fresnel holography and double phase amplitude encoding, with additional hardware, phase correction factors, and spatial light modulator encodings to achieve full-color, high-contrast, and low-noise holograms with high resolution and true per-pixel focal control. A unified focus, aberration correction, and vision correction model, along with a user calibration process, accounts for any optical defects between the light source and retina. This optical correction ability not only fixes minor aberrations but also enables truly compact, eyeglasses-like displays with wide fields of view (80 degrees) that would be inaccessible through conventional means. All functionality is evaluated across a series of proof-of-concept hardware prototypes; I will discuss remaining challenges to incorporate all features into a single device and obtain practical displays.
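Double phase amplitude encoding rests on a simple identity: any complex value A·e^{iφ} with A ≤ 1 equals the average of two unit-amplitude phasors, e^{i(φ ± arccos A)}, so a phase-only SLM can represent a full complex field. A minimal numpy sketch; the checkerboard interleaving used here is one common choice, assumed for illustration rather than taken from the talk:

```python
import numpy as np

def double_phase_encode(field):
    """Encode a complex field on a phase-only SLM via double phase encoding.
    field: complex array with |field| <= 1 (normalize beforehand)."""
    amp = np.clip(np.abs(field), 0.0, 1.0)
    phs = np.angle(field)
    offset = np.arccos(amp)  # 0.5*(e^{i(phs+off)} + e^{i(phs-off)}) = amp*e^{i*phs}
    phase_a = phs + offset
    phase_b = phs - offset
    # Interleave the two phase maps in a checkerboard so that neighboring
    # SLM pixels, once low-pass filtered by the optics, average to the
    # target complex amplitude.
    yy, xx = np.indices(field.shape)
    checker = (yy + xx) % 2 == 0
    return np.where(checker, phase_a, phase_b)
```

Averaging any two adjacent checkerboard samples reproduces the local complex amplitude, which is why the optical low-pass filter recovers the intended hologram.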

Date and Time: 
Wednesday, October 18, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Focal Surface Displays

Topic: 
Focal Surface Displays
Abstract / Description: 

Conventional binocular head-mounted displays (HMDs) vary the stimulus to vergence with the information in the picture, while the stimulus to accommodation remains fixed at the apparent distance of the display, as created by the viewing optics. Sustained vergence-accommodation conflict (VAC) has been associated with visual discomfort, motivating numerous proposals for delivering near-correct accommodation cues. We introduce focal surface displays to meet this challenge, augmenting conventional HMDs with a phase-only spatial light modulator (SLM) placed between the display screen and viewing optics. This SLM acts as a dynamic freeform lens, shaping synthesized focal surfaces to conform to the virtual scene geometry. We introduce a framework to decompose target focal stacks and depth maps into one or more pairs of piecewise smooth focal surfaces and underlying display images. We build on recent developments in "optimized blending" to implement a multifocal display that allows the accurate depiction of occluding, semi-transparent, and reflective objects. Practical benefits over prior accommodation-supporting HMDs are demonstrated using a binocular focal surface display employing a liquid crystal on silicon (LCOS) phase SLM and an organic light-emitting diode (OLED) display.
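As a rough sketch of the "dynamic freeform lens" idea (a deliberate simplification, not the focal surface decomposition algorithm itself), a phase SLM displaying the paraxial thin-lens phase with a spatially varying focal length bends each region of the image toward a different focal distance. The function and parameters below are illustrative assumptions.

```python
import numpy as np

def freeform_lens_phase(focal_map, pitch, wavelength):
    """Phase pattern (radians, wrapped to [0, 2*pi)) that turns a phase
    SLM into a spatially varying thin lens.
    focal_map: per-pixel focal length in meters (one value per SLM pixel),
    pitch: SLM pixel pitch in meters, wavelength: in meters."""
    h, w = focal_map.shape
    y = (np.arange(h) - h / 2) * pitch
    x = (np.arange(w) - w / 2) * pitch
    yy, xx = np.meshgrid(y, x, indexing="ij")
    k = 2 * np.pi / wavelength
    phase = -k * (xx**2 + yy**2) / (2 * focal_map)  # paraxial thin-lens phase
    return np.mod(phase, 2 * np.pi)
```

A constant `focal_map` reduces this to an ordinary lens; letting it follow a smooth depth map is the intuition behind shaping a focal surface to the virtual scene geometry.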

Date and Time: 
Wednesday, October 11, 2017 - 4:30pm
Venue: 
Packard 101

SCIEN Talk: Computational Near-Eye Displays

Topic: 
Computational Near-Eye Displays
Abstract / Description: 

Virtual reality is a new medium that provides unprecedented user experiences. Eventually, VR/AR systems will redefine communication, entertainment, education, collaborative work, simulation, training, telesurgery, and basic vision research. In all of these applications, the primary interface between the user and the digital world is the near-eye display. While today's VR systems struggle to provide natural and comfortable viewing experiences, next-generation computational near-eye displays have the potential to provide visual experiences that are better than the real world. In this talk, we explore the frontiers of VR/AR systems engineering and discuss next-generation near-eye display technology, including gaze-contingent focus, light field displays, monovision, holographic near-eye displays, and accommodation-invariant near-eye displays.

Date and Time: 
Wednesday, October 4, 2017 - 4:30pm
Venue: 
Packard 101

Computational Imaging for Robotic Vision [SCIEN]

Topic: 
Computational Imaging for Robotic Vision
Abstract / Description: 

This talk argues for combining the fields of robotic vision and computational imaging. Both consider the joint design of hardware and algorithms, but with dramatically different approaches and results. Roboticists seldom design their own cameras, and computational imaging seldom considers performance in terms of autonomous decision-making. The union of these fields considers whole-system design from optics to decisions. This yields impactful sensors offering greater autonomy and robustness, especially in challenging imaging conditions. Motivating examples are drawn from autonomous ground and underwater robotics, and the talk concludes with recent advances in the design and evaluation of novel cameras for robotics applications.

Date and Time: 
Wednesday, June 7, 2017 - 4:30pm
Venue: 
Packard 101

Carl Zeiss Smart Glasses [SCIEN Talk]

Topic: 
Carl Zeiss Smart Glasses
Abstract / Description: 

Kai Stroeder, Managing Director at Carl Zeiss Smart Optics GmbH, will talk about the Carl Zeiss Smart Glasses.

This will be an informal session with an introduction and prototype demo of the Smart Glasses and an open discussion about future directions and applications.

Date and Time: 
Tuesday, May 30, 2017 - 10:00am
Venue: 
Lucas Center for Imaging, P083
