Active 3D imaging systems, such as LIDAR, are becoming increasingly prevalent in applications including autonomous vehicle navigation, remote sensing, and human-computer interaction. These systems capture distance by directly measuring the time it takes for a short pulse of light to travel to a scene point and return. Emerging sensor technology can detect individual arriving photons and time-stamp each arrival at picosecond timescales, enabling new and exciting imaging modalities. In this talk, I discuss trillion-frame-per-second imaging, efficient depth imaging from sparse photon detections, and imaging objects hidden from the direct line of sight.
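The time-of-flight principle behind these systems can be sketched in a few lines: the pulse traverses the sensor-to-scene distance twice, so depth is half the round-trip time multiplied by the speed of light. This is a minimal illustrative example (the function name and values are hypothetical, not from the talk):

```python
# Time-of-flight depth: a pulse travels to the scene point and back,
# so distance = c * t / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance(round_trip_s: float) -> float:
    """Convert a measured round-trip time (seconds) to distance (meters)."""
    return C * round_trip_s / 2.0

# A 1 ns round trip corresponds to ~15 cm of depth, so picosecond-scale
# timing resolution translates to sub-millimeter depth resolution.
print(tof_to_distance(1e-9))
```

This back-of-the-envelope conversion is why picosecond photon timing matters: 1 ps of timing uncertainty corresponds to only about 0.15 mm of depth uncertainty.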
David is a PhD student in the Stanford Computational Imaging Lab. He received his bachelor's and master's degrees in EE from Brigham Young University (BYU), where he worked on satellite remote sensing of sea ice and soil moisture. His current research involves developing new computational algorithms for non-line-of-sight imaging, single-photon imaging, and 3D imaging with sensor fusion.