Mobile photography has been transformed by software. While sensors and lens design have improved over time, the mobile phone industry relies increasingly on software to mitigate physical limits and the constraints imposed by industrial design. In this talk, I'll describe the HDR+ system for burst photography, comprising robust and efficient algorithms for capturing, fusing, and processing multiple images into a single higher-quality result. HDR+ is core imaging technology for Google's Pixel phones: it is used in all camera modes and powers millions of photos per day. I'll give a brief history of HDR+ starting from Google Glass (2013), present key algorithms from the HDR+ system, and then describe the new features that enable the recently released Night Sight mode.
Sam Hasinoff is a software engineer at Google. Before joining Google in 2011, he was a Research Assistant Professor at the Toyota Technological Institute at Chicago (TTIC), a philanthropically endowed academic institute on the campus of the University of Chicago. From 2008 to 2010, he was a postdoctoral fellow at the Massachusetts Institute of Technology, supported in part by the Natural Sciences and Engineering Research Council of Canada. He received the BSc degree in computer science from the University of British Columbia in 2000, and the MSc and PhD degrees in computer science from the University of Toronto in 2002 and 2008, respectively. In 2006, he received an honorable mention for the Longuet-Higgins Best Paper Award at the European Conference on Computer Vision. In 2008, he received the Alain Fournier Award for the top Canadian dissertation in computer graphics.