Technological advances in immersive telepresence are greatly impeded by the challenge of conveying a realistic feeling of presence in a remote environment to a local human user. Providing a stereoscopic 360° visual representation of the distant scene enhances the level of realism and greatly improves task performance. State-of-the-art approaches address this issue primarily with catadioptric or multi-camera systems. Current solutions, however, are bulky, not real-time capable, and tend to produce erroneous image content due to the stitching processes involved, which perform poorly for texture-less scenes. In this talk, I will introduce a vision-on-demand approach that creates stereoscopic scene information upon request. I will present a real-time capable camera system along with a novel deep learning-based delay-compensation paradigm that provides instant visual feedback for highly immersive telepresence.
Dr.-Ing. Tamay Aykut studied Electrical Engineering and Information Technology at the Technical University of Munich (TUM), where he obtained his B.S. and M.S. degrees in 2013 and 2016, respectively. In March 2016, he joined the Chair of Media Technology at TUM as a research associate; he received an Engineering Doctorate in 2019 and continued as a postdoctoral fellow until August 2019. His research earned the Nokia Bell Labs Student Award in 2018 and the Kurt Fischer Doctorate Prize from TUM in 2019. He was selected for the Junior Research Group program of the Max Planck Center for Visual Computing and Communication (MPC-VCC). In September 2019, he joined the Department of Electrical Engineering at Stanford University as a Visiting Assistant Professor under the mentorship of Prof. Bernd Girod. His research interests include artificial intelligence, visual computing and communication, and robotics.