The distance between the real and the digital is clearest at the interface layer. The ways our bodies interact with the physical world are rich and elaborate, while digital interactions remain far more limited. Through more direct and intuitive interaction, my work aims to elevate computing devices from external systems that require deliberate use to true extensions of ourselves, advancing both the state of research and human ability. My approach is to use the entire body for input and output, allowing for implicit and natural interactions. I call this concept "perceptual engineering": a method of altering the user's perception, or more precisely the input signals to their perception, and manipulating it in subtle ways. For example, a system might modify a user's sense of space, place, balance, and orientation, or manipulate their visual attention, all without the user's explicit input, in order to assist or guide their interactive experience effortlessly.
I build devices and immersive systems that explore the use of cognitive illusions to manage attention, physiological signals for interaction, deep learning for automatic VR generation, embodiment for remote collaborative learning, tangible interaction for augmenting play, haptics for enhancing immersion, and vestibular stimulation to mitigate motion sickness in VR. My "perceptual engineering" approach has been shown to (1) support implicit and natural interactions with haptic feedback, (2) induce believable physical sensations of motion in VR, (3) provide a novel way to communicate with the user through proprioception and kinesthesia, and (4) serve as a platform for questioning the boundaries of our sense of agency and trust.

For decades, interaction design has been driven by the question: how can new technologies allow users to interact with digital content in the most natural way? Over the last 50 years, interaction has evolved from punch cards to mouse and keyboard to touch and voice. In parallel, devices have become smaller and closer to the user's body, and with every transition, what people can do with them has become more personal. The main question that drives my research is: what is the next logical step?
Misha Sra is a Research Affiliate at the MIT Media Lab working with Prof. Pattie Maes. She graduated in Summer 2018 from the MIT Media Lab, having completed her PhD under the direction of Prof. Pattie Maes in the Fluid Interfaces Group. Misha's work asks the question: how can new technologies allow users to interact with digital content in the most natural way? Her work is published at highly selective HCI and VR venues such as ACM CHI, UIST, VRST, and IEEE VR, where she has received several best paper awards and honorable mentions. From 2014 to 2015, she was a Robert Wood Johnson Foundation wellbeing research fellow at the MIT Media Lab. In Spring 2016, she received the Silver Award in the annual Edison Awards global competition, which honors excellence in human-centered design and innovation. She is a TEDx speaker and was recently selected as a Rising Star in EECS by MIT. Her work has captured the interest of media outlets such as MIT Technology Review, the Discovery Channel, TechRadar, UploadVR, and Engadget.