For the past decade, display and sensor hardware development for mixed reality and smart glasses was largely exploratory, providing just enough display immersion and visual comfort for developers to build apps, especially for the enterprise field. On the sensor side, emphasis was placed on 6DOF head tracking and spatial mapping, gesture sensing, and later eye tracking. Today, as universal consumer use cases emerge, such as co-presence, digital twins, and remote conferencing, new requirements are appearing in product requirement documents (PRDs) to enable such experiences, on both the display and the sensing side. The race is not only toward smaller form factors and lighter devices with large field of view (FOV) and lower power; requirements now also target additional display and sensing features specifically tuned to implement these new universal use cases. Broad acceptance of wearable displays, especially in the consumer field, is contingent on meeting these new display and sensing requirements in small form factors and at low power.