Amin Arbabian and team enable cameras to see in 3D
With a simple design and some clever engineering, researchers devised a high-frequency, low-power, compact optical device that allows virtually any digital camera to perceive depth.
Professor Amin Arbabian, EE PhD candidate Okan Atalar, and fellow researchers have created a new approach that allows standard image sensors to see light in three dimensions, meaning common cameras could soon be used to measure their distance to objects.
The Stanford team's solution, developed through a collaboration between the Laboratory for Integrated Nano-Quantum Systems (LINQS) and the ArbabianLab, relies on a phenomenon known as acoustic resonance. The team built a simple acoustic modulator from a thin wafer of lithium niobate – a transparent crystal highly prized for its electrical, acoustic, and optical properties – coated with two transparent electrodes.
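The article does not spell out how modulated light lets an ordinary camera recover depth, but the general principle behind such systems is continuous-wave time-of-flight: the scene is illuminated with intensity-modulated light, the modulator gates the returning light, and the round-trip phase shift encodes distance. The sketch below illustrates generic four-phase depth recovery; the modulation frequency, sampling scheme, and function names are illustrative assumptions, not parameters from the Stanford paper.

```python
import math

C = 299_792_458.0   # speed of light, m/s
F_MOD = 500e6       # hypothetical modulation frequency (Hz), chosen for illustration only

def simulate_samples(depth_m, amplitude=1.0, offset=2.0):
    """Simulate the four phase-stepped intensity samples a camera pixel would
    record under sinusoidally modulated illumination (generic CW-ToF model)."""
    phi = 4 * math.pi * F_MOD * depth_m / C  # round-trip phase shift
    return [offset + amplitude * math.cos(phi - k * math.pi / 2) for k in range(4)]

def depth_from_samples(i0, i1, i2, i3):
    """Recover depth from four intensity samples taken 90 degrees apart."""
    phi = math.atan2(i1 - i3, i0 - i2)  # offset and amplitude cancel out
    phi %= 2 * math.pi                  # wrap phase into [0, 2*pi)
    return C * phi / (4 * math.pi * F_MOD)

# Depth must lie within the unambiguous range c / (2 * F_MOD), about 0.30 m here.
true_depth = 0.12  # 12 cm
est = depth_from_samples(*simulate_samples(true_depth))
```

Because the background offset and modulation amplitude cancel in the arctangent, this recovery is insensitive to ambient light levels, which is one reason phase-stepped schemes are popular in depth imaging.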
“Existing lidar systems are big and bulky, but someday, if you want lidar capabilities in millions of autonomous drones or in lightweight robotic vehicles, you’re going to want them to be very small, very energy efficient, and offering high performance,” explains Atalar, first author of the new paper in the journal Nature Communications that introduces this compact, energy-efficient device for lidar.
Read full story “Stanford engineers enable simple cameras to see in 3D”