Professor Amin Arbabian, Aidan Fitzpatrick (PhD candidate), and Ajay Singhvi (PhD candidate) have developed an airborne method for imaging underwater objects by combining light and sound to break through the seemingly impassable barrier at the interface of air and water.
The researchers envision their hybrid optical-acoustic system one day being used to conduct drone-based biological marine surveys from the air, carry out large-scale aerial searches of sunken ships and planes, and map the ocean depths with speed and detail similar to maps of Earth's landscapes. Their "Photoacoustic Airborne Sonar System" is detailed in a recent study published in the journal IEEE Access.
"Airborne and spaceborne radar and laser-based, or LIDAR, systems have been able to map Earth's landscapes for decades. Radar signals are even able to penetrate cloud cover and canopy. However, seawater is much too absorptive for imaging into the water," says Arbabian. "Our goal is to develop a more robust system that can image even through murky water."
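To give a rough sense of why radar cannot see into the ocean, the sketch below estimates the electromagnetic skin depth (the distance over which a signal decays to 1/e) in seawater. The conductivity value and frequency are illustrative textbook assumptions, not figures from the study; they are not part of the researchers' system.

```python
import math

def em_skin_depth(freq_hz, conductivity=4.0, mu=4e-7 * math.pi):
    """Skin depth (m) of a good conductor: delta = sqrt(2 / (mu * sigma * omega)).

    conductivity=4.0 S/m is a typical textbook value for seawater
    (an illustrative assumption, not a figure from the article).
    """
    omega = 2 * math.pi * freq_hz  # angular frequency (rad/s)
    return math.sqrt(2.0 / (mu * conductivity * omega))

# A 1 GHz radar signal decays to 1/e within roughly a centimeter of seawater,
# while sound can propagate through kilometers of water with little loss.
print(f"Skin depth at 1 GHz: {em_skin_depth(1e9) * 1000:.1f} mm")
```

Under these assumed parameters the skin depth comes out at millimeter scale, which is why the researchers convert light to sound at the water's surface rather than trying to push electromagnetic waves through it.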
Excerpted from "Stanford engineers combine light and sound to see underwater", Stanford News, November 30, 2020