
Amin Arbabian and PhD candidates Aidan Fitzpatrick and Ajay Singhvi combine light and sound to see underwater

Summary

The “Photoacoustic Airborne Sonar System” could be installed beneath drones to enable aerial underwater surveys and high-resolution mapping of the deep ocean.

December 2020

Professor Amin Arbabian and PhD candidates Aidan Fitzpatrick and Ajay Singhvi have developed an airborne method for imaging underwater objects, combining light and sound to break through the seemingly impassable barrier at the interface of air and water.

The researchers envision their hybrid optical-acoustic system one day being used to conduct drone-based biological marine surveys from the air, carry out large-scale aerial searches of sunken ships and planes, and map the ocean depths as quickly and in as much detail as Earth's landscapes. Their "Photoacoustic Airborne Sonar System" is detailed in a recent study published in the journal IEEE Access.

"Airborne and spaceborne radar and laser-based, or LIDAR, systems have been able to map Earth's landscapes for decades. Radar signals are even able to penetrate cloud coverage and canopy coverage. However, seawater is much too absorptive for imaging into the water," reports Amin. "Our goal is to develop a more robust system which can image even through murky water."

Excerpted from "Stanford engineers combine light and sound to see underwater", Stanford News, November 30, 2020
