Source: The Conversation – USA – By Nitin Sanket, Assistant Professor of Robotics Engineering, Worcester Polytechnic Institute

To help small aerial robots navigate in the dark and other low-visibility environments, my colleagues and I developed an ultrasound-based perception system inspired by bat echolocation.
Current robots rely heavily on cameras or light detection and ranging, known as lidar, or both. But these sensors fail in visually challenging conditions, such as smoke, fog, dust, snow or complete darkness.
I’m a robotics engineer who develops bio-inspired microrobots. To solve this challenge, my research team looked at nature’s experts at navigating in poor visibility: bats. They thrive in dark, damp and dusty caves and can detect obstacles as thin as a human hair using echolocation, all while weighing as little as two paper clips. They emit sound waves and listen to the weak echoes reflected from objects.
However, enabling this sensing on aerial robots is extremely challenging because propellers generate a lot of noise. It is a bit like trying to listen to your friend while a jet engine is taking off next to you.
To overcome this issue, we developed two key ideas. First, a physical acoustic shield inspired by bats’ ear cartilage reduces propeller noise around the acoustic sensors, which act like the robot’s ears. Second, a neural network called Saranga recovers weak echo signals from very noisy measurements by learning patterns over time, inspired by how bats process sound.
Together, these enable the robot to estimate obstacle locations in 3D and navigate safely using milliwatt-level sensing power.
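The underlying ranging principle is the same one bats use: time an ultrasonic pulse’s round trip and convert it to distance. A minimal sketch of that conversion, with illustrative values not taken from the paper:

```python
# Sketch of ultrasonic time-of-flight ranging, the basic principle behind
# sonar obstacle detection. Values are illustrative assumptions.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def echo_distance(round_trip_s: float) -> float:
    """Distance to the reflector: sound travels out and back, so halve it."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A 5.8 ms round trip corresponds to roughly 1 meter:
print(round(echo_distance(0.0058), 2))  # -> 0.99
```

Combining distances from several such measurements at different sensor positions is what lets the robot place obstacles in 3D rather than along a single line.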

Why it matters
These types of drones are very useful for search and rescue, especially in confined, dynamic and dangerous environments, because they are small and inexpensive. Search-and-rescue operations often happen in environments where visibility is very poor, such as forest fires, collapsed buildings, caves or dusty outdoor conditions. In these scenarios, traditional sensors like cameras and lidar often become unreliable.
Bats do not rely only on vision and instead use echolocation to perceive the world. Ultrasound sensing doesn’t depend on lighting conditions and works in smoke, dust and darkness.
Our work shows that it is possible to bring this capability to aerial robots despite strong onboard propeller noise. Sonar boosted by noise shielding and machine learning promises to enable a new class of small, low-cost robots that can operate in environments where current systems fail.
This research can enable highly functional, autonomous, tiny aerial robots for critical humanitarian applications, such as search and rescue, combating poaching and cave exploration. AI-enabled sonar navigation could lead to safer, faster and more cost-effective robots for time-sensitive operations where human or larger helicopter access is limited. This is a step toward being able to deploy swarms of aerial robots, much like groups of bats, to explore hazardous environments and search for survivors.
Breakthroughs in mathematical modeling, neural network design and sensor characterization will enable other low-power applications for these drones, such as environmental monitoring. Our work can reduce power by 1,000 times, weight by 10 times and cost by 100 times compared to current solutions.
What other research is being done
Most aerial navigation systems rely on cameras, depth sensors or lidar, which degrade in low visibility. Radar works in these conditions but is power-intensive for small drones. Prior work has explored ultrasound sensing mainly on ground robots, but applying it to aerial robots has been difficult due to propeller noise and weak signals.
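One classical way to pull a weak, known pulse out of loud noise is matched filtering: correlate the received signal against the transmitted template and look for the peak. This is not the paper’s learned approach (Saranga is a neural network), but it illustrates the weak-echo recovery problem; all signal parameters below are assumptions:

```python
# Matched-filter sketch: recovering a faint echo buried in loud noise
# (standing in for propeller interference). All parameters are illustrative.
import math
import random

random.seed(0)
fs = 100_000   # sample rate in Hz (assumed)
n_chirp = 200  # 2 ms transmit pulse

# Known transmit pulse: a 30-40 kHz linear chirp.
chirp = [math.sin(2 * math.pi * (30_000 + 2.5e6 * i / fs) * i / fs)
         for i in range(n_chirp)]

# Simulated received signal: a weak echo at sample 400, buried in noise
# several times stronger than the echo itself.
delay, n = 400, 2000
signal = [random.gauss(0, 0.5) for _ in range(n)]
for i, c in enumerate(chirp):
    signal[delay + i] += 0.3 * c

def correlate(sig, tmpl):
    """Slide the template along the signal, summing products at each lag."""
    return [sum(s * c for s, c in zip(sig[k:], tmpl))
            for k in range(len(sig) - len(tmpl) + 1)]

# The correlation peak marks the echo's arrival time.
corr = correlate(signal, chirp)
peak = max(range(len(corr)), key=lambda k: abs(corr[k]))
print(peak)  # near 400, the true echo delay
```

A learned model can go beyond this by exploiting patterns across time and across the structured (rather than purely random) character of propeller noise.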
What’s next
We are working on improving flying speed, sensing range and system size. We are also exploring new bio-inspired designs and combining ultrasound with other types of sensing.
Ultimately, our goal is to build low-power aerial robots that operate reliably in dynamic environments and enable real-world deployment in search and rescue.
The Research Brief is a short take on interesting academic work.
Nitin Sanket receives funding from the National Science Foundation under CMMI 2516439 (https://www.nsf.gov/awardsearch/show-award?AWD_ID=2516439).
– ref. Ultralightweight sonar plus AI lets tiny drones navigate like bats – https://theconversation.com/ultralightweight-sonar-plus-ai-lets-tiny-drones-navigate-like-bats-279287
