This Tech Note column appeared in the December 2020 issue as "Cut the Clutter."
One rainy night in March 2007, graduate student Ralph Simon found himself alone in the Cuban rainforest. He was following a hunch, based on a picture he’d seen in a magazine. He was after a specific dish-shaped leaf, which belonged to the native Marcgravia evenia vine. The leaves looked like they’d be ideal for reflecting sound, and Simon suspected they would efficiently lure bat pollinators to their flowers in the dark. His adviser was skeptical without proof, so there Simon was, sitting among the creepy crawlers with his infrared video camera and a stash of snacks, waiting for the bats to come. And come they did, several times an hour, for the entire night.
In the years since, Simon — now a sensory ecologist at the University of Antwerp in Belgium — has returned to this same spot at least three more times to gather leaf specimens and test how sound ricochets off them to attract bats. These days, though, he’s using his knowledge to develop technologies that help robots navigate with sound.
Most autonomous mobile robots employ a suite of sophisticated sensors to maneuver. Sonar technology helps them avoid obstacles through echolocation — pulses of sound bouncing off the closest objects. It’s relatively inexpensive and useful in low-visibility environments where cameras might fail. However, surrounding objects reflect a barrage of distracting signals, known as echo clutter, which can be challenging for robots to sift through.
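The ranging step itself is simple arithmetic: the time a pulse takes to travel out and back, multiplied by the speed of sound and halved, gives the distance to the nearest obstacle. Here is a minimal Python sketch of that time-of-flight calculation; the timing value is purely illustrative.

```python
# Minimal sketch of sonar ranging by time of flight.
# The pulse travels to the obstacle and back, so the one-way
# distance is half the round trip. Values are illustrative.

SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 C

def range_from_echo(round_trip_seconds: float) -> float:
    """Estimate the distance to the nearest reflecting object."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# A ping that returns after 6 milliseconds implies an obstacle
# roughly one meter away.
print(f"{range_from_echo(0.006):.2f} m")  # prints "1.03 m"
```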
By comparison, bats can easily extract meaning from a volley of returning echoes to map new environments in real time. But their methods are difficult to mimic, leaving researchers like Simon to hunt for new ways to help robots better sort through that clutter.
Building Beacons
Several types of artificial landmarks already help guide autonomous robots. Underwater, for example, simple acoustic reflectors direct aquatic robots that are equipped with sonar sensors. But few research groups have investigated how to make acoustic markers on land — currently, none exist to help robots navigate amid the clutter of above-ground echoes.
Simon’s rainforest excursions have led him to a new solution that could unlock sonar’s navigational potential: 3D-printed acoustic reflectors shaped like M. evenia leaves. While some vegetation merely returns a twinkle of sound, M. evenia’s leaves reflect a consistent pattern of echoes that entice bats to its flowers in the darkness — like a blinking lighthouse directing wayward ships.
In 2006, Simon and his research team demonstrated that changing the size of hollow, hemispheric, leaflike structures altered their returning echoes, and that bats could discern these subtle variations. Five years later, the group found that M. evenia was particularly effective at reflecting clear, recognizable acoustic signals. The vine’s dish-shaped leaves boomeranged a long-range echo with a unique signature that remained consistent regardless of the bats’ direction of approach. The leaves were such effective acoustic beacons that they cut their pollinators’ search time in half, despite the surrounding clutter. So, the team decided to create their own reflectors of varying sizes to see if an autonomous robot could use the same principles to navigate.
Leaves in the Lab
In a study published in January, Simon and his team made 3D-printed plastic leaves modeled after M. evenia. They tinkered with the shape and depth to strengthen the echoes bouncing off the reflectors, using several to direct a simple autonomous robot through an unfamiliar environment. They also installed 126 plastic plants to create a cacophony of echoes — simulating even more clutter than autonomous machines experience when navigating the outside world.
The researchers trained their robot’s algorithms to recognize leaf reflectors of various sizes, much like how a bat discerns objects of various shapes by their echoes. Each reflector type conveyed an instruction, directing the robot to turn, stop or switch a light on.
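As a rough illustration of that control scheme, here is a toy Python sketch in which made-up echo-strength thresholds stand in for the team's trained recognizer; each recognized reflector size maps to one command, and anything that looks like background clutter is simply ignored.

```python
# Toy sketch of the control idea described above: each recognized
# reflector type maps to one navigational instruction. The thresholds
# and names are hypothetical stand-ins for the study's trained model.

from enum import Enum, auto

class Reflector(Enum):
    NONE = auto()    # echo judged to be background clutter
    SMALL = auto()
    MEDIUM = auto()
    LARGE = auto()

# One instruction per reflector size, mirroring the experiment's setup.
COMMANDS = {
    Reflector.SMALL: "turn",
    Reflector.MEDIUM: "stop",
    Reflector.LARGE: "light_on",
}

def classify_echo(echo_strength: float) -> Reflector:
    """Toy stand-in for the trained recognizer: a single echo strength
    is bucketed by made-up thresholds; the real system compares whole
    echo signatures against learned reflector patterns."""
    if echo_strength < 0.2:
        return Reflector.NONE
    if echo_strength < 0.5:
        return Reflector.SMALL
    if echo_strength < 0.8:
        return Reflector.MEDIUM
    return Reflector.LARGE

def act_on_echo(echo_strength: float) -> str | None:
    """Return a command, or None if the echo is just clutter."""
    return COMMANDS.get(classify_echo(echo_strength))

print(act_on_echo(0.1))  # None: clutter, ignored
print(act_on_echo(0.9))  # light_on
```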
As their knee-high robot moved on three wheels through the lab, making batlike calls, the echoes from the reflectors shone like little lighthouses. Sounds bouncing off the plastic plants, however, rebounded chaotically, like lights glinting off a spinning disco ball. But the robot was able to discern the important echoes from the M. evenia-inspired reflectors and make out the navigational cues, despite the cacophony.
Simon says the study demonstrates how basic ecology research can advance navigational technology. The reflectors could aid autonomous robots in confined spaces, like dusty greenhouses or dark mines, where visual systems are impaired.
“Sonar sensors nowadays are only used for ranging,” he adds. “But they could also do much more.”
According to Jan Steckel, an electrical engineer at the University of Antwerp and co-author of the study, the simplicity and saliency of their reflectors have “cut out the whole echo clutter problem.” The reflectors, he says, contrast with background distractions and stand out like a red pebble against a sea of black stones.
Despite these advances, a deeper mystery about how sonar navigation really works in nature lurks behind Simon and Steckel’s research, and other efforts like it.
How Do Bats Do It?
There’s an ongoing debate about precisely how bats use echolocation to perceive and move through their environment. Do they simply recognize the echoes bouncing off specific objects, or can they reconstruct a more detailed 3D layout? Perhaps, some researchers argue, it’s a combination of both.
How bats use sonar to navigate is “the million-dollar question in echolocation,” says Yossi Yovel, a biologist at Tel Aviv University in Israel and co-creator of the batlike robot Robat.
His preliminary research suggests that building robots that use deep-learning algorithms may help us understand what information bats extract from sonic data. After all, these neural networks mimic something bats have but robots do not: a brain.
And while neural networks are a powerful tool, they might not be an airtight solution to mimicking the brains of master echolocators. Dolphin sonar, for example, has been studied for decades, but the mammals’ natural abilities continue to outperform their human-made counterparts, especially in cluttered environments.
Such has been the observation of Yan Pailhas, a scientist at the Centre for Maritime Research and Experimentation in Italy, who has developed dolphin-inspired sonar systems and fashioned underwater sonar landmarks. Despite recent advances, he says neural networks still can’t compete with the way dolphins interpret sensory data. “They’ve got a brain,” he says. “And that’s the trick.”
Unraveling echolocation in nature is a puzzle that scientists have yet to solve. Herbert Peremans of the University of Antwerp, a co-author on Simon and Steckel’s reflector study, says he would be happy if he could just duplicate what bats are doing. “I consider nature to be an engineer,” he adds. As one himself, he knows what he can glean from fellow inventors, Homo sapiens or not.
He’s proud of the group’s reflectors because they offer a straightforward answer to the echo clutter problem, and could aid artificial sonar in outdoor environments. “I think it makes sense to look at naturally evolved solutions, because they usually are very simple,” Peremans says. “Elegant, but simple.”
Raleigh McElvery is a science writer who covers biology and neuroscience. She lives in Cambridge, Massachusetts.