New article published in PLOS Computational Biology by Yannis Paschalidis describes bio-inspired algorithms for aerial drone flight

A rather unusual situation recently unfolded inside a laboratory—moths playing a “video game,” flitting their wings as they navigated through a virtual forest displayed on a projector screen.

Each of the moths’ movements was being tracked by Boston University engineers and University of Washington biologists. The team is leveraging the moth data to develop new navigational programs that provide autonomous aerial drones with a better sense of direction. The findings of their latest study were recently published online in PLOS Computational Biology. The work was funded partly by a $7.5 million Department of Defense Multidisciplinary University Research Initiative (MURI) grant to develop self-navigating vehicles that can traverse land, sea, and air.

The navigational challenge the moths helped the team overcome? Engineers working on control programs for autonomous vehicles have long struggled with what they call “the curse of dimensionality,” which is that a drone navigating a multidimensional world has such an overwhelming number of options to consider at any given time (which direction it travels, how far, how fast, and so on) that it struggles to determine the best paths to take. Robots just aren’t naturals at making those kinds of decisions—but living beings are.
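To make the curse of dimensionality concrete, here is a minimal back-of-the-envelope sketch (our illustration, not from the study): even if a drone coarsely discretizes each decision into a handful of choices, the number of candidate flight plans explodes with how far ahead it looks. The choice counts below are assumed values for illustration only.

```python
# Minimal illustration of the "curse of dimensionality" in path
# planning: the number of candidate plans grows exponentially with
# the planning horizon. (Illustrative numbers, not from the paper.)

headings = 8      # coarse choices of direction
speeds = 3        # slow / medium / fast
climb_rates = 3   # descend / hold altitude / climb

options_per_step = headings * speeds * climb_rates  # 72 options

for horizon in (1, 5, 10, 20):
    plans = options_per_step ** horizon
    print(f"{horizon:>2} steps ahead -> {plans:.2e} candidate plans")
```

Even at this crude resolution, looking just ten steps ahead already yields on the order of 10^18 candidate plans, which is why exhaustive search is hopeless and shortcuts borrowed from living navigators are so appealing.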

“Humans and animals are ideal navigators; they’re our experts,” says Yannis Paschalidis, a BU College of Engineering professor of electrical and computer, biomedical, and systems engineering, and a senior author on the new study.

“They can learn very fast and navigate quickly in very complex environments,” he says. “And if we can observe them and understand the [navigational strategies] that they are using, we can take these as our starting point. Then, with less computational effort, we can adapt those strategies to fit any new situation and any new drone that needs to navigate in a certain environment.”

There’s more than one way to navigate through an environment, but Paschalidis, who is also director of BU’s Center for Information and Systems Engineering, and collaborators found that moths, like many other species, primarily rely on a visual perception pattern called optical flow. That means that as moths are flying, they sense their location by keeping track of the way their surroundings appear to move around them. Humans use this strategy, too; think of when you’re in a car, approaching a sign along the road. At first, the sign is ahead of you, but as you drive past it, it seems to slide behind you. The sign hasn’t physically moved from its location, but it has moved within your visual field, and this apparent movement tells your brain how fast you’re going and where you’re positioned relative to the sign.
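The roadside-sign example can be worked out with a little trigonometry. The sketch below is our own toy calculation (not the paper's model), with an assumed car speed and sign offset; it shows why apparent motion is a useful speed-and-position cue: a distant sign barely crawls across the visual field, while a nearby one whips past.

```python
import math

# Toy illustration of optical flow (not from the paper): the
# apparent angular speed of a fixed roadside sign as a car drives
# past it at constant speed. For a sign at lateral offset d and
# along-track distance x, the bearing changes at rate
# v * d / (x**2 + d**2).

v = 20.0  # car speed in m/s (~45 mph) -- assumed value
d = 5.0   # sign's lateral offset from the car's path, m -- assumed

for x in (100.0, 50.0, 20.0, 5.0, 0.0):  # distance still ahead, m
    omega = v * d / (x**2 + d**2)  # rad/s of apparent motion
    print(f"{x:5.0f} m ahead: {math.degrees(omega):6.2f} deg/s")
```

The sign's apparent motion is a fraction of a degree per second when it is 100 meters ahead and peaks at over 200 degrees per second as the car draws level with it, exactly the kind of gradient a moth (or a flow-based controller) can exploit.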

Yet on its own, optical flow isn’t enough to safely guide an autonomous drone around obstacles (or a moth, it seems; moths are notorious for incinerating themselves against burning-hot lights). So Paschalidis and the team, including UW neurobiologist Thomas Daniel, merged what they learned from analyzing moths with another navigational strategy, one that maps the specific locations of obstacles. An animal like a bobcat, for instance, uses an obstacle detection strategy to analyze a broad area of forest, note where the trees are located, and plan out a path that avoids bumping into them.
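The article doesn’t spell out how the two strategies were fused, but one common way to blend a flow-based cue with a map-based one is a weighted sum of steering commands. The sketch below is purely hypothetical: the helper functions, sign conventions, and weights are our assumptions, not the study’s controller.

```python
# Hypothetical sketch of blending two navigation strategies.
# Convention: positive command = turn right, positive bearing =
# obstacle to the right. Weights and functions are illustrative
# assumptions, not the study's actual controller.

def optical_flow_steer(left_flow: float, right_flow: float) -> float:
    """Flow balancing: steer away from the side where the scenery
    streams by faster (i.e., the side with nearer obstacles)."""
    return left_flow - right_flow  # negative -> turn left

def obstacle_map_steer(obstacle_bearings: list[float]) -> float:
    """Steer opposite the mean bearing of mapped obstacles
    (bearings in radians relative to the flight direction)."""
    if not obstacle_bearings:
        return 0.0
    return -sum(obstacle_bearings) / len(obstacle_bearings)

def blended_steer(left_flow, right_flow, obstacle_bearings,
                  w_flow=0.6, w_map=0.4):
    return (w_flow * optical_flow_steer(left_flow, right_flow)
            + w_map * obstacle_map_steer(obstacle_bearings))

# Example: faster flow on the right and obstacles mapped slightly
# right of center both push the command left (negative).
print(blended_steer(left_flow=0.8, right_flow=1.4,
                    obstacle_bearings=[0.3, 0.1]))  # -> -0.44
```

In a scheme like this, the weights are exactly the kind of knobs that would need retuning per environment, which foreshadows the trade-off the researchers observed below.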

The researchers created two navigational programs to test on drones in computer simulation: one that encapsulated the optical flow navigation strategy of a moth, and a second, enhanced program that combined the moth’s strategy with an obstacle detection strategy. Using the programs, they challenged a simulated drone to navigate a variety of virtual forests.

The enhanced program navigated more effectively than the moth’s strategy alone, but with a big catch: it had to be readjusted to perform optimally in each new forest simulation. The moth’s strategy, in contrast, was more adaptable to new environments. While it didn’t always select the absolute best path through a given forest, it performed well across a wide range of scenarios without needing any human fine-tuning.

“This robustness is driven by the fact that the [moth’s navigational strategy] isn’t optimized to do exceptionally well in a very specific setting. The strategy does well in many different settings simply because these animals need to be adaptable in order to survive,” Paschalidis says.

Today’s aerial drones are often optimized to perform specific missions with known parameters, but they can’t yet navigate unknown landscapes on their own. Autonomous drones still need refining, but instilling them with more adaptable navigational strategies borrowed from living organisms could someday allow them to serve in a far wider range of contexts, from agricultural pursuits to remote rescue missions, and in uncharted rural areas as well as dynamic, obstacle-ridden cities.

“We’re hoping that with this new framework we have developed, we can observe other animals,” Paschalidis says. The BU researchers, in collaboration with BU’s Center for Systems Neuroscience, also plan to explore the neural underpinnings of living creatures’ navigational strategies by recording brain signals via electrodes and—in the case of humans—by using noninvasive, functional MRI techniques.

“Together with behavioral observations, if we can understand what is happening in the brain, the hope is that we’ll get a clearer picture of how we and other species are navigating in complex terrains,” Paschalidis says. “Then we’ll be able to take these lessons, apply them, and extract [navigational strategies] that would lead to more autonomous, more adaptable robot systems.”

Original article published in The Brink by Kerry Benson