Learning From Animal Behaviors to Inform Control Systems

By Margo Stanton

CISE Affiliate and Distinguished Professor of Engineering John Baillieul (ME, ECE, SE)

You may be familiar with the phrase “blind as a bat,” used to describe someone with poor eyesight. However, recent research on animal behavior by CISE affiliate and Distinguished Professor of Engineering John Baillieul (ME, ECE, SE) shows that this phrase is a misconception: many bat species rely on visual perception for navigation. While echolocation is the primary sensory modality for most bat species, some species in free flight use vision together with echolocation as a secondary backup, engaged when the animals are alarmed or to confirm other sensory cues. Baillieul’s research largely centers on learning from animal behaviors, like bat navigation, and applying the findings to control systems.

“This research questions how animals operate and how you might use animal behaviors to design control systems,” Professor Baillieul explains. “We are interested in understanding how the brain and neuromuscular systems, which have been refined by nature, work and using these insights as new models to think about control.”

Baillieul began investigating bio-inspired navigation as an investigator on a number of projects funded by the Office of Naval Research (ONR), which has driven research on animal-inspired control systems for the last few decades.

In 2010, Baillieul was PI on the five-year, $7.5 million ONR Multidisciplinary University Research Initiative (MURI) project titled Animal Inspired Flight with Outer and Inner Loop Strategies (AIRFOILS). In collaboration with researchers at the University of Washington, the University of Maryland, and the University of North Carolina at Chapel Hill, Baillieul and his team studied the flight capabilities of bats across a range of environments. AIRFOILS aimed to translate these biological capabilities into navigation strategies for flight vehicles.

Building on those discoveries, Baillieul is now a Co-PI on the current ONR MURI grant entitled Neuro-Autonomy: Neuroscience-Inspired Perception, Navigation, and Spatial Awareness for Autonomous Robots, led by PI Professor Yannis Paschalidis, Director of the Rafik B. Hariri Institute for Computing and Computational Science & Engineering, Distinguished Professor of Engineering (ECE, SE, BME), and a founding member of the faculty of Computing & Data Sciences (CDS). The project draws on the neurophysiological characteristics of living organisms to establish next-generation perception and navigation in autonomous vehicles (AVs). Neuro-Autonomy researchers are analyzing the visual perception of different living organisms and then developing algorithmic methods to apply those findings to neuro-inspired autonomous robots for land, air, and sea. The goal is to create AVs that can respond to changes in the environment as quickly as living organisms do.

This past May, Baillieul published a paper with his students Chiara Boretti, Philippe Bich, and Yanyu Zhang, entitled “Visual Navigation Using Sparse Optical Flow and Time-to-Transit.” The paper seeks to understand different visual cues, like binocular vision, and to develop software that exploits them for navigation.
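The cue named in the paper's title, time-to-transit, has a simple pinhole-camera interpretation: for a tracked feature at image coordinate x with optical-flow velocity ẋ, the ratio τ = x / ẋ equals the time until the feature crosses the plane through the camera center perpendicular to the direction of travel, and it can be computed without knowing the feature's depth. The sketch below is a hypothetical illustration of that relationship, not the authors' released code:

```python
def time_to_transit(x, x_dot):
    """Estimate time-to-transit tau = x / x_dot for a tracked feature.

    x     : horizontal image coordinate of the feature (focal length 1)
    x_dot : its optical-flow velocity in the image
    """
    return x / x_dot

# Synthetic check: a camera moves forward at speed v toward a point at
# lateral offset X and depth Z. The point projects to x = X / Z, and its
# flow is x_dot = x * v / Z, so tau should come out to Z / v -- the true
# time until the camera passes the point -- with depth never used directly.
X, Z, v = 0.5, 10.0, 2.0
x = X / Z
x_dot = x * v / Z
tau = time_to_transit(x, x_dot)  # equals Z / v = 5.0 seconds
```

In navigation schemes built on this cue, steering can be chosen to balance the average time-to-transit of features on the left and right halves of the visual field, which tends to keep the vehicle centered between obstacles.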

“In the case of binocular flow, there’s a parallax that the brain senses while integrating and registering the images from the left and right eyes,” says Baillieul. “Again, like with the bats, we’re trying to understand how these visual cues play together to give organisms reliable navigation.”

Recent hardware developments, like AVs, have increased research interest in optical flow for navigation. Researchers are now working with new robots and hardware that make it possible to experiment with additional visual cues, like binocular vision, and to explore how to integrate them. Human and animal brains are naturally wired to use these visual perceptions; researchers trying to endow robots with neuro-autonomous visual perception must work on one motion primitive at a time.

“Different visual perceptions are used depending on the setting, whether you’re walking through a field or down a hallway,” Baillieul explains. “What we’re trying to do is put things together in a coherent way so control software can switch seamlessly between visual perceptual modalities as they traverse different environments.”

Baillieul’s work emphasizes the information aspect of control systems and how the brain integrates different sensory modalities, from visual sensing to vestibular feedback, to inform navigation. This is seen in his work with bats, which transition from navigating by eyesight alone to using both visual perception and echolocation. As he delves further into this work, Baillieul aims to study how different regions of the brain are involved in this integration and what role memory plays in animal movement.

Baillieul has an extensive history of working with Department of Defense agencies. In 2007, Baillieul, working with Professor David Castañon, received a MURI grant from the Air Force for the “Behavioral Dynamics in the Cooperative Control of Mixed Human/Robotic Teams” project to study the dynamics of mixed teams, made up of both humans and robots. At the time, the Air Force was moving to replace one-third of its manned air vehicles with robotic UAVs, and understanding the relationships between humans and robots was critical to this goal.

“As machines became more intelligent, research began questioning how to utilize the intelligence in a way that is going to be naturally adopted by people,” says Baillieul. “Acceptance by users is extremely important and as we go forward the question has become how much autonomy should a vehicle have.”

John Baillieul is a Distinguished Professor of Engineering and co-founder of the Boston University Center for Information and Systems Engineering (CISE). He is a Professor of Mechanical Engineering, a Professor of Electrical and Computer Engineering, and a Professor of Systems Engineering. His focus is on robotics, the control of mechanical systems, and mathematical system theory. Professor Baillieul is a past Editor-in-Chief of the IEEE Transactions on Automatic Control. He is an IEEE Fellow for contributions to nonlinear control theory, robotics, and the control of complex mechanical systems. He is also a Fellow of SIAM and a Fellow of IFAC.