Mobile robots, which can move around their environment, are used by everyone from NASA to consumers – just look at the Mars Pathfinder’s rover, Sojourner, or the popular robotic floor cleaner, the Roomba.
Autonomous navigation capability can be built into mobile robots using neuromorphic algorithms. These algorithms are typically coded in software and run on generic hardware in the robot, but this approach can carry a high overhead in power and performance.
Boston University’s Assistant Professor Ajay Joshi (ECE) and Senior Research Scientist Massimiliano Versace (CAS) believe that this could all change if a purely hardware-based, customized design option were available.
“This solution would allow us to seamlessly close the loop between model and behavior in mobile robots working in real-time environments,” Joshi and Versace wrote in their project description for “Plastic Neuromorphic Hardware for Autonomous Navigation in Mobile Robots.”
The National Science Foundation’s Center of Excellence for Learning in Education, Science and Technology (NSF CELEST) is interested in Joshi and Versace’s work and recently awarded them $97,417 for the project.
“I’m very excited about moving this project forward,” said Joshi. “This will be a good opportunity to establish a collaboration between ENG and CAS at BU.”
Joshi and Versace will work through the Boston University Neuromorphics Laboratory and the Integrated Circuits and Systems Group to develop both the hardware and the neural algorithms. They hope to close the gap between neural models and their applications in mobile robotic platforms.
NASA has already expressed interest in this project, and its findings could also be applied by other organizations such as the Office of Naval Research and the Air Force Office of Scientific Research.
-Rachel Harrington (email@example.com)