Building a robot that’s smarter than any other ever created sounds like a lofty goal, but that’s the purpose behind work by Massimiliano Versace, a research assistant professor at Boston University, and Ajay Joshi, an assistant professor in the Department of Electrical & Computer Engineering (ECE).
Currently, the learning ability of software programs and robots is limited by their programming: engineers must anticipate in advance every scenario a robot might encounter. Versace has dubbed this “special-purpose intelligence” and said that his group is shooting for something more advanced.
In summer 2010, Versace launched the Neuromorphics Lab as part of a National Science Foundation-funded research center. The goal of the research group is to design a new type of computer that can sense, learn, and adapt, just like a living brain.
The Neuromorphics Lab’s main model, MoNETA (Modular Neural Exploring Traveling Agent), is a large-scale software simulation of a biological brain designed by a team led by Anatoly Gorchetchnikov (GRS ’05), a CAS research assistant professor. MoNETA is able to learn from experience rather than being programmed to react to its environment.
“We want to eliminate, as much as possible, human intervention in deciding what the robot does,” Versace told BU Today. To be useful in robotic applications, MoNETA needs a “brain,” or a computing substrate, to support the system.
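The contrast between pre-programmed behavior and learning from experience can be illustrated with a toy example. The sketch below is purely hypothetical and has nothing to do with MoNETA’s actual algorithms: a minimal agent that, through trial-and-error reward feedback alone, discovers which of two options pays off, with no rule hard-coded in advance.

```python
import random

def learn_by_experience(reward_probs, trials=2000, alpha=0.1, epsilon=0.1, seed=0):
    """Epsilon-greedy value learning: the agent is never told which option
    is better; it estimates values purely from observed rewards."""
    rng = random.Random(seed)
    values = [0.0] * len(reward_probs)
    for _ in range(trials):
        # explore occasionally, otherwise exploit the current best estimate
        if rng.random() < epsilon:
            action = rng.randrange(len(reward_probs))
        else:
            action = max(range(len(reward_probs)), key=values.__getitem__)
        reward = 1.0 if rng.random() < reward_probs[action] else 0.0
        # incremental update: nudge the estimate toward the observed reward
        values[action] += alpha * (reward - values[action])
    return values

values = learn_by_experience([0.2, 0.8])
# The agent's learned estimate ranks the richer option higher,
# even though that fact was never programmed in.
```

The point of the sketch is the design principle Versace describes: the behavior (preferring the better option) emerges from interaction with the environment rather than from a human-written rule.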
Joshi and Schuyler Eldridge (ECE ’09, ECE ’15) are designing the low-power custom chips that implement the neural algorithms in MoNETA.
“Building this hardware is rewarding but challenging,” said Joshi. “It’s tough to get all of the algorithms we need on one chip.”
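One reason fitting neural algorithms onto a low-power chip is hard is that operations cheap in software (floating point, division) are expensive in silicon. The following sketch is an illustrative assumption, not the team’s actual design: a leaky integrate-and-fire neuron update written with only integer addition and bit shifts, the kind of simplification hardware designers often reach for.

```python
def lif_step(v, input_current, threshold=1024, leak_shift=4):
    """One timestep of a hardware-friendly leaky integrate-and-fire neuron.

    The leak term is a right-shift (v / 16) instead of a multiply or divide,
    so the whole update needs only adders and shifters in silicon.
    """
    v = v - (v >> leak_shift) + input_current
    if v >= threshold:
        return 0, True   # membrane crossed threshold: spike and reset
    return v, False

# Drive the neuron with a constant input and count spikes over 100 steps.
v, spikes = 0, 0
for _ in range(100):
    v, fired = lif_step(v, 80)
    spikes += fired
```

With a constant input of 80 and a leak of v/16, the potential climbs until it crosses the threshold, fires, and resets, producing a regular spike train from purely integer arithmetic.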
Before Joshi started collaborating on the project just under two years ago, he had been interested in seeing how his ECE research could be linked to biology but wasn’t sure what direction to take. Then, Franco Cerrina, who served as the ECE Department Chair until he passed away in 2010, introduced him to Versace.
“What intrigued me most about this project was that the algorithms had a learning capability,” Joshi said. “This is a really interesting component when you think about how it can be applied to hardware.”
The project is truly a shared effort, with many other researchers involved. Gorchetchnikov is producing algorithms that create lifelike behavior without specifically telling a robot what to do. Others are looking at brain function: postdocs Gennady Livitz (GRS ’11) and Jesse Palma (GRS ’11) and research associate Aisha Sohail are working on the visual systems, while other faculty lead the group’s efforts in technology outreach and commercialization.
“We are a bridge between neuroscience and engineering,” Versace told BU Today. “We are fluent in both languages. We can talk neurotransmitters and molecules with biologists and electronics and transistors with engineers.”
Researchers on the project are from a wide range of backgrounds, including neuroscience, psychology, biology, computer science, engineering, and math.