Lighting the Way Forward for Autonomous Vehicles
CISE Faculty Affiliate Ajay Joshi, with collaborators at Lightmatter and Harvard University, receives a $4.8M IARPA grant to develop a new Electro-Photonic Computing (EPiC) system for AI-based navigation in Autonomous Vehicles
Anyone who has ever been behind the wheel of a car knows that response time is crucial. The human sensory system needs to be fully engaged not only to direct the motion of a vehicle, but also to respond instantly to changing conditions and potential hazards on the road and in the surrounding environment. All that focused attention and responsiveness can take a lot out of a person, as you’ll know if you’ve ever found yourself exhausted after a long drive, or one that involved navigating poor weather. Bostonians attempting to get around the city after a nor’easter can certainly relate.
The same goes when the driver is not a human being but a computer: processing all that sensory data takes a huge amount of energy, yet it must be done almost instantaneously. When your phone or dashboard GPS lags, you might face the inconvenience of adjusting your route to take a different exit, but the consequences of similar lag are far higher when it comes to avoiding a collision with another vehicle or a pedestrian.
Autonomous Vehicles (AVs), more popularly known as self-driving cars, are projected to become pervasive in the next decade, but technological hurdles stand between that projection and reality. To operate safely, AV systems must combine data from a variety of sensors, including RADAR, LIDAR, cameras, and other driver-assistance devices. Processing such a vast array of data to direct the vehicle requires trillions of calculations per second. The transistor-based computers used by existing AV systems consume a significant amount of power, limiting vehicle range. And yet, to achieve the operational and safety standards that an autonomously driven future will demand, more sensors, and more processing power, are needed. Currently available systems cannot support the energy cost of those additions.
To address this key issue, researchers at Boston University, the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), and Lightmatter have teamed up to develop a new hybrid system, an Electro-Photonic Computing (EPiC) system, that can answer the triple challenge of processing capacity, low latency, and energy efficiency. The design builds on recent advances in photonic computer chips, which compute with light. Photonic chips use far less energy than traditional electronic chips and can perform on the order of trillions of operations per second.
Backed by a $4.8M grant from IARPA (Intelligence Advanced Research Projects Activity) under the MicroE4AI (Microelectronics in Support of Artificial Intelligence) program, the multi-institutional team proposes to develop a new EPiC AV system that leverages the strengths of both photonics and electronics: photonic hardware performs the large matrix-vector computations, while electronic computing handles non-linear operations and storage. The system will be fully integrated with the AV sensors and able to meet perception, mapping, and planning needs while overcoming the power and performance limitations of current, electronics-only AV systems.
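To make that division of labor concrete, the short Python sketch below is a hypothetical illustration of the split, not the project’s actual software: the dense matrix-vector product stands in for the work a photonic accelerator would do, while the non-linear step runs in ordinary electronic code. The PhotonicAccelerator class and its matvec method are invented for illustration only.

    import numpy as np

    class PhotonicAccelerator:
        """Hypothetical stand-in for a photonic chip that multiplies a fixed
        weight matrix by an input vector (the operation done with light)."""
        def __init__(self, weights):
            self.weights = weights  # weights "programmed" into the optical mesh

        def matvec(self, x):
            # On real photonic hardware this product would be computed optically,
            # at far lower energy than an electronic multiply-accumulate.
            return self.weights @ x

    def epic_layer(accelerator, x):
        y = accelerator.matvec(x)   # linear algebra in the photonic domain
        return np.maximum(y, 0.0)   # non-linear activation (ReLU) in electronics

    # Example: one layer of an AV perception network, conceptually.
    acc = PhotonicAccelerator(np.random.randn(4, 8))
    output = epic_layer(acc, np.random.randn(8))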
Professor Ajay Joshi of ECE will lead the Boston University team, in collaboration with Lightmatter team leader Dr. Darius Bunandar and Harvard University SEAS Professor Vijay Janapa Reddi. The goal of the project is to build a working EPiC AI system that will be fully installed and used to autonomously drive a buggy, likely just the first of a generation of self-driving cars that perform much of their “thinking” with light, and are much safer and more efficient for doing so.