Cars that learn how to drive themselves by watching other cars

By Gina Mantica for the Hariri Institute for Computing and Computational Science & Engineering

Self-driving cars are powered by machine learning algorithms that require vast amounts of driving data to function safely. But if self-driving cars could learn to drive the way babies learn to walk – by watching and mimicking others around them – they would require far less data. Hariri Institute Junior Faculty Fellow Eshed Ohn-Bar, an assistant professor of electrical and computer engineering, developed an efficient, safe, and collaborative training paradigm in which autonomous vehicles learn by watching and predicting the actions of other cars.

Ohn-Bar and Jimuyang Zhang, a PhD student in electrical and computer engineering, presented their findings recently at the 2021 Conference on Computer Vision and Pattern Recognition (CVPR 2021).

Photo: Hariri Institute Junior Faculty Fellow Eshed Ohn-Bar.

The idea for the team’s training paradigm came from a desire to increase data sharing and cooperation among researchers. Autonomous vehicles require many hours of driving data to learn how to drive safely, yet some of the world’s largest car companies keep their vast troves of data private to prevent competition. “There are a lot of autonomous driving companies, and each company goes through the same process of taking cars, putting sensors on them, paying drivers to drive the vehicles, collecting data, and teaching the cars to drive,” said Ohn-Bar. Sharing driving data could help companies create safe autonomous vehicles faster, allowing everyone in society to benefit from the cooperation. No single company, Ohn-Bar points out, can solve this problem on its own, because current AI systems require so much data to work well. “Billions of miles are just a drop in an ocean of real-world events and diversity. Yet, a missing data sample could lead to unsafe behavior and a potential crash,” Ohn-Bar said.

The researchers’ proposed machine learning algorithm leverages data from the other cars on the road. It estimates the viewpoints and blind spots of nearby cars to create a bird’s-eye-view map of the surrounding environment. These maps help self-driving cars detect obstacles, like other cars or pedestrians, and show how the other cars turn, negotiate, and yield without crashing into anything. A self-driving neural network is then trained by translating the actions of the surrounding vehicles into the autonomous vehicle’s own frame of reference; those watched cars may be human-driven vehicles without any sensors, or another company’s auto-piloted vehicles. Since observations from all of the surrounding cars in a scene feed the algorithm’s training, this “learning by watching” paradigm encourages data sharing, and consequently safer autonomous vehicles.
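To make that frame-of-reference step concrete, here is a minimal Python sketch of the coordinate translation involved. The function names, poses, and waypoints are hypothetical illustrations for this article, not code from the paper:

```python
import numpy as np

def se2_matrix(x, y, yaw):
    """Homogeneous 2D rigid transform (position plus heading)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0., 0., 1.]])

def to_ego_frame(ego_pose, other_xy):
    """Re-express another vehicle's world-frame (x, y) waypoints in the
    ego vehicle's frame, so the watched car's behavior can serve as a
    pseudo-demonstration for the ego policy."""
    world_to_ego = np.linalg.inv(se2_matrix(*ego_pose))
    pts = np.hstack([other_xy, np.ones((len(other_xy), 1))])  # homogeneous coords
    return (world_to_ego @ pts.T).T[:, :2]

# Toy example: the ego car sits at (10, 5) facing "north" (yaw = pi/2).
# A watched car's next two waypoints, translated into the ego frame, could
# then be paired with a bird's-eye-view map of the scene as one training
# sample for a behavior-cloning network.
ego_pose = (10.0, 5.0, np.pi / 2)
watched_waypoints = np.array([[12.0, 8.0], [12.5, 10.0]])
print(to_ego_frame(ego_pose, watched_waypoints))  # -> [[3. -2.], [5. -2.5]]
```

In a setup like this, every visible car in a scene contributes translated trajectories, which is why observing many surrounding vehicles yields far more supervision than the ego car's own driving alone.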

Ohn-Bar and Zhang tested their self-driving cars in two virtual towns: one with straightforward turns and obstacles similar to their training environment, and another with unexpected twists like five-way intersections. In both scenarios, the researchers found that their self-driving neural network got into very few accidents. With just one hour of driving data to train the machine learning algorithm, the autonomous vehicles arrived safely at their destinations 92 percent of the time. “While previous best methods required hours, we were surprised that our method could learn to drive safely with just ten minutes of driving data,” said Ohn-Bar.

Animation: The researchers’ self-driving neural network gets into very few accidents in their virtual town.

While these results are promising, Ohn-Bar said that there are still several open challenges in dealing with intricate urban settings. “Accounting for drastically varying perspectives across the watched vehicles, noise and occlusion in sensor measurements, and various drivers is very difficult,” said Ohn-Bar.

The team’s paradigm for teaching autonomous vehicles to drive themselves could be applied to other technologies as well. “This machine learning paradigm can be applied to many other systems,” said Ohn-Bar. “Delivery robots or even drones could all learn by watching other AI systems in their environment.”