Can Artificially Intelligent Robots Revolutionize Recycling?
Researchers at BU are designing a robot to make sorting decisions in recycling facilities
Imagine a future where robots sort recyclables from the waste that clogs recycling facilities across the country: glass bottles from dirty food containers, old newspapers from flimsy plastic wrappers. Right now, only human eyes and hands can separate true garbage from what's recyclable, and the job exposes workers to all the health hazards that come with handling garbage and its toxins. Meanwhile, the ocean is drowning in plastic pollution, and recycling in the US is becoming more and more costly. Scientists and sustainability experts see an urgent need for an industry transformation, and artificially intelligent robots on the recycling sorting line, even though they would replace human jobs, could be the answer.
“I think the problems the recycling industry is facing affect humanity in ways that are equally as important as the jobs people have,” says Kate Saenko, a Boston University College of Arts & Sciences associate professor of computer science and director of the Computer Vision and Learning Group. “I always think of the Pixar movie WALL-E, the extreme case where humans just didn’t figure things out until it was too late and it was left to the robots…. We don’t want to reach that point,” she says, half joking, half totally serious.
For the next four years, Saenko will be part of a team designing a recycling robot that automatically identifies and sorts items passing through recycling facilities, with the hope of creating a more efficient, profitable system. The research team received over $2.5 million in funding from the National Science Foundation's Future of Work at the Human-Technology Frontier program to get started on the project this year.
Currently, thousands of workers across the country sort waste materials by hand, but the researchers don't intend to leave those people without work. The team, which includes collaborators from Yale and Worcester Polytechnic Institute, will evaluate opportunities to create new, safer jobs that complement a soon-to-be-automated recycling system.
It’s estimated that the average American throws out 7 pounds of materials every day, or about 2,555 pounds a year per person. That’s a ton of trash (literally, it adds up to roughly one and a quarter tons). And much of it could be recycled rather than end up in landfills, which are a major source of climate change–inducing greenhouse gas emissions. With the recycling industry representing more than 530,000 jobs in the US alone, there is a lot at stake for an industry, currently under economic stress, that has the ability to heavily curb our emissions.
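The arithmetic behind those figures is simple enough to check in a few lines (using the US short ton of 2,000 pounds):

```python
# Back-of-envelope check of the per-person waste figures above.
pounds_per_day = 7
pounds_per_year = pounds_per_day * 365   # 2,555 lb per year
tons_per_year = pounds_per_year / 2000   # a US short ton is 2,000 lb

print(pounds_per_year)  # 2555
print(tons_per_year)    # 1.2775 -- about one and a quarter tons
```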
How will the robot actually work? Think of the system as a robotic arm, Saenko explains. She and co–principal investigator Vitaly Ablavsky, a senior research scientist in BU’s Image and Video Computing lab, are mainly responsible for building the “eye” of the robot, a field known as computer vision. That means designing the algorithm that interprets the data collected by the robot’s visual sensors so it can sense and identify the different items moving along the conveyor belt. Then comes the rest: figuring out what kind of gripper works best, whether the arm should have a suction function, what material to use, and so on.
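The loop Saenko describes (visual sensors feed a model, and the model's output drives the arm) can be sketched as a simple sense-decide-act cycle. Everything below is a hypothetical illustration: the labels, confidence threshold, and actions are invented for the example and are not the team's actual design.

```python
from dataclasses import dataclass

# Illustrative sketch of a sense -> decide -> act loop for a sorting robot.
# Labels, threshold, and actions are hypothetical, not the BU team's design.

@dataclass
class Detection:
    label: str         # what the vision model thinks the item is
    confidence: float  # model confidence, in [0, 1]

RECYCLABLE = {"glass_bottle", "newspaper", "aluminum_can"}

def choose_action(det: Detection, threshold: float = 0.8) -> str:
    """Map one detection on the conveyor belt to a robot action."""
    if det.confidence < threshold:
        return "pass"    # too uncertain: leave the item for a human sorter
    if det.label in RECYCLABLE:
        return "pick"    # grip or suction the item into the recycling stream
    return "reject"      # confidently trash: divert it

print(choose_action(Detection("glass_bottle", 0.95)))    # pick
print(choose_action(Detection("food_container", 0.91)))  # reject
print(choose_action(Detection("plastic_wrapper", 0.40))) # pass
```

In a real system, the `Detection` stub would be replaced by the output of a trained object detector; the low-confidence "pass" branch is one way an automated line could hand ambiguous items back to human sorters rather than guess.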
The group is partnering with the Casella materials recovery facility (MRF) in Worcester, Mass., which Saenko, Ablavsky, BU PhD students Dina Bashkirova and Sid Mysore, and the rest of the team recently visited to begin the tedious task of collecting the data needed to build the algorithm. No dataset currently exists for this purpose, since computer vision research has mostly been driven by emerging applications like autonomous driving and facial recognition, not recycling. So the team is starting from scratch.
“We have good geometric models for the shape of people and automobiles. On the other hand, we don’t have good statistical or shape models for trash and recyclables, especially after they have been deformed, torn, stained,” explains Ablavsky, who will be working on the project remotely as he begins a new research position in the Applied Physics Laboratory at the University of Washington, Seattle, later this year.
He says that visiting the MRF was an eye-opening experience, and it was impressive to see how fast the workers perform their tasks, given the volume and speed with which the materials move on various conveyor belts. “The work shifts are nine hours, so it’s a hard job, and one hopes it can be made a little easier thanks to this effort,” he says. “And from the computer vision standpoint, this is a hard problem. We think that by working together with the team, we can make the problem manageable.”
Given the challenges that await, they plan to make their data publicly available for other researchers to build and improve upon. They hope to have a prototype within the next two to three years, and then spend the last year evaluating the system.