NSF Grant to Fund "Sleepy" Video Network Research

The National Science Foundation has awarded Professor Thomas Little, Associate Professor Janusz Konrad and Assistant Professor Prakash Ishwar, all of the ECE Department, a three-year grant to develop a low-power autonomous video sensor network that records a coastal ecosystem and wirelessly transmits significant information back to a base station for review.

The three-year, $450,000 project includes a demonstration using a 50-sensor network that will observe woodland animals at Boston University’s Sargent Camp and study shorebirds and grey seals at a University of Massachusetts Field Station in Nantucket, Mass. The technology is expected to enable the study of ecological phenomena that are otherwise difficult or impractical to monitor by human observers.

In recent years, a convergence in technology has allowed sensor networks to be miniaturized and manufactured at very low cost. The devices within the network communicate with each other wirelessly and relay pertinent data regarding monitored activity within the network. For example, a multiple-device sensor network could detect open parking spaces in a crowded parking lot and help drivers locate them.

Working with the grant, the ECE researchers will employ the same wireless technology but also add video-streaming capabilities. Because video streaming is both continuous and energy consuming, the biggest challenge facing the group is incorporating the video aspect while maintaining low-power operation.

“Most sensor networks are designed to consume very little energy in order to run on batteries,” Little, the project’s principal investigator, said. “Video networks are at the opposite end of the spectrum. It only takes one bit to convey a car parked in a space. With video you get large quantities of data that need to be sourced and sent to someone to be watched.”

The solution, Little said, is to develop a sensor network that streams video only when needed.

“Adding video to a sensor network requires more complex and energy-consuming components,” Little said. “To preserve a low-power operation with video, the device will ‘wake up’ periodically, take one picture of the environment and decide if there’s anything in that scene that may have changed. If nothing has changed, it goes back to sleep. But if something interesting is happening, we expect to support a full-motion stream across the network.”
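The duty-cycled behavior Little describes can be sketched in a few lines. This is only an illustration of the idea, not the group's actual prototype code: the capture and streaming functions, the wake interval and the change threshold are all hypothetical stand-ins.

```python
import time

WAKE_INTERVAL_S = 60        # illustrative: how long the node sleeps between snapshots
CHANGE_THRESHOLD = 0.05     # illustrative: fraction of pixels that must differ

def frame_difference(frame_a, frame_b):
    """Fraction of pixels whose intensity differs noticeably between two
    equal-length grayscale frames (flattened to 1-D lists)."""
    changed = sum(1 for a, b in zip(frame_a, frame_b) if abs(a - b) > 10)
    return changed / len(frame_a)

def sleepy_loop(capture_frame, stream_video, cycles, sleep=time.sleep):
    """Wake periodically, take one picture, and stream video only if the
    scene appears to have changed; otherwise go straight back to sleep."""
    reference = capture_frame()              # initial view of the scene
    for _ in range(cycles):
        sleep(WAKE_INTERVAL_S)               # low-power sleep state
        snapshot = capture_frame()           # wake up: take one picture
        if frame_difference(reference, snapshot) > CHANGE_THRESHOLD:
            stream_video()                   # interesting event: full-motion video
            reference = capture_frame()      # re-baseline after the event
        # nothing changed: return to sleep without transmitting
```

Because streaming only fires on a scene change, the energy-hungry video path stays idle for the vast majority of wake cycles.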

The network’s “sleepy” nature will eliminate the process of reviewing countless hours of unnecessary video.

“Suppose it’s watching the sea shore,” Little said. “The devices wake up and take a snapshot and nothing is happening. Then the fin of a great white shark has entered the picture. That’s an interesting event. Not only will it detect that event without interfering with the monitored environment, as a human might, it will record the event on video.”

Because actions in a coastal environment are widely dispersed in space and time, predicting specific situations is virtually impossible. The event-driven video stream method is the most feasible observational solution, according to Little.

“The scenarios we’re contemplating involve observations occurring over a time span of months or years, or are the types of events that occur relatively infrequently and might occur across a broad coastal region,” he said. “We’re looking for those rare events that fit some criteria and trigger the camera to go into video mode. The consumption of energy is then justified in the overall mission of the system.”

The group is in the process of building a battery-powered prototype using an ECE-made camera and off-the-shelf components.

“The goal is to enable the creation of readily deployable camera networks,” Little said. “We will be able to interconnect all of the cameras. If one camera believes it’s looking at a shark off of the coast, it can collaborate with other cameras that have a different field of view. This will increase the confidence of successfully classifying objects, in this case as sharks.”
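One simple way to picture the collaboration Little mentions is probabilistic fusion of independent detections. The sketch below is an assumption, not the project's actual classifier: it treats each camera's confidence as an independent observation, so the chance that every camera is wrong is the product of the individual error probabilities.

```python
from math import prod

def fused_confidence(per_camera_probs):
    """Combine per-camera detection confidences (each in [0, 1]),
    assuming the cameras' errors are independent: the fused confidence
    is one minus the probability that every camera is mistaken."""
    return 1 - prod(1 - p for p in per_camera_probs)

# One camera 70% sure it sees a shark, corroborated by two cameras
# with different fields of view at 60% and 50%:
# fused_confidence([0.7, 0.6, 0.5]) -> 0.94
```

Under this independence assumption, even modestly confident cameras with different fields of view sharply raise the network's overall confidence in a classification.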