Imaging Method Promises to Upgrade Remote Sensing and Microscopy

By Mark Dwortzan

Emulating a conventional LIDAR system, Assistant Professor Vivek Goyal's (ECE) team used pulses from a focused laser source to illuminate one scene patch at a time. (Image by Dheera Venkatraman, MIT Research Laboratory of Electronics)

Imagine two hiring managers sizing up an applicant. The first gathers all the information she can before forming a first impression. The second collects the bare minimum but does so strategically, arriving at virtually the same impression with far less effort and in far less time.

It turns out that the latter approach can be taken to produce reasonably accurate photos of objects under low lighting conditions using a remote sensing technology such as LIDAR, which bounces pulsed laser light off of a targeted object to form an image. Rather than waiting to collect and compare hundreds of reflected photons to generate each pixel of the image, as is typically done, you can instead count the number of laser pulses it takes to detect the first photon at each pixel. The lower the number, the greater the intensity of the light reflected off the object’s surface—and thus, the brighter the pixel.
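The counting principle can be sketched in a few lines of code. In this illustrative simulation (not the authors' actual system), each pixel's per-pulse detection probability is taken to be proportional to the surface reflectivity, so the number of pulses until the first detected photon follows a geometric distribution; the reciprocal of that count then serves as a rough brightness estimate. The reflectivity values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 scene of surface reflectivities, expressed as the
# probability that a given laser pulse yields a detected photon.
reflectivity = np.array([[0.8, 0.4],
                         [0.2, 0.05]])

# For each pixel, count laser pulses until the first photon arrives.
# With detection probability p per pulse, this count is geometrically
# distributed with mean 1/p — fewer pulses implies a brighter surface.
pulses_to_first_photon = rng.geometric(reflectivity)

# Estimate brightness as the reciprocal of the pulse count.
estimated_brightness = 1.0 / pulses_to_first_photon
```

This toy model ignores background (ambient) photons, which is exactly the noise the full method must filter out.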

Assistant Professor Vivek Goyal (ECE), who joined the College of Engineering faculty in January, calls his method “first-photon imaging.” Along with former colleagues at MIT’s Research Laboratory of Electronics, he demonstrated the concept in a recent issue of the journal Science.

“The project started out as a thought experiment,” said Goyal, whose research was funded by the Defense Advanced Research Projects Agency’s (DARPA) Information in a Photon Program, and the National Science Foundation. “We wondered what we could infer about a scene from detecting only one photon from each pixel location, and eventually realized that when the intensity of light is very low, the amount of time until you detect the photon gives you information about the intensity of the light at each pixel.”

First-photon imaging may ultimately improve night vision and low-light remote sensing technologies by extending the distance at which images may be taken. The new method may also dramatically increase the speed of biological imaging and the variety of samples—many of which degrade when subjected to higher-intensity lighting—that can be photographed.

To produce a high-quality image from the raw, single-photon-per-pixel data, Goyal’s method applies a computer model of the surfaces and edges typically found in three-dimensional, real-world objects, correcting the intensity and depth of neighboring pixels as needed to fit the model, and filters out noise from ambient light sources.
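The article does not spell out the reconstruction algorithm. As a loose stand-in for the idea of correcting a pixel to agree with its neighbors, the sketch below applies a simple 3×3 median filter, which suppresses isolated noise hits (such as stray ambient photons) while preserving edges better than plain averaging. The actual method uses a far more sophisticated, physics-based regularization:

```python
import numpy as np

def median_filter_3x3(img):
    """Replace each pixel with the median of its 3x3 neighborhood
    (borders padded by reflection)."""
    padded = np.pad(img, 1, mode="reflect")
    # Stack the nine shifted views of the image, then take the
    # per-pixel median across them.
    stack = np.stack([padded[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)

# A flat bright patch with one spuriously dark pixel — the kind of
# outlier an ambient-light noise hit would produce.
img = np.full((5, 5), 0.9)
img[2, 2] = 0.05
smoothed = median_filter_3x3(img)  # the outlier is pulled back toward 0.9
```

The median (rather than the mean) is the key design choice here: a single bad pixel cannot drag its neighborhood with it, which mirrors the goal of correcting pixels that disagree with the surrounding surface.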

While many researchers are pursuing new techniques to boost remote sensing and microscopy capabilities, most focus on building more effective detectors. Goyal is working to significantly enhance existing detectors by incorporating accurate physical models in signal processing, and to further explore the potential impact of first-photon imaging on remote sensing and microscopy.
