By Jessica Colarossi and Patrick L. Kennedy

We’re beyond the light bulb here. At the BU College of Engineering, photonics and optical systems researchers are collaborating to advance the practical uses of light to tackle global challenges. Working with students and with colleagues in other schools and institutions as well as in industry and government, they’ve developed innovations such as a space telescope—BU’s first device to land on the moon—aimed at understanding our magnetosphere; a medical device that uses light to monitor blood pressure and track cancer; and a novel 3D imaging technology for autonomous navigation by night. And one BU ENG researcher—known for demonstrating how tornado-shaped light beams might make for a greener Internet—has taken the reins of the leading journal in the optics field. Hailing from biomedical engineering, electrical and computer engineering, and mechanical engineering, these researchers are lighting the way forward.

Using light to track blood pressure and tumor treatments

If you light up the tip of your finger with a flashlight, you’ll see a phenomenon called diffuse glow. That’s what happens when all the cells and molecules that make up your finger absorb and scatter the steady beam of light in an instant.

“The light changes direction millions of times so that it turns into a diffuse red glow,” explains Associate Professor Darren Roblyer (BME). Understanding how light interacts with living cells and tissues is the foundation of his work.

Darren Roblyer (BME). Photo by Jackie Ricciardi

Roblyer and his team are testing ways to monitor biological processes—like blood pressure, oxygen levels, and disease progression—with light waves. For example, studying the way different wavelengths create patterns when absorbed and scattered can tell Roblyer about the metabolic signals in a person’s blood. Over the past several years, he’s developed a blood pressure monitoring device that does not involve a cuff squeezing your arm, with the aim of getting a more accurate reading than the current, sometimes uncomfortable, options.

“This technology measures the optical effects of what happens when your heart beats,” says Roblyer, who’s also a member of the BU Photonics Center. Each time your heart beats, blood flow speeds up and then slows down, and, at the same time, arteries expand and contract, increasing and decreasing the volume of blood in the arteries. “We’re measuring both of those things, and we are extracting a whole lot of information from those waveforms and then using that to predict blood pressure.”

The technology, called speckle contrast optical spectroscopy, uses multiple wavelengths of light, from the visible range to near-infrared (NIR), which is just past what our eyes can see, to monitor blood pressure. The device clips over the finger and straps around the wrist. In a recent study, the team found that the device took highly accurate, continuous blood pressure measurements from 30 individuals over several weeks.
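
For the mathematically curious, speckle contrast has a compact definition, and a simplified textbook model (an illustrative form, not necessarily the exact one Roblyer’s device implements) ties it to blood flow:

K = \frac{\sigma_I}{\langle I \rangle}, \qquad K^2 \approx \frac{\tau_c}{2T}\left(1 - e^{-2T/\tau_c}\right)

Here \sigma_I and \langle I \rangle are the standard deviation and mean of the intensity over a small window of camera pixels, T is the exposure time, and \tau_c is the speckle decorrelation time. Faster blood flow shortens \tau_c, blurring the speckle pattern and lowering the contrast K.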

Roblyer is also testing a similar type of optical technology—measuring the absorption and scattering of light waves—for reading metabolic signals of cancer cells. He has been working with Naomi Ko, a BU Chobanian & Avedisian School of Medicine associate professor of medicine and a medical oncologist at Boston Medical Center (BMC), on developing a new tool for monitoring how well breast cancer tumors respond to chemotherapy or radiation treatment.

The metrics that their device measures—like the concentration and ratio of oxygenated and deoxygenated red blood cells—can be used to predict whether a tumor is likely to shrink. Ko and Roblyer have been testing the device in clinical settings and plan to continue analyzing its effectiveness. Eventually, Roblyer wants the device to be smaller and transportable, so that patients can use it at home and send the readings to their doctors without needing to schedule an appointment.
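
The underlying relation is a workhorse of diffuse optics (a textbook sketch; the clinical device’s processing is more involved): tissue’s absorption at each wavelength is a weighted sum of its chromophore concentrations, so measuring at two or more wavelengths lets the oxygenated and deoxygenated concentrations, and from them the oxygen saturation, be solved for:

\mu_a(\lambda) = \ln(10)\left[\varepsilon_{\mathrm{HbO_2}}(\lambda)\, c_{\mathrm{HbO_2}} + \varepsilon_{\mathrm{Hb}}(\lambda)\, c_{\mathrm{Hb}}\right], \qquad \mathrm{StO_2} = \frac{c_{\mathrm{HbO_2}}}{c_{\mathrm{HbO_2}} + c_{\mathrm{Hb}}}

The \varepsilon terms are known molar extinction coefficients; the c terms are the concentrations to be recovered.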

“This research is driven by collaboration,” says Roblyer. “We’ve assembled a multidisciplinary team—including engineers, physicists, physicians, nurses, hospital administrators, business and regulatory specialists, manufacturing experts, students—PhD, master’s, and undergraduate—as well as volunteers and patients. Each of these perspectives is essential for developing the technology and implementing it into the standard of care.”

“One of the most important things I think I do is, as we’re developing these technologies, we’re talking to a lot of physicians, understanding what their unmet needs are, and helping to understand whether our technologies could help,” Roblyer adds. “My hope for this work is to make a real impact in the lives of patients.”

From the moon, a BU-built telescope shows us solar wind

On March 2, after traveling 238,855 miles from Cape Canaveral, a shiny, golden spacecraft touched down on the moon. Among the 10 aerospace instruments carried by the autonomous moon lander, a telescope pointed back at our home planet. The Lunar Environment heliospheric X-ray Imager (LEXI) was designed and built by Associate Professor Brian Walsh (ME, ECE) and his colleagues. It is the first BU-built device ever to land on another planetary body, and it captured the first-ever images of the boundary of Earth’s magnetic field.

Part of NASA’s Blue Ghost Mission 1, LEXI has given Earthlings an unprecedented view of our magnetosphere, the magnetic bubble that shields us from harmful radiation, deflecting the constant flow of solar wind and high-speed charged particles emanating from the sun.

Brian Walsh (ME, ECE)

Walsh began measuring X-ray signals in the atmosphere as a postdoctoral researcher at NASA’s Goddard Space Flight Center in 2009. His team at BU received funding from NASA to develop the LEXI telescope in 2019. In the years since, the team worked hard choosing materials, doing the math to determine the ideal dimensions, adding electronics and computing systems, crafting specially engineered glass lenses, and testing for durability. The 24-pound telescope needed to withstand intense vibrations and temperature swings, and communicate seamlessly with the lab’s control room. The team collaborated with researchers from NASA Goddard, Johns Hopkins University, the University of Miami, and the University of Leicester.

The device’s innovative optical lenses mimic lobster eyes. The technology, prototyped in the 1990s and inspired by the way lobsters see in dark, murky environments, picks up even the faintest glowing X-ray signals, called soft X-rays. The crustacean-inspired lenses in LEXI were specially fitted to withstand space flight.

After the team completed LEXI and successfully tested it, they transported the device by truck to Firefly Aerospace’s headquarters in Austin, Texas, where it was installed in the Blue Ghost lander. After the launch and landing, Walsh and his students stayed connected to the telescope through the lander’s computer systems, receiving the X-ray signals that helped paint a picture of the boundary of Earth’s magnetic field.

Those X-rays are released when a charged atom emitted from the sun, like an oxygen ion, slams into a neutral particle, like hydrogen, which floats around in abundance in Earth’s outer atmosphere. When the particles collide, the oxygen ion steals an electron from the hydrogen, and that process releases an X-ray. LEXI recorded those invisible wavelengths of light, constantly present around our planet, for seven days. After that, the sun set on the moon. It is presumed that the icy temperatures—dipping as low as –208 degrees Fahrenheit—then disabled the lander and all of its payloads permanently.
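
The emission mechanism, known as solar wind charge exchange, can be sketched in one line (the oxygen charge state here is illustrative; the article doesn’t specify it). The ion captures an electron from neutral hydrogen into an excited state, then relaxes by emitting a soft X-ray photon:

\mathrm{O}^{7+} + \mathrm{H} \longrightarrow \mathrm{O}^{6+\ast} + \mathrm{H}^{+}, \qquad \mathrm{O}^{6+\ast} \longrightarrow \mathrm{O}^{6+} + h\nu \quad (\text{soft X-ray})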

In that short window, LEXI transmitted data that will help answer “big outstanding questions,” Walsh says, like whether we can predict when and how Earth receives solar energy in the amounts that cause geomagnetic storms.

“We live in this bubble, this magnetosphere,” says Walsh. “Some days, a lot of energy breaks into that magnetic bubble. We’re trying to understand how that process works.”

Using the light we can’t see

When you look out your window at night, you expect to see objects—a tree, a neighbor’s house—illuminated by street lamps or moonlight. If there were a power outage on a moonless night, you’d see only darkness.

That doesn’t mean there’s no light out there, though. “There is light,” says Professor Vivek Goyal (ECE). “It’s just at wavelengths that you can’t see with the naked eye.”

Vivek Goyal (ECE)

With the aid of an ordinary thermal camera or night vision goggles, you could see something—at least the outlines of nearby objects. But Goyal says a much richer sense of the surroundings can be gleaned from that invisible-to-us light, and his team is developing the more sophisticated data processing needed to do it. Someday, their 3D imaging technology might be used for mapping and navigation for autonomous vehicles, among other applications.

“You can infer distance,” says Goyal. “The atmosphere is not only absorbing light but also emitting light, as a function of wavelength, and we can mathematically model that. There’s different absorption at different wavelengths as light travels through the air, so light that’s traveled a longer distance has a different spectrum than light that was emitted very close to you.”
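
Goyal’s description maps onto a standard radiative-transfer relation. As a simplified sketch (assuming uniform air; the lab’s actual model is surely more detailed), light from an object at distance d arrives attenuated by wavelength-dependent absorption \alpha(\lambda), while the intervening air contributes emission of its own:

L_{\mathrm{obs}}(\lambda) = L_{\mathrm{obj}}(\lambda)\, e^{-\alpha(\lambda) d} + L_{\mathrm{air}}(\lambda)\left(1 - e^{-\alpha(\lambda) d}\right)

Because \alpha(\lambda) varies sharply across the thermal band, measurements at several wavelengths constrain the same unknown d, which can then be estimated.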

Goyal and colleagues have begun successfully picking up distance cues by passively measuring thermal radiation at these various wavelengths, which are too long for the naked eye to see. Their sensor technology is passive in the sense that it detects light but doesn’t emit any.

Traditional thermal imaging is almost “too easy,” Goyal says. “A lot of the prior work was related to the Air Force, where they studied tracking a missile or an airplane—something much hotter than the atmosphere. We want to be able to use this absorption principle to do ranging [determining distances] for scenes where the objects are not necessarily hotter than the air at all—in fact, the objects could be colder than the air.”

“We separate out the effects of material and temperature,” Goyal says. “So if an autonomous vehicle is navigating at night, and an obstacle is just about the same temperature as the road, it would look the same to an ordinary thermal camera, whereas our sensor would discern the difference and be able to navigate around it.”
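
In equation form (again a textbook sketch, not the lab’s exact pipeline), an opaque surface’s thermal signature mixes its material emissivity \varepsilon(\lambda) with the Planck radiance B(\lambda, T) at its temperature T:

L_{\mathrm{surface}}(\lambda) \approx \varepsilon(\lambda)\, B(\lambda, T) + \left[1 - \varepsilon(\lambda)\right] L_{\mathrm{down}}(\lambda)

where L_{\mathrm{down}} is ambient radiation reflected off the surface. Two objects at the same temperature but made of different materials have different \varepsilon(\lambda), which is what lets a multi-wavelength sensor tell the road from the obstacle.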

The students and postdocs in Goyal’s lab hail from disciplines including computer science, materials science, electrical engineering, and computer engineering, and his colleagues include researchers at MIT, the National Institute of Standards and Technology, and the Jet Propulsion Laboratory.

“Research is so social,” says Goyal. “A lot of it has to do with connecting with people with the same interests.”