Robotics, artificial intelligence and machine learning might sound like cold, impersonal, or even abstract tools, but in the hands of a societal engineer, they can be wielded to make real improvements in human health.

By Patrick L. Kennedy and Maureen Stanton

At the Boston University College of Engineering, researchers across biomedical engineering, electrical and computer engineering, and mechanical engineering are crossing disciplinary lines and combining their areas of expertise to develop robotic and AI technologies, allowing them to better see, study and solve health problems of all kinds. These projects range from the cellular level to the level of a human organ and all the way up to the systems level—encompassing the medical data of thousands of patients—and even beyond that, to the “systems of systems” level, examining and improving how the infrastructure in the places we live affects the health of the community.

Spotting cells over time

Associate Professor Mary Dunlop (BME) has spent the past few years using machine learning methods to pull more data out of bacterial studies than human researchers could ever manage alone.

Let’s take one example—antibiotic resistance—where Dunlop is gaining traction. For reasons that remain unclear, genetically identical bacteria in the same environment can develop minute differences in behavior. These tiny divergences carry big implications for their response to antibiotic treatment—one bacterium might be killed off, while another survives by evading the antibiotic.

To find out why, researchers need to be able to image thousands of single cells over time. “The way we do this,” says Dunlop, “is by using time-lapse microscopy, where we grow bacteria under the microscope and image them over the course of hours to days while they are growing in a microfluidic chip.”

Frame-by-frame images of two genetically identical cell lineages over time. The cells in the images at bottom were killed off by the application of an antibiotic (highlighted by the red vertical line), while the cells in the images at top evaded the treatment and survived.

In the example pictured here, two genetically identical lineages of E. coli (top and bottom panels; horizontal axis shows time) react differently to an application of the antibiotic ciprofloxacin (shown by the vertical red line). The cells in the lineage shown in the top panel survive, while the others die.

“This one happens to have more proteins related to drug resistance than the others,” says Dunlop. “The ones that survived were much more likely to be in the midst of expressing a particular gene we were interested in. The ones that died were not.” But those are just two lineages out of the microfluidic chip’s full array of over 10,000 cell lineages.

“This is where we’ve started using machine learning,” says Dunlop. Her lab has developed an algorithm they call DeLTA, for Deep Learning for Time-lapse Analysis. The program can recognize the same cells from frame to frame of a video captured by microscopy.

“What we’ve done in our lab is figure out how to apply advanced computer vision techniques to images of growing cells,” Dunlop says. “This has massively increased our throughput and our ability to analyze this type of movie, so that now we can examine not just tens of cells but hundreds to thousands to even millions.”
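
For readers curious about the mechanics, the two-step idea behind this kind of pipeline can be sketched in a few lines of Python: segment the cells in each frame, then link those segmentations from one frame to the next. The sketch below is purely illustrative and is not the DeLTA code; it assumes segmentation masks (say, from a trained network) are already in hand and links cells by matching nearby centroids.

```python
# Illustrative two-step cell-tracking sketch -- not the actual DeLTA code.
# Assumes per-frame segmentation masks (e.g., from a trained network) already exist,
# and links cells between consecutive frames by matching nearby centroids.
import numpy as np
from scipy.ndimage import label, center_of_mass
from scipy.optimize import linear_sum_assignment

def centroids(mask):
    """Return an (N, 2) array of cell centroids from a binary segmentation mask."""
    labeled, n = label(mask)
    if n == 0:
        return np.empty((0, 2))
    return np.array(center_of_mass(mask, labeled, np.arange(1, n + 1)))

def link_frames(prev_mask, next_mask, max_dist=10.0):
    """Match cells in consecutive frames; returns a list of (prev_index, next_index) pairs."""
    a, b = centroids(prev_mask), centroids(next_mask)
    if len(a) == 0 or len(b) == 0:
        return []
    cost = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise centroid distances
    rows, cols = linear_sum_assignment(cost)                        # optimal one-to-one matching
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_dist]
```

DeLTA itself relies on deep networks for both the segmentation and the tracking steps; the simple centroid matching above merely stands in for that learned tracking step, and real pipelines also have to handle cell division and cells entering or leaving the field of view.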

That’s critical, Dunlop says, because the protein or gene expression variations that researchers are looking for are quite rare. By unlocking the mysteries of how and why some bacterial cells evade antibiotics, these studies will help researchers improve the treatments, making them more effective for more patients.

Mary Dunlop (BME)

Dunlop’s lab includes postdoctoral researchers trained in chemical engineering, biochemistry and cell biology, computational biology, theoretical computer science and quantum physics, among other disciplines. Her PhD students are in biomedical engineering as well as molecular biology, cell biology and biochemistry. The first publication of DeLTA started as a collaboration with a graduate student from the lab of Professor Ji-Xin Cheng (ECE, BME, MSE).

“The people in my group who are working on this have had to master skills in machine learning and AI in addition to working on microscopy studies and cloning, plus they need to know about antibiotic resistance,” says Dunlop. “Bringing all of these skills to bear on these problems has required just an incredible breadth of different tool sets and backgrounds to make the solutions possible.”

Steering from afar

Imagine standing outside a hedge maze and shoving a garden hose into it. Now, holding one end of the hose in your hands, you’re trying to maneuver the other end of the hose in order to hook an unseen ribbon that’s stuck on a little branch deep within the shrubbery. Oh, and even though you know roughly where the ribbon is, a strong gust of wind periodically blows through the bushes, moving all the branches out of place.

If you can see how the physics are against you here, then you understand the task facing pulmonologists as they seek a biopsy of a lung cancer nodule using a conventional bronchoscope, says Assistant Professor Sheila Russo (ME, MSE).

Sheila Russo (ME, MSE). Photo by Dana J. Quigley

“Meanwhile, people continue to die,” Russo says. “This is what keeps me and my colleagues motivated to come into the lab in the morning. We’re engineering soft robotic solutions to this societal problem in healthcare.”

Lung cancer is the deadliest form of cancer worldwide, partly due to the difficulty in catching the disease at its earliest stage, when it is most curable, says Russo. The lungs are a complex pair of organs, with the trachea branching out into a maze of smaller and smaller airways and ducts. Most cancerous lesions develop in those tiny ducts, way out in the periphery of the lung.

The location of a possible tumor nodule might be identified by a CT scan, but in order to extract a biopsy, a clinician needs to thread a bronchoscope into the lung’s periphery, find the tumor and puncture it, as in the hedge-maze scenario with the hose conjured above. Meanwhile, the patient’s breathing motion makes the nodule a moving target. As a result, lung tissue might be punctured in the wrong place, and patients may go home still not knowing whether they even have a tumor.

Russo has a solution. With an interdisciplinary team combining expertise in mechanical engineering, materials science and engineering, biomedical engineering, and clinical medicine, she has developed a soft-robotic-assisted bronchoscope. With remote control, a pulmonologist can steer the business end of the scope and deploy the needle with accuracy from outside the patient. The innovation recently earned Russo’s lab a prestigious National Institutes of Health (NIH) Research Project Grant (also known as an R01).

Made of soft materials, Russo’s bronchoscope can rotate more than 180 degrees and curl back on itself, allowing for more nimble navigation in the lung’s periphery.

Whereas a traditional bronchoscope is six millimeters in diameter and has only 120 degrees of rotation, the Russo team’s robotic scope has a diameter of 2.4 millimeters and can rotate more than 180 degrees. Embedded with a camera as well as a needle, the device moves by means of three independent fluid pressure-driven actuators and uses a kind of airbag to stabilize itself in the moving airways, a bit like a subway rider grabbing a handrail. The device is made of soft materials that allow the tip to curl back onto itself.
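
The article doesn’t spell out how chamber pressures translate into tip motion, but a common way to reason about a three-chamber soft actuator is the textbook constant-curvature approximation: each pressurized chamber lengthens, and the differences in length determine how far, and in which plane, the tip bends. The Python sketch below is a toy model under that assumption; the geometry and the pressure-to-elongation gain are made-up numbers, not the parameters of Russo’s device.

```python
# Toy constant-curvature model of a three-chamber soft bending actuator --
# a textbook approximation, not the Russo lab's actual design or controller.
# All geometry and gain values below are made up for illustration.
import numpy as np

D = 0.0008    # assumed chamber offset from the centerline, in metres
L0 = 0.02     # assumed actuator rest length, in metres
GAIN = 1e-8   # assumed elongation per unit pressure, in metres per pascal

def tip_position(p1, p2, p3):
    """Approximate tip position (x, y, z) in metres from three chamber pressures in pascals."""
    l = L0 + GAIN * np.array([p1, p2, p3])      # each pressurized chamber lengthens
    s = l.mean()                                 # arc length of the bent backbone
    # Constant-curvature mapping from chamber lengths to curvature and bending plane
    sq = l[0]**2 + l[1]**2 + l[2]**2 - l[0]*l[1] - l[1]*l[2] - l[0]*l[2]
    kappa = 2 * np.sqrt(max(sq, 0.0)) / (D * l.sum())
    phi = np.arctan2(np.sqrt(3) * (l[2] - l[1]), l[1] + l[2] - 2 * l[0])
    if kappa < 1e-9:                             # equal pressures: the tip stays straight
        return np.array([0.0, 0.0, s])
    r = 1.0 / kappa                              # bending radius
    theta = kappa * s                            # total bend angle along the arc
    return np.array([r * (1 - np.cos(theta)) * np.cos(phi),
                     r * (1 - np.cos(theta)) * np.sin(phi),
                     r * np.sin(theta)])

# Pressurizing one chamber more than the others bends the tip away from that chamber.
print(tip_position(150_000, 50_000, 50_000))
```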

“This allows you to navigate into very complex configurations within the lungs in very deep locations,” says Russo. “It’s a more precise, more accurate approach.”

Russo and her collaborators were able to miniaturize the scope by using new materials and even new fabrication processes to produce the robot. “We had to start from scratch and develop novel manufacturing technologies that enable us to scale these devices down as much as possible,” she explains.

Key to the development of the device has been the input of pulmonology experts such as Assistant Professor Ehab Billatos of the BU Chobanian & Avedisian School of Medicine.

“For us as robotics engineers,” says Russo, “we definitely want to have clinicians on the team who can tell us, ‘On a daily basis, this is my struggle. Can you engineer a robotic solution that can make my job easier, and at the end of the day, improve the health of my patients?’ So, we’re working at the interface between pulmonology, manufacturing, mechanical engineering, electrical engineering, controls, and software engineering. There’s really a variety of different skills that have to come together to be able to successfully work on this research.”

Ultimately, as the team further develops the technology with the NIH R01 grant, Russo hopes to simplify the control of the device with semiautomatic navigation, making it as easy as possible to train surgeons in its use.

“And that means you can bring good quality healthcare to remote areas,” says Russo. “We’re lucky here in Boston—we have the best hospitals. But I come from a very small rural town, and probably the best hospital my family and friends have is about two hours away. Imagine putting technology like this into the hands of clinicians there and in other areas in the country and the world, so everyone can have access to that level of high-quality healthcare that we all need.”

AI to tailor meds

For the nearly half of Americans with hypertension, it’s a potential death sentence—close to 700,000 deaths in 2021 were caused by high blood pressure, according to the US Centers for Disease Control and Prevention. It also increases the risk of stroke and chronic heart failure. And if it’s not caught early, it can be tough to treat. Although physicians have a bevy of potential hypertension medications to choose from, each comes with its own pros and cons, making prescribing the most effective one a challenge: beta-blockers slow the heart, but can exacerbate asthma; ACE inhibitors relax blood vessels, but can lead to a hacking cough. Now, a new artificial intelligence program might help doctors better match the right medicines to the right patients.

The data-driven model, developed by Distinguished Professor of Engineering Yannis Paschalidis (ECE, BME, SE) in collaboration with BU data scientists and physicians, as well as biomedical and electrical and computer engineers, aims to give clinicians real-time hypertension treatment recommendations based on patient-specific characteristics, including demographics, vital signs, past medical history and clinical test records. The model has the potential to help reduce systolic blood pressure—measured when the heart is beating rather than resting—more effectively than the current standard of care.

Yannis Paschalidis (ECE, BME, SE). Photo by Dana J. Quigley

“This is a new machine learning algorithm leveraging information in electronic health records and showcasing the power of AI in healthcare,” says Paschalidis. “Our data-driven model is not just predicting an outcome, it is suggesting the most appropriate medication to use for each patient.”

Currently, when choosing which medication to prescribe a patient, a doctor considers the patient’s history, treatment goals, and the benefits and risks associated with specific medicines. Still, selecting the right drug can be a bit of a coin toss.

By contrast, the BU-developed model generates a custom hypertension prescription, giving physicians a list of suggested medications with an associated probability of success.
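
The published model doesn’t reduce to a few lines, but the ranked-list idea can be illustrated with a deliberately simplified pattern: fit one outcome model per candidate drug class on historical records, then score a new patient against each and sort by predicted probability of success. The sketch below is purely illustrative; the drug classes, feature columns, and definition of success are hypothetical, not the Paschalidis group’s actual model.

```python
# Minimal illustration of a ranked "medications with probability of success" output --
# not the Paschalidis group's model. Drug classes, feature columns, and the success
# definition are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression

DRUG_CLASSES = ["ace_inhibitor", "beta_blocker", "calcium_channel_blocker", "thiazide"]

def train_per_drug_models(records: pd.DataFrame, feature_cols: list[str]) -> dict:
    """Fit one success/failure classifier per drug class from historical treatment records.

    `records` is assumed to have one row per treated patient, with a `drug` column,
    patient feature columns, and a binary `success` column (e.g., systolic BP reached target).
    """
    models = {}
    for drug in DRUG_CLASSES:
        subset = records[records["drug"] == drug]
        models[drug] = LogisticRegression(max_iter=1000).fit(subset[feature_cols], subset["success"])
    return models

def recommend(models: dict, patient: pd.DataFrame) -> list[tuple[str, float]]:
    """Rank drug classes for one patient (a one-row DataFrame with the same feature columns)."""
    scores = {drug: float(m.predict_proba(patient)[0, 1]) for drug, m in models.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

The real system does considerably more, including the affinity-group structure and clinical validation described below; the sketch only shows the shape of the output a clinician would see.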

“Our goal is to facilitate a personalization approach for hypertension treatment based on machine learning algorithms,” says Paschalidis, “seeking to maximize the effectiveness of hypertensive medications at the individual level.”

The model was developed using de-identified data from 42,752 hypertensive patients of Boston Medical Center (BMC), BU’s primary teaching hospital, collected between 2012 and 2020. Patients were sorted into affinity groups based on characteristics such as demographics and medical history. During the study, the model’s effectiveness was compared to the current standard of care, as well as three other algorithms designed to predict appropriate treatment plans. The researchers found it achieved a 70.3 percent larger reduction in systolic blood pressure than standard of care. The algorithm was clinically validated, with the researchers manually reviewing a random sample of 350 cases.

The model also showed the benefits of reducing or stopping prescriptions for some patients taking multiple medications. According to the research team, because the algorithm provides physicians with several suggested optimal therapies, it could give valuable insights when the medical community is divided on the effectiveness of one drug versus another.

“These advanced predictive analytics have the ability to augment a clinician’s decision making and to have a positive impact on the quality of care we deliver, and therefore the outcomes for our patients,” says Rebecca Mishuris, who was previously an assistant professor at BU Chobanian & Avedisian School of Medicine and is now Mass General Brigham’s chief medical information officer. “This is an important first step that shows that these models actually perform better than standard of care and could help us be better doctors.”

“Using data from the diverse patient population of Boston Medical Center, this model provides the opportunity to tailor care for underrepresented populations, with individualized recommendations to improve outcomes for these patients,” says Nicholas J. Cordella, a BU Chobanian & Avedisian School of Medicine assistant professor and BMC medical director for quality and patient safety. “Personalized medicine and models like this are an opportunity to better serve populations that aren’t necessarily well represented in the national studies or weren’t taken into account when the guidelines were being made.”

Traffic lights smarten up

The south side of the BU Bridge features “one of the worst intersections in the entire universe,” charges Distinguished Professor of Engineering Christos Cassandras (ECE, SE), who navigates that crossroads with Commonwealth Avenue twice daily. Driving in general comes with myriad irritations—and, to Cassandras, opportunities for systems engineering to improve matters. “One of my frustrations,” he says, “especially toward the evening when traffic gets less heavy, is when you get stuck at a red light and you realize there isn’t even any traffic on the perpendicular street.” It’s especially maddening, Cassandras adds, when the technology exists to make commuting saner.

Christos Cassandras (ECE, SE). Photo by Dana J. Quigley

This is one reason Cassandras studies traffic problems and potential solutions. “My work focuses on large systems—sometimes referred to as ‘systems of systems’—with many dynamic agents,” says Cassandras. “I look at how to coordinate these agents so that they cooperate in order to meet specific system-wide objectives.”

In many such systems, these agents are competing for a common resource—a prime example being cars competing for space on the road. This thorny problem is more than an annoyance; traffic causes real harm. Across Boston, 32 people were killed in car accidents in 2021. And, all the congestion—at the BU Bridge and elsewhere—takes a toll on the wider world. Bumper-to-bumper back-ups are responsible for 20 percent of fuel consumption, along with the resulting carbon emissions.

To Cassandras, the answer lies in cooperation. His guiding principle here is that it is simply more efficient for agents to cooperate than to compete. “The secret to making any of these technological solutions work is to convince all of us that it’s better to behave in particular ways which achieve a social optimum, as opposed to a selfish optimum,” he says.
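
A classic toy example from routing games makes the point concrete (it is a standard illustration, not a model of any BU project). Suppose drivers choose between two roads: road A always takes an hour, while road B takes x hours when a fraction x of the drivers use it. Left to their own devices, everyone picks road B, since it never looks worse than A, and the average trip takes a full hour; a coordinator who splits traffic evenly cuts the average to 45 minutes.

```python
# Pigou's two-road example: selfish routing vs. a cooperative split.
# Road A always takes 1 hour; road B takes x hours when a fraction x of drivers use it.

def average_travel_time(frac_on_b: float) -> float:
    """Average travel time when a fraction `frac_on_b` of drivers choose road B."""
    return (1 - frac_on_b) * 1.0 + frac_on_b * frac_on_b

print(average_travel_time(1.0))  # selfish equilibrium: everyone on B -> 1.00 hour on average
print(average_travel_time(0.5))  # social optimum: split evenly      -> 0.75 hours on average
```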

After years of studying traffic and devising smart parking and other solutions in Boston and other large US cities, Cassandras is now collaborating with BU computer scientists and industry and municipal partners on a project that, at first, seems more modest in scale: they’re focusing on a single intersection in a tiny Swedish village. If the team succeeds, that intersection will become a proving ground for smarter traffic management practices that could spread around the globe.

The project is an initiative of the Red Hat Collaboratory at Boston University, a partnership between BU and Red Hat, an IBM subsidiary that is one of the world’s leading providers of open-source software.

A digital twin of Veberöd, Sweden, was created using 50,000 aerial photos and sensor data from all over town.

The village is Veberöd, Sweden. Its population of 5,000 is forward-looking and sustainability-minded, says Cassandras, and they’ve installed dozens of sensors around town to monitor air and water quality, among other data. Veberöd native and smart-city advocate Jan Malmgren has combined that data with 50,000 aerial photos of the village to create a digital twin of Veberöd. In this virtual village, the BU–Red Hat researchers can run simulation studies, testing tech solutions to real problems.

To start, the team is tackling the village’s central crossroads, where radar cameras monitor auto and foot traffic. Using the real-life data, Cassandras’ students “have done an incredible job simulating traffic,” wrote consultant Chris Tate in a Red Hat blog post.

Quickly realizing that the town’s existing light-change pattern was inefficient, Cassandras and team “put their advanced scientific and technical backgrounds to work to develop a new traffic pattern that optimized the traffic flow in all directions for both vehicles and people traffic,” Tate wrote.
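
Tate’s post doesn’t spell out the team’s method, but the basic intuition (give each approach green time in proportion to the traffic it actually carries, while guaranteeing a minimum green for pedestrians) can be sketched in a few lines of Python. The cycle length, minimum green, and traffic counts below are hypothetical.

```python
# Toy demand-proportional signal timing -- an illustration of the general idea,
# not the BU/Red Hat team's actual optimization. All numbers are hypothetical.

CYCLE = 60.0      # assumed total signal cycle, in seconds
MIN_GREEN = 8.0   # assumed minimum green per approach (e.g., for pedestrians), in seconds

def green_splits(vehicles_per_min: dict[str, float]) -> dict[str, float]:
    """Split one signal cycle among approaches in proportion to measured demand."""
    flexible = CYCLE - MIN_GREEN * len(vehicles_per_min)   # time left after minimum greens
    total = sum(vehicles_per_min.values()) or 1.0          # avoid dividing by zero at 3 a.m.
    return {approach: MIN_GREEN + flexible * flow / total
            for approach, flow in vehicles_per_min.items()}

# Late evening: almost no traffic on the side street, so it no longer gets equal green time.
print(green_splits({"main_street": 18.0, "side_street": 2.0}))
# -> {'main_street': 47.6..., 'side_street': 12.4...}
```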

Going forward, the village will implement the researchers’ virtually tested traffic-lights pattern in the actual intersection. “And then expand that to the entire village,” says Cassandras. “If you can do it for one traffic light, you can do it for many.”

Moreover, the platform that the team has developed is publicly available open-source software. That means if the project succeeds, then other researchers and municipalities the world over can easily duplicate that success, adapting the same principles to their own local contexts.

“The goal of the project is to develop open-source, smart city infrastructure, not just in Veberöd,” Cassandras explains. “The concept of open source is becoming more widespread because it promotes sharing and building off of each other’s ideas. We’re developing the platform so that once a solution is proven in Veberöd, it can be transferred to New York, Boston, or any other city in the world.”

In a way, the concept of a smart community based on cooperation “is nicely consistent with the convergence theme of the College of Engineering,” says Cassandras. The professor and his students bring systems and electrical and computer engineering expertise to problems that also require the input of computer scientists, traffic experts, urban planners, and even psychologists. “We learn from each other,” Cassandras says.

Whether it’s a busy road or an entire city—or a higher-ed institution moving beyond traditional department boundaries—“We all need to learn that cooperation always benefits us,” Cassandras says. “That realization is not always instantaneous, but in the long term, everyone wins.”

Additional reporting by Margo Stanton

Banner image graphic by Tulika Roy.

This story appears in the fall 2023 ENGineer magazine.