Take One App and Text Me in the Morning

People with Parkinson’s are walking faster, athletes are recovering more quickly from injuries, and people with spinal cord damage may soon have new ways to communicate. Sargent’s professors, researchers, alums, and students are using technology to develop innovative solutions for people with disabilities. Here are three projects that will benefit patients, health providers, and caregivers.

A PT app for Parkinson’s

It started in Denise’s thumb—a slight tremor when she gestured while talking. Just part of getting older, she thought. Or maybe it’s related to that shoulder problem I’m working out in rehab. Her rehabilitation therapist disagreed. “Go see your doctor,” she said.

Stephen’s handwriting tipped him off. It was getting smaller. Then there was that shaking in his left arm. His mother had experienced similar symptoms, and he knew what the doctor would say.

Denise and Stephen (last names withheld for privacy), both in their early 70s, are among an estimated 7 to 10 million people worldwide who have Parkinson’s, an incurable brain disorder that affects the nervous system, causing tremors, slow movement, stiffness, and impaired balance. Terry Ellis (MED’05), an assistant professor of physical therapy and athletic training and the director of Sargent’s Center for Neurorehabilitation, is working to help patients with Parkinson’s like Stephen and Denise manage their disease through exercise.

Ellis’s research has shown that exercise can help patients improve their walking ability, strength, and flexibility, and may even slow the disease’s progression. But patients with Parkinson’s aren’t often referred to a physical therapist until years after their diagnosis, when function has begun to decline, Ellis says. Finding someone well versed in the disorder is difficult, especially in more rural areas, and patients’ engagement in exercise typically declines once therapy is over. Ellis and her colleague Nancy Latham, a research assistant professor in the Health & Disability Research Institute at the School of Public Health, hope that keeping patients in touch with physical therapists through mobile health (mHealth) technology like smartphones and iPads will help.

Terry Ellis (MED’05), director of Sargent’s Center for Neurorehabilitation, is conducting a pilot study featuring application software designed to help patients with Parkinson’s stick to their exercise plans. Photo (left) by Kelly Davidson Savage; Wellpepper screenshot (right) courtesy of Terry Ellis

“Especially with the explosion of aging populations, we’re going to have more and more people with these chronic diseases,” Ellis says. “So how are we going to help them maintain a high-quality life and the highest degree of function, and to be independent and age at home? I think physical therapy has a large role, but we need to think of new models of care.”

In fall 2013, with a $50,000 grant from the American Parkinson Disease Association, Ellis and Latham began recruiting for a pilot study featuring Wellpepper, application software designed to help patients stick to treatment plans. The participants, New England–area patients with Parkinson’s, are randomized into two groups: an mHealth group that uses Wellpepper on an iPad mini provided by Sargent and a control group that follows an exercise routine with the help of traditional paper instructions and demonstration photos. Participants in the mHealth group access personalized exercise videos—Sargent videotapes them performing prescribed exercises when they enter the program—and submit their daily progress and levels of difficulty and pain. They can also chat virtually with a Sargent physical therapist, who receives their Wellpepper data and readings from pedometers linked to the app via Bluetooth wireless technology. Ellis chose the iPad mini based on focus group feedback, but ultimately would like to see the app available on any platform of the patient’s choice.
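For readers curious what the data reaching the therapist might look like, here is a minimal sketch, in Python, of a daily log entry and the kind of adherence summary a clinician could review. The field names, scales, and functions are assumptions for illustration, not Wellpepper’s actual data model.

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean

# Hypothetical data model for illustration only; field names, scales, and
# structure are assumptions, not Wellpepper's actual schema.
@dataclass
class DailyExerciseLog:
    day: date
    exercises_prescribed: int
    exercises_completed: int
    difficulty: int       # patient-reported, e.g., on a 0-10 scale
    pain: int             # patient-reported, e.g., on a 0-10 scale
    pedometer_steps: int  # synced from a Bluetooth-linked pedometer

def adherence_rate(logs: list[DailyExerciseLog]) -> float:
    """Fraction of prescribed exercises the patient actually completed."""
    prescribed = sum(log.exercises_prescribed for log in logs)
    completed = sum(log.exercises_completed for log in logs)
    return completed / prescribed if prescribed else 0.0

def weekly_summary(logs: list[DailyExerciseLog]) -> dict:
    """The kind of snapshot a physical therapist might scan before checking in."""
    return {
        "adherence": round(adherence_rate(logs), 2),
        "avg_pain": round(mean(log.pain for log in logs), 1),
        "avg_steps": round(mean(log.pedometer_steps for log in logs)),
    }
```

In the study itself, DeAngelis reviews this kind of information directly in Wellpepper rather than through custom code.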

To participate in the program, Denise and Stephen traveled on separate occasions to the Center for Neurorehabilitation to meet with Tami DeAngelis (’02), a senior physical therapist, who guided them through several exercises and gave them a pedometer and a daily walking goal. “I hope to get fitter,” says Stephen, who’s just starting out in the paper group, “and I hope it slows down the progression of the disease.” Denise, who has finished her six months in the mHealth group, says she is “willing to try anything, just so I don’t fall through the cracks.”

Patients appreciate the ongoing interaction and accountability mHealth technology offers, Ellis says. “They want the encouragement and some level of oversight—someone saying, ‘Hey, great job! Look how much you accomplished!’” The encouragement motivates Denise, who says mHealth technology keeps her on track with her exercise routine. She likes the personal interaction, as well as working with DeAngelis to switch up her program when she wants more of a challenge. DeAngelis checks Wellpepper regularly and says mHealth technology makes her feel more connected to her patients and better able to support them.

“Especially with the explosion of aging populations, we’re going to have more and more people with these chronic diseases. So how are we going to help them maintain a high-quality life?”

—Terry Ellis

Data collection will wrap up in fall 2014, and preliminary study results are positive. When 18 participants had completed the first three months of the study, those using the iPad had a higher exercise adherence rate (81 percent) than those using paper (57 percent). They spent more time performing moderate-intensity exercise, reported more confidence in their ability to exercise successfully, and rated the program 9 out of 10 for satisfaction.

Sargent is ahead of the curve in experimenting with these technologies, which are examples of telehealth, the delivery of health services through electronic communications such as email, two-way video, and smartphones. As Ellis and Latham point out, telehealth’s possibilities are expanding as technology becomes a more integral and affordable part of people’s lives, and as health care professionals seek ways to counteract higher costs of in-person care and shorter lengths of stay in hospitals or rehabilitation facilities.

The Affordable Care Act, which includes provisions for telehealth, is giving the field a boost, says Karen Jacobs (’79), a clinical professor of occupational therapy. Sargent has already incorporated telehealth into its occupational therapy curriculum, she says, and “is well-positioned to be a global leader in student training and faculty research” in the field. Participants in Sargent’s new Neurological Physical Therapy Residency Program, for example, are involved in observation and research for the Wellpepper project.

Telehealth poses challenging questions for the health industry: How will services be reimbursed? Will current licensing policies change to facilitate care across state and national boundaries? What steps will providers take to ensure patients’ privacy and the security of their information? But Ellis says now is the time for change. “We have to be innovative in coming up with new models of care to try to reach people. I think we can have a bigger impact than people realize.”

Tech care for an NBA team

Damian Lillard needed to get his left ankle into shape. The 2013–2014 National Basketball Association (NBA) season had begun, and the Portland Trail Blazers’ point guard was recovering from an injury. He started practicing one-legged jumps with OptoGait, a technology whose optical sensors, placed on the floor on either side of the user, capture data on gait, power, balance, and symmetry. “I could feel that my right leg was more powerful,” Lillard recalled in a Wall Street Journal article. “The OptoGait let me track that exercise to the point where I was jumping off both legs with equal power.”

OptoGait is new to the Blazers this season, just like the man who brought it to them: Chris Stackpole (’09, ’12), the team’s director of player health and performance. Hailed as a forward-thinking rising star, the 27-year-old was hired in June 2013 to help reverse the Blazers’ abysmal health record. Over the five NBA seasons from 2008 to 2013, the Blazers ranked second-to-worst in the league in games missed due to injury, reported Jeff Stotts at instreetclothes.com, a website that tracks sports injuries.

In less than one year, Stackpole, his colleagues, and the players have achieved an impressive turnaround. In January 2014, Stotts said that the team’s status as the only one in the NBA to use the same starting lineup in every game this season was largely due to the Blazers’ sports medicine professionals; in March, he reported that the Blazers had missed the fourth-fewest games due to injury. Stackpole points to other signs of success: most of the players have had career years, and the team won 21 more games than last year. He adds, “We’ve brought athletes back from injury faster than anticipated.”

Chris Stackpole (’09, ’12) brings the Portland Trail Blazers a technologically oriented model of care that includes using OptoGait to track players’ balance and power. Photos by Jamie Francis/The Oregonian

Stackpole brings the Blazers a model of care that’s holistic, preventive, individualized, and technologically oriented. He considers every aspect of players’ well-being—physical, mental, and emotional—when assessing them and delivering care. While such a model may seem commonsensical, Stackpole, his colleagues, and team members describe it as a contrast to the reactive, group-centered approach some health professionals use.

“We’ve taken the approach of, ‘What are each athlete’s specific needs every single day?’” says Stackpole’s colleague Todd Forcier, the Blazers’ sports performance coach. How tired a player is, how much time he spent on the court the night before, and what’s going on at home all factor into that assessment. Few teams in the league have comparable models; among them are the Chicago Bulls and the Oklahoma City Thunder, where Stackpole interned. His Sargent training in multiple specialties helps guide him through relatively unexplored territory. “Not only being an athletic trainer, and not only being a physical therapist, I’ve learned how to integrate those two skill sets,” he says.

Using technology to assess and rehabilitate players is part of Stackpole’s arsenal. In addition to the OptoGait, the Blazers now use tools including heart rate monitors, accelerometers, GPS tracking devices, and iPads for filling out questionnaires about their health. Stackpole says sports teams in Europe and Australia have used these instruments for years; US basketball is just catching up.

Though “the NBA is becoming more advanced in how it uses technology to track athletes’ performance and recovery,” there’s no standard model in the league for how to use it yet, Stackpole says. “We’re identifying how it fits into our organization and how we can create a competitive advantage.”

“We’re using technology to track recovery and performance, so we can try to identify if and when an athlete starts to break down, or when they’re at their peak.”

—Chris Stackpole

Stackpole uses monitoring equipment with the team on a daily basis to keep “almost a live pulse” on players. “We’re using technology to track recovery and performance, so we can try to identify if and when an athlete starts to break down, or when they’re at their peak.” During practice, Stackpole tracks data from players’ heart rate monitors to determine who needs a break, and how to get them into optimal condition before the next game. He uses the OptoGait to establish each player’s baseline jumping ability and gait symmetry, and to determine who might be at risk for injury. Stackpole then designs programs to decrease that risk. Healthy baselines become a reference point for players throughout the season and a goal to return to after injury. When guard C. J. McCollum broke his foot, for example, Stackpole used a combination of rehab approaches to avoid surgery. When McCollum got back on the court, Stackpole says, “It looked like he had lost no time.”
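As a rough illustration of the kind of left-right comparison such jump testing enables, here is a minimal sketch of a simple limb symmetry index. The thresholds, field names, and flagging rule are assumptions for illustration, not OptoGait’s or the Blazers’ actual calculations.

```python
# Illustrative sketch only: a simple limb symmetry index of the kind a
# performance staff might compute from single-leg jump tests. The threshold
# and names are assumptions, not OptoGait's or the Blazers' actual method.

def symmetry_index(left: float, right: float) -> float:
    """Percent difference between legs (0 = perfectly symmetrical)."""
    stronger, weaker = max(left, right), min(left, right)
    return 100.0 * (stronger - weaker) / stronger

def flag_for_follow_up(left_jump_cm: float, right_jump_cm: float,
                       baseline_index: float, tolerance: float = 10.0) -> bool:
    """Flag a player whose asymmetry has drifted well past his healthy baseline."""
    return symmetry_index(left_jump_cm, right_jump_cm) > baseline_index + tolerance

# Example: single-leg jump heights of 38 cm (left) and 45 cm (right)
# give an index of about 15.6 percent.
print(round(symmetry_index(38, 45), 1))                 # 15.6
print(flag_for_follow_up(38, 45, baseline_index=5.0))   # True
```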

A holistic, technologically oriented model of care only works if players buy into the program, Stackpole says. He recalls how, when he first introduced heart rate monitors, players “would rip them off in the first 15 minutes” of practice. He had to build trust with players and medical performance staff to “establish a new culture of how athletes train and develop” and explain how these methods would help them. Now, says Forcier, players volunteer information on diet, sleep habits, and the benefits they get from prescribed exercise.

The Blazers’ new and constant awareness of their bodies helps each player maintain his health, center-forward Joel Freeland told The Oregonian. “Whereas last year, it was different in that we were treated more as a group than individuals. They thought … everyone is as tired as the other because we are all doing the same thing. But it’s not like that because everyone is different.”

To develop a more detailed analysis of the Blazers, Stackpole would like to use technology to monitor the players during games. He hopes either that the NBA will change its rules to allow such equipment on the court or that technology like SportVU, which uses cameras and software to track players’ speed and other indicators in real time, will improve. In-game data would allow Stackpole to spot gaps between practice and performance and identify a “sweet spot for training mode,” taking the team to even higher levels.

A smile that controls machines

Imagine turning on the lights, adjusting the thermostat, or operating a DVD player simply by smiling. For people who are visually or verbally impaired, or who have limited motor skills, this could be a major advance in communication. Carolyn Michener (’16) is working to make it a reality.

An undergraduate in the speech, language & hearing sciences program, Michener says her lifelong stutter and interest in engineering sparked a passion to develop technology to help others communicate. Working in the STEPP Lab for Sensorimotor Rehabilitation Engineering at Sargent College, she’s collaborating on a project to help people use facial movement and sound to work with human machine interfaces (HMIs)—controls like keypads and touchscreens through which people operate machines, systems, and devices.

“An HMI needs some kind of feedback to properly tell the user what it’s doing,” says Michener, who joined STEPP Lab Director Cara Stepp and Sargent research engineer Sylvain Favrot on the project in 2012. Often this feedback is visual—for example, a control panel flashing a colored light or displaying a message confirming that an action has been completed. “But this can be difficult for people who are visually impaired or who find the visual stimuli distracting,” says Michener. The STEPP Lab project enables people to communicate with machines through sound—no seeing or touching required. Plenty of machines already do this—the iPhone’s Siri, for example, lets users send messages or search for information—but such systems typically require voice commands, which aren’t an option for people with impaired speech. With the new STEPP Lab technology, users can communicate with machines by using facial movements to create sound.

Carolyn Michener (’16) (above) is working with Sensorimotor Rehabilitation Engineering Lab Director Cara Stepp to help people use facial movement and sound to control human machine interfaces. Electrodes placed on either side of the lips enable a computer to translate muscles’ electrical signals, which correspond to auditory feedback. By contracting these muscles, a user can change the location of the sound, communicating with machines. Photos by Michael D. Spencer

To test the technology, Michener trained study participants in what she describes as an auditory matching game, using preexisting STEPP Lab software that Favrot modified for the project. Sitting in a soundproof booth in the lab, Michener demonstrates how the game works.

She opens communication between the player and a computer, connecting them by way of two electrodes placed on either side of the lips. This connection enables the computer to translate the facial muscles’ electrical signals from the skin, a process called surface electromyography. The player undergoes a quick calibration procedure, dons a pair of headphones, receives Michener’s instructions—and is ready to begin.

A tone plays through the headphones for two seconds. This is the sound the player will try to match. Then, a second tone sounds. This is the player’s starting point, a low pitch in both ears that represents the player’s muscles at rest. The player now has 15 seconds to match the first sound’s pitch and location (left ear, right ear, or both) by contracting his or her facial muscles in just the right combination. Contracting left or right—in effect, smirking—creates a medium pitch in the corresponding ear. Contracting both sides—smiling—increases the pitch and activates the sound in both ears. The trial ends either when the player holds the target sound for one second or when the 15 seconds run out. The player then receives a score representing how well he or she matched the target.
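As a rough sketch of that control scheme in Python, the mapping from facial-muscle activity to auditory feedback and the one-second hold rule might look like the following. The pitch values, activation threshold, and function names are assumptions for illustration, not the STEPP Lab’s actual software.

```python
# Rough sketch of the matching-game control logic as described in the
# article. Pitch values, thresholds, and names are illustrative assumptions,
# not the STEPP Lab's actual software.

LOW, MEDIUM, HIGH = 220.0, 330.0, 440.0  # feedback pitches in Hz (assumed)

def feedback(left_emg: float, right_emg: float, threshold: float = 0.2):
    """Map calibrated sEMG activation on each side of the lips to (left_hz, right_hz).

    At rest, a low pitch plays in both ears. Contracting one side (a smirk)
    raises the pitch in the corresponding ear; contracting both sides
    (a smile) raises the pitch further and sounds it in both ears.
    """
    left_on = left_emg > threshold
    right_on = right_emg > threshold
    if left_on and right_on:      # smile: high pitch, both ears
        return HIGH, HIGH
    if left_on:                   # left smirk: medium pitch, left ear
        return MEDIUM, LOW
    if right_on:                  # right smirk: medium pitch, right ear
        return LOW, MEDIUM
    return LOW, LOW               # rest: low pitch, both ears

def run_trial(target, samples, sample_rate_hz=50, hold_s=1.0, limit_s=15.0):
    """Return True if the player holds the target feedback for hold_s seconds
    before limit_s seconds expire. `samples` yields (left_emg, right_emg) readings."""
    held, elapsed, step = 0.0, 0.0, 1.0 / sample_rate_hz
    for left_emg, right_emg in samples:
        if elapsed >= limit_s:
            break
        held = held + step if feedback(left_emg, right_emg) == target else 0.0
        if held >= hold_s:
            return True
        elapsed += step
    return False
```

A target of (MEDIUM, LOW), for instance, would correspond to the left-smirk sound: a medium pitch in the left ear only.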

“A [human machine interface] needs some kind of feedback to properly tell the user what it’s doing. But this can be difficult for people who are visually impaired.”

— Carolyn Michener

While the search for the target sound is an auditory task for the user, the game’s software plots both the target location and the user’s performance on a graph that Michener can review on the computer. In 2013, she tested the game on 16 adults, each of whom completed three test sessions lasting 45 minutes.

After three days, users working with auditory feedback were able to communicate at an average speed of 40 bits per minute (bpm). While this speed is 50 times slower than typing on a keyboard and 15 times slower than the quickest computer mouse use, Stepp says, participants using auditory feedback were able to communicate with machines as effectively as participants using visual feedback in similar studies. “We can conclude that auditory feedback is a viable way to allow people to communicate with this kind of system,” says Michener.

Michener cowrote a paper about the project with Stepp and Favrot that she presented at the Acoustical Society of America’s biannual conference in May 2014. She continues to run trials of the game, this time to find out if players with a musical background perform better than others. Stepp says the team is also embarking on collaborations with Madonna Rehabilitation Hospital in Nebraska and the Perkins School for the Blind in Massachusetts to see how people who are blind and individuals with spinal cord injuries perform in and respond to the game.

“Ultimately I would like to see this technology in a device that can be used inside a patient’s home,” says Michener. Patients trained to associate certain musical notes with particular tasks, for instance, could match those notes using their facial movements to adjust the thermostat, operate an electric bed, turn on the TV, or communicate needs to a caregiver. Ultimately, the ability to easily interact with various machines and devices could help patients in rehabilitation and people with disabilities communicate more effectively and live more independently.