The Brink
Pioneering Research from Boston University

A Smile That Controls Machines

Speech student's research helps people use technology through facial movement

Carolyn Michener (SAR’16) (above) is working with Sensorimotor Rehabilitation Engineering Lab Director Cara Stepp to help people use facial movement and sound to control human-machine interfaces. Photos by Michael D. Spencer

December 3, 2014
By Julie Rattey

Imagine turning on the lights, adjusting the thermostat, or operating a DVD player simply by smiling. For people who are visually or verbally impaired, or who have limited motor skills, this could be a major advance in communication. Carolyn Michener (SAR’16) is working to make it a reality.

An undergraduate in the speech, language, and hearing sciences program, Michener says her lifelong stutter and interest in engineering sparked a passion to develop technology to help others communicate. Working in the STEPP Lab for Sensorimotor Rehabilitation Engineering at Boston University’s Sargent College of Health & Rehabilitation Sciences, she’s collaborating on a project to help people use facial movement and sound to work with human-machine interfaces (HMIs)—controls like keypads and touchscreens through which people operate machines, systems, and devices.

“An HMI needs some kind of feedback to properly tell the user what it’s doing,” says Michener, who joined STEPP Lab Director Cara Stepp and Sargent research engineer Sylvain Favrot on the project in 2012. Often this feedback is visual—for example, a control panel flashing a colored light or displaying a message confirming that an action has been completed. “But this can be difficult for people who are visually impaired or who find the visual stimuli distracting,” says Michener. The STEPP Lab project enables people to communicate with machines through sound—no seeing or touching required. Plenty of machines already do this—such as iPhone’s Siri, which allows users to send messages or search for information—but these systems often require voice commands, which are not an option for people with impaired speech. With the new STEPP Lab technology, users can communicate with machines by using facial movements to create sound.

To test the technology, Michener trained study participants in what she describes as an auditory matching game, using preexisting STEPP Lab software that Favrot modified for the project. Sitting in a soundproof booth in the lab, Michener demonstrates how the game works.

She opens communication between the player and a computer, connecting them by way of two electrodes placed on either side of the lips. This connection enables the computer to translate the facial muscles’ electrical signals from the skin, a process called surface electromyography. The player undergoes a quick calibration procedure, dons a pair of headphones, receives Michener’s instructions—and is ready to begin.
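Surface electromyography of this kind typically turns the raw electrode voltage into a smooth activation level by rectifying the signal and smoothing it, then scaling it between the calibrated rest and maximum-effort levels. A minimal sketch of that standard pipeline follows; it is an illustration only, not the STEPP Lab's actual software, and every name and parameter in it is hypothetical:

```python
import numpy as np

def emg_envelope(raw, fs=1000.0, window_s=0.1):
    """Estimate muscle activation from a raw surface EMG trace.

    Standard approach: remove the DC offset, full-wave rectify,
    then smooth with a moving-average window to get an envelope.
    """
    centered = raw - np.mean(raw)        # remove DC offset
    rectified = np.abs(centered)         # full-wave rectification
    n = max(1, int(fs * window_s))       # samples per smoothing window
    kernel = np.ones(n) / n
    return np.convolve(rectified, kernel, mode="same")

def calibrate(envelope, rest_level, max_effort_level):
    """Scale an envelope to 0..1 between calibrated rest and max-effort levels,
    mirroring the quick calibration step each player completes."""
    span = max_effort_level - rest_level
    return np.clip((envelope - rest_level) / span, 0.0, 1.0)
```

The calibration step matters because resting muscle tone and electrode contact differ from person to person; scaling to each user's own range keeps the control signal comparable across players.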

A tone plays through the headphones for two seconds. This is the sound the player will try to match. Then, a second tone sounds. This is the player’s starting point, a low pitch in both ears that represents the player’s muscles at rest. The player now has 15 seconds to match the first sound’s pitch and location (left ear, right ear, or both) by contracting his or her facial muscles in just the right combination. Contracting left or right—in effect, smirking—creates a medium pitch in the corresponding ear. Contracting both sides—smiling—increases the pitch and activates the sound in both ears. The trial ends when either the player hits the target for one second or 15 seconds have expired. The player then receives a score representing how well he or she matched the target.
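The mapping Michener describes (rest gives a low pitch in both ears, a one-sided contraction a medium pitch in that ear, a two-sided smile a higher pitch in both) amounts to a small lookup, with the one-second hold and 15-second limit forming the trial loop. The thresholds, frequencies, and sampling rate below are invented for illustration; the article does not give the lab's actual values:

```python
# Hypothetical values: tone frequencies and the activation level
# that counts as a deliberate contraction.
REST_HZ, MEDIUM_HZ, HIGH_HZ = 200.0, 400.0, 800.0
THRESHOLD = 0.3

def activation_to_tone(left, right, threshold=THRESHOLD):
    """Map calibrated left/right cheek activation (0..1) to (freq_hz, ears)."""
    l_on, r_on = left >= threshold, right >= threshold
    if l_on and r_on:                       # smile: both sides contracted
        return HIGH_HZ, ("left", "right")
    if l_on:                                # left smirk
        return MEDIUM_HZ, ("left",)
    if r_on:                                # right smirk
        return MEDIUM_HZ, ("right",)
    return REST_HZ, ("left", "right")       # rest: low pitch in both ears

def trial_result(samples, target, hold_needed=20, fs_hz=20, limit_s=15):
    """One matching trial: samples is a sequence of (left, right) activations
    read at fs_hz. Success means the target (freq, ears) is held for
    hold_needed consecutive samples (1 s at 20 Hz) within limit_s."""
    held = 0
    for left, right in samples[: int(limit_s * fs_hz)]:
        held = held + 1 if activation_to_tone(left, right) == target else 0
        if held >= hold_needed:
            return True
    return False
```

A player who smiles steadily, for example, produces a run of high-pitch, both-ears samples and passes a both-ears high-pitch trial; a player who stays at rest never matches that target and times out.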


While the search for the target sound is an auditory task for the user, the game’s software visually records both the target location and the user’s performance on a graph Michener can review on the computer. In 2013, she tested the game on 16 adults, each of whom completed three test sessions lasting 45 minutes.

After three days, users working with auditory feedback were able to communicate at an average speed of 40 bits per minute. While that is some 50 times slower than typing on a keyboard and 15 times slower than the fastest mouse-based input, Stepp says, participants using auditory feedback were able to communicate with machines as effectively as participants using visual feedback in similar studies. “We can conclude that auditory feedback is a viable way to allow people to communicate with this kind of system,” says Michener.
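Figures like 40 bits per minute are the kind of information transfer rate that human- and brain-computer interface studies commonly report using the Wolpaw formula, which combines the number of selectable targets, the selection accuracy, and the selection rate. The article does not say which formula or parameters the study actually used, so the sketch below is only the standard calculation with made-up numbers:

```python
import math

def wolpaw_itr(n_targets, accuracy, selections_per_min):
    """Information transfer rate in bits/minute (Wolpaw formula):
    bits per selection, scaled by how many selections are made per minute."""
    p, n = accuracy, n_targets
    bits = math.log2(n)                     # bits/selection at perfect accuracy
    if 0 < p < 1:                           # penalty term for errors
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * selections_per_min

# Illustration: 4 distinguishable targets selected perfectly once every
# 3 seconds gives log2(4) = 2 bits/selection * 20 selections/min
# = 40 bits/min.
```

Any drop in accuracy cuts the rate below that ceiling, which is why training sessions like Michener's three-day protocol matter for usable speeds.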

Michener cowrote a paper about the project with Stepp and Favrot that she presented at the Acoustical Society of America’s biannual conference in May 2014. She continues to run trials of the game, this time to find out if players with a musical background perform better than others. Stepp says the team is also embarking on collaborations with Madonna Rehabilitation Hospital in Nebraska and the Perkins School for the Blind in Massachusetts to see how people who are blind and individuals with spinal cord injuries perform in and respond to the game.

“Ultimately I would like to see this technology in a device that can be used inside a patient’s home,” says Michener. Patients trained to associate certain musical notes with particular tasks, for instance, could match those notes using their facial movements to adjust the thermostat, operate an electric bed, turn on the TV, or communicate needs to a caregiver. Ultimately, the ability to easily interact with various machines and devices could help patients in rehabilitation and people with disabilities communicate more effectively and live more independently.
