Sudha Arunachalam was flying to Berlin for a conference in 2014 when the woman across the aisle began entertaining her two young children. “I spy, with my little eye, something…” she’d begin. Then she’d add a clue about the object she’d picked—a color or other adjective—and the children would look around and start guessing. Arunachalam, an assistant professor of speech, language & hearing sciences, might have forgotten the scene had she not been puzzling over the best way to determine how children process what their parents say. She decided to use a game-like format for her study; the subsequent findings have big implications for speech-language clinicians.
Arunachalam discovered that parents seem to have an innate sense for constructing sentences to help their children understand and learn language. The study revealed subtle speech nuances—such as where to place an adjective in a phrase—that help children learn but hadn’t been previously measured.
Arunachalam, who is also the director of BU’s Child Language Lab, believes her study is the first to measure children’s comprehension while their parents speak, to see what parents are doing naturally (without researchers’ input) to make language easier for their children to understand.
In the study, a parent and their typically developing child, aged roughly three to five, looked at images of six objects, three per row, on a computer screen. The researcher privately communicated to the parent which object to help the child find, as quickly as possible. The parent then instructed the child however he or she chose—for example, “Find the striped umbrella” to help the child distinguish that umbrella from one with dots. Eye-tracking technology revealed how quickly the child located the object.
Arunachalam was surprised that parents instinctively constructed sentences to help their children understand their meaning. When helping a child locate the striped umbrella on a slide that also contained the dotted one, parents usually placed the relevant modifier after the noun (“umbrella with stripes”) instead of before it (“striped umbrella”). This happened nearly 80 percent of the time, compared to about half the time when the slide showed just one umbrella. When children heard “umbrella with stripes,” they looked at the correct umbrella significantly faster than when they heard “striped umbrella,” says Arunachalam, whose findings were published in the Journal of Memory and Language in 2016.
This parental instinct seemed almost magical to Arunachalam. She theorizes that parents unconsciously recognize that young children will more easily understand “umbrella” than “striped,” or that they will understand a noun better than an adjective, and put the simplest word first. “We think this is really useful, long-term, for supporting children who are struggling with language, such as children with autism spectrum disorder (ASD) who don’t necessarily give their parents great feedback about what their language level is and how much they understand,” she says.
Parents also seem to know how to adapt to their child’s needs, according to a related Sargent study by postdoctoral researcher Angela Xiaoxue He, which is funded through Arunachalam’s $150,000 Charles H. Hood Foundation Child Health Research Award. Preliminary data from this study of children ages three and a half to seven who have ASD suggest their parents “tailor their language to the child’s language level rather than the child’s chronological age,” says Arunachalam, whose study with typically developing toddlers was also funded by a grant from the National Institutes of Health.
These studies may give parents and clinicians more detailed information about how to help children understand and learn language, says Arunachalam. This is especially important for people working with children who are at risk for language disorders or who have ASD. Children with ASD, says He, face “a major challenge” with social-communicative behaviors like joint attention—paying attention to something at the same time as someone else—that are helpful for learning language. Once the research is complete and the findings are confirmed, clinicians may be able to give parents concrete advice, such as “talk slower, break up complex ideas into multiple short sentences, repeat the same word,” says Arunachalam.
The next stage of Arunachalam’s research will focus on children who have ASD. She says she’s seen parents of typically developing children and parents of children with ASD go to great lengths to describe the umbrella when their child doesn’t appear to acknowledge it—yet the eye tracker shows the child is looking right at it.
Some parents find it difficult to tell if their child is having trouble learning or just being a kid. “When you say to a two-year-old, ‘Go put your shoes on,’ and they do nothing, is it because they don’t know what shoes are, or they don’t feel like it, or they’re distracted by something else?” says Arunachalam. She hopes her work will reveal that all children “understand a lot more than they often show.”
Teaching Late Talkers
Understanding how children learn the meaning of words—especially verbs—is critical to helping those who struggle with language. “Verbs are particularly important for language development,” says Sabrina Horvath, a speech, language & hearing sciences doctoral student, “because they bootstrap children into more advanced language.” Horvath (’19) is studying late talkers, whom she describes as otherwise typically developing two-year-olds “who have small vocabularies and maybe aren’t combining words” at a level appropriate for their age. She hopes her study, which uses videos of objects in motion to test late talkers’ grasp of verbs, will reveal what verbal cues to use when introducing new words, and will help diagnose children with language impairments at an earlier age.
“This particular population is so important, because one-quarter of late talkers will have a formal diagnosis of a language disorder by the age of five,” Horvath says. Even those who don’t develop a language disorder have poorer academic outcomes than their peers or demonstrate atypical neural activation during language tasks, she says, and they’re “at greater risk for socioemotional disorders in adulthood.”
Horvath’s research is funded by the $2,000 American Speech-Language-Hearing Foundation Student Research Grant in Early Childhood Language Development—which is supported by the Noel and Arlene Matkin Memorial Fund—and by Sudha Arunachalam’s $25,000 American Speech-Language-Hearing Foundation New Century Scholars Research Grant.
Read more stories from Inside Sargent.