As dancers, this couple is no Fred Astaire and Ginger Rogers. The leader’s moves are clunky, his partner’s so tentative that she’s constantly behind a beat. But be kind: they’re beginners at salsa, and they’re bedeviled by something Fred and Ginger never faced.
H. Kayhan Ozcimder (ENG’11,’15), a dancer himself, has had the inelegant experience of dancing with one of these machines, which resemble a vacuum cleaner minus the hose. Ozcimder dreams of a more agile automaton someday, but for now he’s pleased to have helped program these salsa-bots, proving that “it’s possible to do an art form in a robotic platform.”
Ozcimder is a graduate student in John Baillieul’s Intelligent Mechatronics Lab, whose mission, says the College of Engineering mechanical engineering professor, is to give machines the ability to respond to their environment. The researchers began by mapping the coordinates of actual salsa dancers and programming the robots with four basic beginner moves (drawing on his dancer’s knowledge, Ozcimder suggested salsa as a simple starting point for the mechanized dance amateurs). The robots, which are outfitted with motion sensors, read each other’s moves and respond according to their programming.
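The article doesn’t publish the lab’s code, but the lead-and-follow loop it describes can be sketched in a few lines. Everything below — the move names, the displacement coordinates, and the distance-based matching — is an illustrative assumption of mine, not the lab’s actual implementation:

```python
import math

# Four illustrative beginner salsa moves, each stored as the (x, y)
# displacements the leader's platform traces over four counts.
# These coordinates are invented for illustration only.
BASIC_MOVES = {
    "basic_step": [(0, 1), (0, 0), (0, -1), (0, 0)],
    "side_step":  [(1, 0), (0, 0), (-1, 0), (0, 0)],
    "back_break": [(0, -1), (0, 0), (0, 1), (0, 0)],
    "right_turn": [(1, 1), (1, -1), (-1, -1), (-1, 1)],
}

def classify_move(observed):
    """Match an observed displacement sequence against the move library
    by summed Euclidean distance; return the closest move's name."""
    def distance(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b))
    return min(BASIC_MOVES, key=lambda name: distance(BASIC_MOVES[name], observed))

# The follower reads the leader's (noisy) motion, recognizes the move,
# and responds with its mirror image -- the follower's half of the step.
noisy = [(0.9, 0.1), (0.0, 0.1), (-1.1, 0.0), (0.1, -0.1)]
move = classify_move(noisy)                          # recognizes "side_step"
response = [(-x, -y) for x, y in BASIC_MOVES[move]]  # mirrored for the follower
```

The design choice worth noting: a fixed library of moves plus nearest-match classification is exactly the kind of rigid “instinct” Baillieul describes later — the robot can only respond with a move it was explicitly given.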
Ozcimder thinks motion-reading robots might someday serve as useful tools for judging dance competitions (possibly bouncing Kirstie Alley even sooner from Dancing with the Stars), but Baillieul is hunting bigger game. He’s not out to help “some high school guy who had trouble getting a date, so you get a robot. The ultimate goal is to understand human reaction to gestures and how machines may react to gestures.” That could enable robots to team with, and perhaps take over from, humans in hazardous jobs, from treacherous rescues to repairs in lethal environments (think of the workers who plunged into the stricken nuclear plant after the 2011 Japanese tsunami).
The intelligent mechatronics lab is littered with projects ranging from dancing robots to flight vehicles. The work builds on an established fact of 21st-century life: computers are doing ever more of the work once handled by purely mechanical systems. “Everyday objects like automobiles have gone from almost entirely mechanically engineered things to being machines that are basically controlled at every level by computers,” notes Baillieul. “A typical automobile now has 100 or more microprocessors in it.”
The challenge is to build machines that can perform tasks with some autonomy and respond in fluid situations they might not have been precisely programmed for, an area where humans still have it all over machines. Whereas human reaction is the child of several parents—instinct, surely, but also the ability to learn from experience and sometimes override instinct—robots are not yet agile enough to ignore their “instinct” (programming). The solution, says Baillieul, is to give the machines sufficiently “massive experiential data sets” that they can react to numerous situations.
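Baillieul’s “experiential data sets” idea can be caricatured in a few lines: instead of following one hard-coded rule, the robot recalls the recorded situation most similar to the one it faces and reuses the reaction that worked then. The situations, features, and reactions below are made up for illustration; the lab’s actual data and methods are not described in that detail here:

```python
# Toy "experiential data set": (situation features, reaction that worked).
# Features here are (obstacle_distance_m, obstacle_speed_mps) -- invented.
experience = [
    ((5.0, 0.0), "proceed"),
    ((1.0, 0.0), "slow_down"),
    ((1.0, 2.0), "stop"),
    ((0.3, 0.0), "back_up"),
]

def react(situation):
    """Nearest-neighbor recall: respond as in the most similar recorded
    situation, rather than applying a single fixed rule."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, reaction = min(experience, key=lambda e: dist(e[0], situation))
    return reaction

react((0.9, 1.8))  # nearest recorded case is (1.0, 2.0), so reply "stop"
```

The larger the experience table, the more novel situations fall close to something already seen — which is the point of making the data sets “massive.”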
One avenue the lab is exploring is humans’ use of nonverbal cues to communicate. Good dancers move seamlessly together, responding to each other’s touch and motions; amateurs who can’t yet read each other’s cues often come off looking stilted. Nonverbal cues can also be used to send misinformation; bats, for example, camouflage their motions so that they can sneak up on insect prey, a fake-out familiar to anyone who’s tried to swat a pesky fly. Hence the lab’s work on getting robots to use sensors to read each other’s metal-body language, aimed at “how you might program flying vehicles or mobile robots to do the right thing, in terms of communicating or not communicating through their motions,” Baillieul says.
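One way to think about the bat’s fake-out in code: an observer that predicts a partner’s next position by extrapolating its recent velocity can flag motion that keeps defeating the prediction as potentially deceptive. This is a toy heuristic of my own for illustration, not the lab’s method:

```python
def prediction_errors(trajectory):
    """For each step, predict the next point by extrapolating the last
    velocity, and record how far the actual point lands from the guess."""
    errors = []
    for i in range(2, len(trajectory)):
        (x0, y0), (x1, y1) = trajectory[i - 2], trajectory[i - 1]
        pred = (2 * x1 - x0, 2 * y1 - y0)        # constant-velocity guess
        actual = trajectory[i]
        errors.append(((actual[0] - pred[0]) ** 2 +
                       (actual[1] - pred[1]) ** 2) ** 0.5)
    return errors

def looks_deceptive(trajectory, threshold=0.5):
    """Motion that repeatedly breaks the observer's extrapolation is
    flagged as a possible fake-out."""
    errs = prediction_errors(trajectory)
    return sum(e > threshold for e in errs) > len(errs) / 2

straight = [(i, 0) for i in range(6)]            # easy to extrapolate: not flagged
zigzag   = [(i, (-1) ** i) for i in range(6)]    # keeps breaking the guess: flagged
```

The same test cuts both ways: a robot trying to communicate through its motion wants low prediction error for its partner, while one camouflaging its approach, like the bat, wants high error for its prey.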
Dance companies like Ozcimder’s can rest easy; even he doesn’t foresee automating human dancers out of a job. Robots may be geniuses at detecting footwork, body angles, and other technical metrics that go into a performance, but they can’t judge the intangible artistic panache that might please an audience, like dancers’ facial expressions.
Ozcimder has bad news for our mechanized friends: intangibles make up half the judging criteria at a typical salsa competition.