A social machine for autistics

MIT researchers are testing a wearable computer that could help autistic people communicate.

J. M. Berger

We expect computers to be adept at adding numbers. But how about interpreting facial expressions?

Rosalind Picard and her colleagues in MIT’s Media Lab are developing a wearable computer that can read faces in order to help autistics—who often have trouble interpreting communication cues involving facial expression, vocal tone, and body language—communicate better. Picard and postdoctoral researcher Rana el Kaliouby recently received a $700,000 grant from the National Science Foundation to test components of the wearable computer system.

Dubbed Head Cam, the device consists of a pen-sized camera, a small computer, and a wireless earbud that together provide the user with on-the-fly interpretations of facial expressions.

Rosalind Picard, director of MIT’s Affective Computing Research Group, wears a prototype of the lab’s facial analysis system.

The camera is small enough to be worn discreetly on a hat or other headwear and points toward the face of the user’s conversation partner. The software analyzes that person’s facial expressions and communicates its interpretation by piping words into the user’s ear, such as “agreeing” or “interested.”

The analysis works on a principle similar to computer-aided voice recognition, Picard says. Certain combinations of facial action—such as arching an eyebrow or smiling—can be interpreted to convey meaning in the same way that combinations of sounds can be assembled into words.
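To make the analogy concrete, here is a toy sketch of the idea: individual facial actions (such as the “action units” of the Facial Action Coding System) are combined into a higher-level label, much as phonemes are assembled into words. The rules, labels, and action-unit names below are illustrative assumptions, not the MIT system’s actual model, which is trained statistically on labeled video.

```python
# Toy rule-based mapping from detected facial actions to spoken labels
# like those Head Cam pipes to the earbud. Purely illustrative: the real
# system uses statistical models, not hand-written rules like these.

ACTION_RULES = {
    frozenset({"lip_corner_pull", "cheek_raise"}): "interested",
    frozenset({"inner_brow_raise", "outer_brow_raise"}): "surprised",
    frozenset({"brow_lower"}): "confused",
    frozenset({"head_nod"}): "agreeing",
}

def interpret(detected_actions):
    """Return the first label whose required actions are all present."""
    detected = set(detected_actions)
    for required, label in ACTION_RULES.items():
        if required <= detected:  # all required actions were detected
            return label
    return "neutral"

print(interpret(["lip_corner_pull", "cheek_raise"]))  # interested
print(interpret(["head_nod"]))                        # agreeing
```

Just as a single phoneme is ambiguous on its own, a single facial action (an arched eyebrow, say) rarely carries meaning by itself; it is the combination that the software interprets.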

“We know that these cues significantly enhance people’s ability to communicate when they read them naturally,” says Picard. “And we know that a lot of misunderstandings happen when these cues are missing or sent wrongly.”

Put to the test

Working from hundreds of hours of video, Picard’s team paired images of facial expression with the emotional state they conveyed. Ten people reviewed each facial expression, allowing the researchers to identify the expressions most strongly correlated with a particular emotion. That information forms the basis for their software.
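A minimal sketch of how ten reviewers’ judgments of a single expression might be reduced to one label plus an agreement score; the function name and the 0.7 agreement threshold are assumptions for illustration, not the researchers’ actual criteria.

```python
from collections import Counter

def aggregate(ratings, min_agreement=0.7):
    """Majority label across reviewers, kept only if agreement is strong.

    Returns (label, agreement); label is None below the threshold.
    """
    counts = Counter(ratings)
    label, n = counts.most_common(1)[0]   # most frequent judgment
    agreement = n / len(ratings)          # fraction of reviewers agreeing
    return (label if agreement >= min_agreement else None, agreement)

# Ten reviewers rate the same facial-expression clip:
ratings = ["agreeing"] * 8 + ["interested"] * 2
print(aggregate(ratings))  # ('agreeing', 0.8)
```

Expressions that clear the threshold would be the “most strongly correlated” examples the article describes; ambiguous clips would be discarded.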

The MIT team is selecting about 20 people with autism and 10 without for a controlled trial to fine-tune various aspects of the technology, to test its user-friendliness, and to see if such camera feedback techniques can improve the ability of autistics to read facial expressions.

Conversation piece

Gregory Abowd, a computer science professor at Georgia Tech who has developed technology for assessing autism in children, says a successful device that provides real-time aid to communication would be unique and could help “high-functioning” autistics.

Many high-functioning autistics do not appear disabled to casual observers, he says. They are often viewed as antisocial or unduly eccentric because they are unable to manage the social aspects of work and life, despite being capable in other ways, says Abowd, who has two sons with autism. In such cases, a device like Head Cam could mean the difference between holding down a job and ending up homeless, he adds.

The device, however, still has many hurdles to clear. “It’s not clear whether this will work or be effective enough, and that’s exactly why it’s the kind of work the National Science Foundation and the NIH are supposed to fund,” he says.
