Saturday, April 15, 2006

Face Reader Bridges Autism Gap

Some interesting news about a technology that could help those with high-functioning autism. :)

By Eric Smalley

CAMBRIDGE, Massachusetts -- You are a mind reader, whether you know it or not. You can tell just by looking at a human face whether the person is concentrating, confused, interested or in agreement with you.

But many people with autism lack this ability to read others' emotional states -- it's one of the signature characteristics of the disorder. Help could be on the way, though: A novel computer-vision system developed at the Massachusetts Institute of Technology could do the mind reading for those who can't.

Two MIT researchers wore tiny cameras mounted on wire rods extending from their chests to demonstrate the Emotional Social Intelligence Prosthetic, or ESP, at the Body Sensor Networks 2006 international workshop at MIT's Media Lab last week. The video cameras captured facial expressions and head movements, then fed the information to a desktop computer that analyzed the data and gave real-time estimates of the individuals' mental states, in the form of color-coded graphs.

The system's software goes beyond tracking simple emotions like sadness and anger to estimate complex mental states like agreement, disagreement, thinking, confusion, concentration and interest. The goal is to put this mental-state inference engine on a wearable platform and use it to augment or enhance social interactions, said Rana el Kaliouby, a postdoctoral researcher at the Media Lab.
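
To make the described pipeline concrete, here is a rough, hypothetical sketch in Python with OpenCV of that kind of analysis loop: grab camera frames, find a face, and keep a smoothed running estimate over a handful of mental states. The state list, the smoothing window and the infer_states placeholder are illustrative assumptions; MIT's actual inference engine is not public and is not reproduced here.

```python
# Rough sketch of an ESP-style analysis loop: capture video, find a face,
# and maintain smoothed estimates of several mental states.
from collections import deque

import cv2          # pip install opencv-python
import numpy as np

STATES = ["agreeing", "disagreeing", "thinking",
          "confused", "concentrating", "interested"]

def infer_states(face_img):
    """Placeholder for the mental-state inference engine (hypothetical).

    A real system would extract facial-action and head-pose features and feed
    them to a trained model; here we simply return a uniform guess so the
    sketch runs end to end.
    """
    return np.full(len(STATES), 1.0 / len(STATES))

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
history = deque(maxlen=30)          # roughly one second of frames for smoothing
cap = cv2.VideoCapture(0)           # wearable or desktop camera

while True:                         # stop with Ctrl+C or when the camera fails
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        history.append(infer_states(gray[y:y + h, x:x + w]))
        smoothed = np.mean(history, axis=0)     # running estimate per state
        print(dict(zip(STATES, np.round(smoothed, 2))))
```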

"This is only possible now because of the progress made in affective computing, real-time machine perception and wearable technologies," she said.

The researchers are developing an outward-facing version of the ESP system with a cap-mounted camera connected to a wearable computer. People with autism spectrum disorders have a hard time determining others' emotions or even whether someone is paying attention to them. The system is designed to provide that missing information. Feedback could be visual or auditory messages describing the target person's mental state. It could also be tactile, like a vibration that cues the user to ask a question or move on to a new topic of conversation, said el Kaliouby.
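
As a small illustration of that feedback step, a hypothetical dispatcher might look like the following. The state names, confidence threshold and cue channels are assumptions for the sake of the example, not details of the MIT system.

```python
# Hypothetical sketch: map an inferred mental state to a cue for the wearer.
def choose_feedback(state, confidence):
    if confidence < 0.6:
        return None                      # too uncertain to interrupt the wearer
    if state in ("confused", "disagreeing"):
        return "vibrate: pause and invite a question"
    if state == "interested":
        return "tone: stay on this topic"
    return "display: listener appears to be " + state

print(choose_feedback("confused", 0.8))   # vibrate: pause and invite a question
print(choose_feedback("thinking", 0.4))   # None
```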

Software featuring video clips or animated talking heads has been used for years to help people with autism learn to read faces. The MIT researchers want to go a step further to help people with autism learn about emotions and facial expressions in the context of their daily lives, using faces that are meaningful to them, said el Kaliouby.

The researchers are working with the Groden Center, a nonprofit educational and treatment center in Providence, Rhode Island, to organize a six-month test of the system with a group of adolescent boys with Asperger syndrome, which is similar to autism but milder.

In addition to the psychosocial prosthetic possibilities, the ESP system could help autism researchers collect data in the real world and quantify aspects of social behavior, such as how long a person with autism looks at other people's faces, said Matthew Goodwin, research coordinator at the Groden Center.
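
A metric like that could be computed very simply from per-frame face detections. The sketch below assumes a fixed frame rate and is only meant to illustrate the idea, not how the Groden Center would actually measure it.

```python
# Hypothetical sketch: total time at least one face was in the wearer's camera
# view, computed from per-frame detections at an assumed fixed frame rate.
def face_looking_time(face_detected_per_frame, fps=30.0):
    """Return seconds during which a face was detected."""
    return sum(1 for seen in face_detected_per_frame if seen) / fps

# Example: a face appears in 90 of 300 frames (10 s of video at 30 fps) -> 3.0 s.
frames = [True] * 90 + [False] * 210
print(face_looking_time(frames))   # 3.0
```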

Though recent fears of an autism epidemic appear to be overblown, researchers generally hold that the disorder is becoming more prevalent, said Goodwin. The number of people with autism is difficult to pin down, but one in 500 children is a reasonable estimate, he said.

The ESP system also has potential as a personal relationship management tool, said el Kaliouby. "Suddenly you are aware of what faces you make during a conversation with your partner," she said. "Do you do enough eye contact? Are you always frowning or disagreeing?"
