

Computers That Read Your Emotions


By the time I was 5 feet away, the computer at MIT's Media Lab knew I was intrigued. I had approached it with eyes widened, eyebrows raised, head tilted: the physiognomy of engagement. Now, with every mug I made, a monitor labeled me confused, impressed, absorbed.

The demonstration showed off the latest in affective computing, a research field that aims to teach computers to understand and adapt to human emotion. An affective system analyzes information from cameras and body sensors and compares the data to a model of different emotional states. An affective GPS navigation device, for example, might respond with a soothing voice when it detects that the driver is stressed. Movie studios could use affective webcams to tell when a test audience starts to tune out during a trailer.
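
To make that concrete, here is a minimal sketch in Python of the comparison step, assuming the system has already reduced each video frame to a handful of facial measurements. Every feature name and prototype number below is invented for illustration; real affective models are statistical and trained on far richer data.

    import math

    # Hypothetical prototype model: each emotional state is a vector of
    # normalized facial measurements, in the order
    # (eyebrow_raise, eye_widening, head_tilt, mouth_corner_pull).
    EMOTION_PROTOTYPES = {
        "intrigued": (0.8, 0.7, 0.6, 0.2),
        "confused":  (0.6, 0.3, 0.7, 0.0),
        "impressed": (0.9, 0.8, 0.1, 0.6),
        "absorbed":  (0.2, 0.5, 0.1, 0.1),
    }

    def classify(features):
        """Label a frame's feature vector with the nearest prototype state."""
        def distance(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(EMOTION_PROTOTYPES,
                   key=lambda state: distance(features, EMOTION_PROTOTYPES[state]))

    # Eyebrows raised, eyes widened, head tilted: a reporter walking up.
    print(classify((0.8, 0.75, 0.55, 0.25)))  # -> intrigued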

The foundations of affective computing were laid in the 1990s by MIT professor Rosalind W. Picard, who oversees the lab with the face-tracking demo. She and research scientist Rana el Kaliouby founded a spinoff called Affectiva last year to market affective technology that the two initially developed to help autistic children understand and communicate emotion.

Commercial trials are under way on Affectiva's skin conductance sensors, wristwatch-size devices that detect emotional engagement by measuring tiny changes in sweat-gland activity. The firm plans to make its facial and skin analyses available as a cloud-based service, allowing for massive opt-in studies of human response to videos and websites. Feeling it?
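
If you're wondering what "detecting emotional engagement" from such a sensor might look like in code, the toy Python function below flags moments when skin conductance jumps above its recent baseline, the kind of change a sweat-gland response produces. The window and threshold values are assumptions for illustration, not Affectiva's actual method.

    # Flag moments of possible engagement in a stream of skin-conductance
    # samples (in microsiemens) by comparing each sample to a short
    # moving-average baseline.
    def engagement_spikes(samples, window=5, threshold=0.05):
        """Return indices where conductance rises more than `threshold`
        microsiemens above the average of the previous `window` samples."""
        spikes = []
        for i in range(window, len(samples)):
            baseline = sum(samples[i - window:i]) / window
            if samples[i] - baseline > threshold:
                spikes.append(i)
        return spikes

    # Simulated readings: a flat baseline, then a sweat-gland response.
    readings = [2.00, 2.01, 2.00, 2.02, 2.01, 2.01, 2.10, 2.18, 2.15, 2.05]
    print(engagement_spikes(readings))  # -> [6, 7, 8]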
