Smile, you're on candid computer

An elderly man squints at an automated teller machine (ATM) screen and the font size doubles almost instantly. A woman at a shopping center kiosk smiles at a travel ad, prompting the device to print out a travel discount coupon. Several users at another kiosk frown at a racy ad, leading the store to pull it.

Machine response to facial expressions that indicate emotions will be a commercial reality in three years, says Dave Schrader, director of marketing for e-business at Teradata, a division of NCR Corp. in Dayton, Ohio. NCR, which handles 20 billion self-service transactions annually, says users' emotions can provide a rich source of marketing intelligence for banks and retailers.

NCR is working with the Integrated Media Systems Center at the University of Southern California (USC) in Los Angeles on a project called E-Motions. The idea is to capture an image of a user's facial features and movements -- especially around the eyes and mouth -- and discover the underlying emotions by comparing the image against facial feature templates in a database.
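To make the comparison step concrete, here is a minimal sketch in Python of matching an extracted feature vector against stored emotion templates. The feature values, emotion labels and nearest-template distance measure are all illustrative assumptions; the article does not describe E-Motions' actual representation.

```python
import numpy as np

# Hypothetical emotion templates: each maps an emotion label to a feature
# vector (e.g., normalized distances among eye and mouth landmarks).
# A real system would derive these from a labeled face database.
TEMPLATES = {
    "joy":      np.array([0.42, 0.88, 0.15]),
    "sadness":  np.array([0.55, 0.31, 0.72]),
    "surprise": np.array([0.20, 0.95, 0.40]),
}

def classify_emotion(features: np.ndarray) -> str:
    """Return the label of the template closest to the observed features."""
    return min(TEMPLATES, key=lambda label: np.linalg.norm(features - TEMPLATES[label]))

# A user's image would first be reduced to the same feature-vector form.
observed = np.array([0.40, 0.85, 0.20])
print(classify_emotion(observed))  # -> "joy"
```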

Customer relationship management (CRM) systems typically collect customer data at the front end -- at a point-of-sale terminal or ATM, for example -- and pass that information to a data warehouse where it's later analyzed for marketing opportunities.

"But the new news in CRM is the idea that the gathering of all those clues about customers should be linked to an interactive system so you can go back and forth," Schrader says. "The more times you purchase, the more we learn about you and the more focused the marketing offers can be to you."

NCR and USC are experimenting with two kinds of facial feature extraction. One defines standard parts of the face and measures the distances between them, as well as how those distances change over fractions of a second. The other does something similar with small regions of the face.
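A rough sketch of the first approach, under the assumption that each video frame is reduced to a handful of named landmarks: the landmark names, coordinates and pairings below are invented for illustration, not taken from the NCR/USC system.

```python
import numpy as np

# Toy landmark positions (x, y) for two frames a fraction of a second apart.
FRAME_T0 = {"left_eye": (30, 40), "right_eye": (70, 40),
            "mouth_left": (38, 80), "mouth_right": (62, 80)}
FRAME_T1 = {"left_eye": (30, 40), "right_eye": (70, 40),
            "mouth_left": (34, 78), "mouth_right": (66, 78)}

# Landmark pairs whose separation is measured in each frame.
PAIRS = [("left_eye", "mouth_left"),
         ("right_eye", "mouth_right"),
         ("mouth_left", "mouth_right")]

def distances(frame):
    """Euclidean distance between each landmark pair in one frame."""
    return np.array([np.hypot(frame[a][0] - frame[b][0],
                              frame[a][1] - frame[b][1]) for a, b in PAIRS])

d0, d1 = distances(FRAME_T0), distances(FRAME_T1)
# The per-pair change over the interval is the motion cue a classifier
# would use -- here, mouth corners spreading apart, as in a smile.
print(d1 - d0)
```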

But current technology sometimes produces ambiguous results. For example, the face of a person who is clearly happy can also produce strong indicators for sadness and disgust.

Researchers also must build good test samples of known emotional states. "There are no databases of happy people or sad people," Schrader says. "Also, in testing, it's hard to come up with a genuinely sad face."

And cultural problems can intrude. For example, Schrader says, in Japan, smiles often mask embarrassment.

Current technology is pretty good at recognizing six basic emotions -- fear, anger, joy, surprise, disgust and sadness -- says Jeffrey Cohn, a psychology professor at the University of Pittsburgh who is also doing research at Carnegie Mellon University in Pittsburgh. But there are thousands of combinations and variations, he says. "For instance, there are different kinds of disgust -- to physical stimuli, to moral stimuli and so on," he explains.

Cohn and his colleagues have defined 40 facial "action units" -- the smallest visually distinguishable changes in facial appearance -- and have compiled a database of 210 people, with 10 images of each, illustrating different combinations of action units.
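Action-unit coding lends itself to a simple data structure: an observed face becomes a set of active units, matched against known combinations. The unit descriptions below follow the Facial Action Coding System convention, but the pattern table is a simplified assumption for illustration, not Cohn's database.

```python
# A small subset of facial action units (AU numbers per the Facial
# Action Coding System).
ACTION_UNITS = {
    1:  "inner brow raiser",
    6:  "cheek raiser",
    12: "lip corner puller",
    15: "lip corner depressor",
}

# Combinations of action units, not single units, signal an expression.
EXPRESSION_PATTERNS = {
    frozenset({6, 12}): "joy",       # cheek raise + lip corners pulled up
    frozenset({1, 15}): "sadness",   # inner brows raised + lip corners down
}

def label_expression(active_aus: set[int]) -> str:
    """Return the first expression whose AU pattern is fully present."""
    for pattern, label in EXPRESSION_PATTERNS.items():
        if pattern <= active_aus:
            return label
    return "unknown"

print(label_expression({6, 12}))  # -> "joy"
```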

Cohn says he hopes to see the technology refined so it can be used reliably to diagnose people with mental disorders and assess the efficacy of treatment. He says it might also be used as an adjunct in lie-detector tests and in security systems that attempt to identify people by their faces.

Meanwhile, IBM Corp. is working on computer recognition of emotional expressions at the Almaden Research Center in San Jose. Through its Blue Eyes project, it's developing algorithms for "affect detection" based on the position of the eyebrows and mouth corners.

IBM is also trying to perfect an "emotion mouse" that will determine users' emotional states by measuring pulse, temperature, general somatic activity and galvanic skin response. The company has mapped those measurements for anger, fear, sadness, disgust, happiness and surprise.
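One way such a mapping might be queried, sketched under the assumption that each emotion is a centroid in the four-channel measurement space and a reading is assigned to the nearest one. All the numbers here are invented placeholders; the article does not publish IBM's measured values.

```python
import numpy as np

# Illustrative per-emotion centroids over four physiological channels:
# (pulse, skin temperature, general somatic activity, galvanic skin
# response), each normalized to the 0..1 range.
CENTROIDS = {
    "anger":     np.array([0.9, 0.7, 0.8, 0.9]),
    "fear":      np.array([0.8, 0.3, 0.7, 0.8]),
    "sadness":   np.array([0.3, 0.4, 0.2, 0.3]),
    "happiness": np.array([0.6, 0.6, 0.5, 0.4]),
}

def nearest_emotion(reading: np.ndarray) -> str:
    """Map one normalized sensor reading to the closest emotion centroid."""
    return min(CENTROIDS, key=lambda e: np.linalg.norm(reading - CENTROIDS[e]))

print(nearest_emotion(np.array([0.85, 0.65, 0.75, 0.85])))  # -> "anger"
```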

The idea is to have the computer adopt a working style that fits a user's personality. It might, for example, offer to present a different kind of display if it senses that the user is frustrated.

IBM says computers would be much more powerful if they had a small fraction of the perceptual ability of animals or humans. The aim of Blue Eyes is to enable humans and computers to work together as partners.
