Do you smile when you’re frustrated? Most people think they don’t — but they actually do, a new study from MIT has found. What’s more, it turns out that computers programmed with the latest information from this research do a better job of differentiating smiles of delight and frustration than human observers do ...
The research could pave the way for computers that better assess the emotional states of their users and respond accordingly. It could also help train those who have difficulty interpreting expressions, such as people with autism, to more accurately gauge the expressions they see.
“The goal is to help people with face-to-face communication,” says Ehsan Hoque, a graduate student in the Affective Computing Group of MIT’s Media Lab who is lead author of a paper just published in the IEEE Transactions on Affective Computing. Hoque’s co-authors are Rosalind Picard, a professor of media arts and sciences, and Media Lab graduate student Daniel McDuff.
In experiments conducted at the Media Lab, people were first asked to act out expressions of delight or frustration, as webcams recorded their expressions. Then, they were either asked to fill out an online form designed to cause frustration or invited to watch a video designed to elicit a delighted response — also while being recorded.
When asked to feign frustration, Hoque says, 90 percent of subjects did not smile. But when presented with a task that caused genuine frustration — filling out a detailed online form, only to then find the information deleted after pressing the “submit” button — 90 percent of them did smile, he says. Still images showed little difference between these frustrated smiles and the delighted smiles elicited by a video of a cute baby, but video analysis showed that the progression of the two kinds of smiles was quite different: Often, the happy smiles built up gradually, while frustrated smiles appeared quickly but faded fast.
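The paper itself does not publish its classification code, but the idea of separating the two smiles by their timing rather than their appearance can be illustrated with a minimal sketch. Everything below is hypothetical: it assumes a per-frame smile-intensity signal has already been extracted from video, and the `temporal_features` helper, the synthetic training curves, and the choice of logistic regression are illustrative stand-ins, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def temporal_features(intensity):
    """Summarize a smile-intensity time series (one value per video frame).

    Returns rise time to peak, peak value, and average decay per frame after
    the peak -- the kind of dynamics the study reports as differing between
    delighted smiles (gradual build-up) and frustrated smiles (fast onset,
    fast fade).
    """
    intensity = np.asarray(intensity, dtype=float)
    peak_idx = int(np.argmax(intensity))
    peak = intensity[peak_idx]
    rise_time = peak_idx + 1                                  # frames until the peak
    tail = intensity[peak_idx:]
    decay_rate = (peak - tail[-1]) / max(len(tail) - 1, 1)    # intensity drop per frame
    return [rise_time, peak, decay_rate]

# Toy training data: slow-building "delighted" curves vs. quick-spiking,
# quickly fading "frustrated" curves (purely synthetic, for illustration).
delighted = [np.linspace(0.0, 1.0, 60) for _ in range(20)]
frustrated = [np.concatenate([np.linspace(0.0, 1.0, 10),
                              np.linspace(1.0, 0.1, 50)]) for _ in range(20)]

X = np.array([temporal_features(s) for s in delighted + frustrated])
y = np.array([1] * 20 + [0] * 20)                             # 1 = delighted, 0 = frustrated

clf = LogisticRegression().fit(X, y)
print(clf.predict([temporal_features(np.linspace(0.0, 1.0, 60))]))  # -> [1], i.e. "delighted"
```

The point of the sketch is simply that a single still frame gives the classifier only the peak value, while the time series also exposes rise time and decay, which is where the two kinds of smiles diverge.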
In such experiments, researchers usually rely on acted expressions of emotion, Hoque says, which may provide misleading results. “The acted data was much easier to classify accurately” than the real responses, he says. But when trying to interpret images of real responses, people performed no better than chance, assessing these correctly only about 50 percent of the time.
Understanding the subtleties that reveal underlying emotions is a major goal of this research, Hoque says. “People with autism are taught that a smile means someone is happy,” he says, but research shows that it’s not that simple.
While people may not know exactly what cues they are responding to, timing has a lot to do with how people interpret expressions, he says. For example, former British prime minister Gordon Brown was widely seen as having a phony smile, largely because of the unnatural timing of his grin, Hoque says. Similarly, a campaign commercial for former presidential candidate Herman Cain featured a smile that developed so slowly — it took nine seconds to appear — that it was widely parodied, including a spoof by comedian Stephen Colbert. “Getting the timing right is very crucial if you want to be perceived as sincere and genuine with your smiles,” Hoque says.
Jeffrey Cohn, a professor of psychology at the University of Pittsburgh who was not involved in this research, says this work “breaks new ground with its focus on frustration, a fundamental human experience. While pain researchers have identified smiling in the context of expressions of pain, the MIT group may be the first to implicate smiles in expressions of negative emotion.”
Cohn adds, “This is very exciting work in computational behavioral science that integrates psychology, computer vision, speech processing and machine learning to generate new knowledge … with clinical implications.” He says this “is an important reminder that not all smiles are positive. There has been a tendency to ‘read’ enjoyment whenever smiles are found. For human-computer interaction, among other fields and applications, a more nuanced view is needed.”
In addition to providing training for people who have difficulty with expressions, the findings may be of interest to marketers, Hoque says. “Just because a customer is smiling, that doesn’t necessarily mean they’re satisfied,” he says. And knowing the difference could be important in gauging how best to respond to the customer, he says: “The underlying meaning behind the smile is crucial.”
The analysis could also be useful in creating computers that respond in ways appropriate to the moods of their users. One goal of the Affective Computing Group’s research is to “make a computer that’s more intelligent and respectful,” Hoque says.
The work was supported by Media Lab consortium sponsors and by Procter & Gamble Co.
Source: web.mit.edu