When it comes to reading people, scientific studies have revealed helpful strategies for situations ranging from playing poker to identifying gonorrhea-infected people by smell alone. But this study might prove even more useful. Here, researchers show that it is possible to distinguish people who are faking pain from those who are actually experiencing it. And although people can be trained to improve their ability to tell the two apart, they have nothing on computer vision: apparently, when it comes to pain, computers are better at identifying when facial expressions are forced and when they are involuntary. Are we one step closer to a Torture Bot? Only time will tell…
Automatic Decoding of Facial Movements Reveals Deceptive Pain Expressions.
“In highly social species such as humans, faces have evolved to convey rich information for social interaction, including expressions of emotions and pain. Two motor pathways control facial movement: a subcortical extrapyramidal motor system drives spontaneous facial expressions of felt emotions, and a cortical pyramidal motor system controls voluntary facial expressions. The pyramidal system enables humans to simulate facial expressions of emotions not actually experienced. Their simulation is so successful that they can deceive most observers. However, machine vision may be able to distinguish deceptive facial signals from genuine facial signals by identifying the subtle differences between pyramidally and extrapyramidally driven movements. Here, we show that human observers could not discriminate real expressions of pain from faked expressions of pain better than chance, and after training human observers, we improved accuracy to a modest 55%. However, a computer vision system that automatically measures facial movements and performs pattern recognition on those movements attained 85% accuracy. The machine system’s superiority is attributable to its ability to differentiate the dynamics of genuine expressions from faked expressions. Thus, by revealing the dynamics of facial action through machine vision systems, our approach has the potential to elucidate behavioral fingerprints of neural control systems involved in emotional signaling.”
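For the curious, here is a minimal sketch of the kind of pipeline the abstract describes: summarize the dynamics of facial action unit (AU) intensity trajectories, then run a standard classifier over those dynamic features. Everything below is an illustrative assumption run on synthetic data, not the authors' actual system (which measured facial movements automatically from video); the feature set and the toy data generator are hypothetical.

```python
# Illustrative sketch only: classify genuine vs. faked expressions from
# the *dynamics* of facial action unit (AU) intensity trajectories.
# The features and synthetic data below are assumptions, not the
# authors' actual method.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def dynamic_features(au):
    """Summarize the dynamics of one AU intensity time series."""
    velocity = np.diff(au)
    return np.array([
        au.max(),                # peak intensity
        np.abs(velocity).max(),  # peak speed of the movement
        velocity.std(),          # jerkiness of the trajectory
    ])

def synthesize(n, faked):
    """Toy AU trajectories: faked expressions modeled as faster-rising
    and more stereotyped (a hypothetical stand-in for pyramidal vs.
    extrapyramidal dynamics)."""
    t = np.linspace(0, 1, 50)
    X = []
    for _ in range(n):
        rise = 12.0 if faked else rng.uniform(3.0, 7.0)
        traj = 1 - np.exp(-rise * t) + rng.normal(0, 0.02, t.size)
        X.append(dynamic_features(traj))
    return np.array(X)

X = np.vstack([synthesize(100, faked=False), synthesize(100, faked=True)])
y = np.array([0] * 100 + [1] * 100)

# "Pattern recognition on those movements": any standard classifier works.
clf = SVC(kernel="rbf")
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

The design point lives in the features: a classifier trained on static snapshots would miss exactly the onset-speed and smoothness cues that, per the abstract, separate voluntary (pyramidal) from involuntary (extrapyramidal) facial movements.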