The conversation earlier today between Morning Edition co-host Steve Inskeep and NPR's Shankar Vedantam about software that can reportedly detect when a CEO might be trying to hide something during a conference call with investment analysts sent us off on a search for more about the research that Shankar was discussing.
Layered Voice Analysis technology, according to researchers from Duke University and the University of Illinois, seems to be able to pick up on the "vocal dissonance markers" in the tone of a CEO's voice that signal he or she might be shading the truth, trying not to say something or even lying. And the technology, according to a paper the researchers have produced, seems to do a better job of that than humans — the analysts taking part in such conference calls — can do.
Cognitive dissonance, as researchers Jessen Hobson, William Mayew and Mohan Venkatachalam say, "is a state of psychological arousal and discomfort occurring when an individual takes actions that contrast with a belief, such as cheating while believing oneself to be honest."
They caution that:
"LVA is an emerging technology and, as with most commercial products, its inner workings are proprietary. While our laboratory results suggest the LVA dissonance metrics capture aspects of the construct of cognitive dissonance, we are unable to document the mechanisms by which LVA is able to do so."
And where does this LVA technology come from? The researchers turned to the Israeli firm Nemesysco, which has posted videos on YouTube to show what its technology can reportedly do.
If this all sounds like the plot of a TV show, you're right: Fox's now-canceled Lie to Me.
Correction at 2:23 p.m. ET, Feb. 3: We misspelled William Mayew's last name in the original version of this post. Our apologies. It's been corrected above.