Neuroscientist Uses Brain Scan to See Lies Form

Dr. Daniel Langleben says lies aren't created out of thin air; instead, your brain has to think of the truth and then make a decision to do the opposite. And that activity shows up on a functional MRI scan of the brain.



This is the second in a three-part report.

This image shows average brain regions for 22 subjects during testing. Blue areas represent brain regions more active when telling the truth; red areas, when lying. Courtesy Dr. Daniel Langleben

Daniel Langleben, a psychiatrist and neuroscientist at the University of Pennsylvania, might go down in history as the man who revolutionized lie detection.

Instead of wiring someone up to a machine like the polygraph, which measures the anxiety thought to accompany deception, Langleben has skipped a step: He is looking right into the brain to track a lie while it is taking shape.

Langleben, an Israeli immigrant with ceaseless energy, had never intended to build a modern-day lie-detection machine. His interest in deception came from work he had done with children suffering from attention deficit disorder (ADD). All the research indicated that children with ADD were terrible liars because they couldn't help but blurt out the truth.

Langleben thought this might have to do with their lack of impulse control, and from that, he reasoned that lying was essentially harder than telling the truth. One had to have good impulse control to lie; otherwise, the truth came out first. That insight led him to develop a way to track a lie as it forms in the brain, using a functional magnetic resonance imaging machine, or fMRI.

"The key point is that you need to exercise a system that is in charge of regulating and controlling your behavior when you lie more than when you just say the truth," Langleben said. "Three areas of the brain generally become more active during deception: the anterior cingulate cortex, the dorsolateral prefrontal cortex and the parietal cortex."

The anterior cingulate cortex is thought to be in charge of monitoring errors. The dorsolateral prefrontal cortex is thought to control behavior. The parietal cortex processes sensory input. What Langleben and his team saw on the fMRI scans was increased blood flow to those parts of the brain, indicating they were working harder.

fMRI in Action

The fMRI machine is essentially a regular MRI machine souped up with a computer program and mathematical formulas that manipulate the pictures the MRI is taking. The machine itself looks like any hospital MRI: There is an enormous white band magnet and a stretcher on rollers that slides the patient inside.

We know from middle-school science class that every atom contains electric charges — negative for electrons and positive for protons. The fMRI essentially magnetizes certain molecules in the brain and makes them resonate. Using complicated mathematical formulas, Langleben and his team can get the fMRI to distinguish blood carrying oxygen from blood without it, and to track how that blood moves around the brain. A computer then captures those brain images in real time.

What makes the functional MRI machine so special is that it synchronizes images of the brain with the specific actions associated with them, so people like Langleben can actually see the brain at work.
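The idea of synchronizing brain images with the actions that produced them can be illustrated with a toy "block design" analysis, the standard approach in fMRI studies of this kind: the subject alternates between task blocks (here, "truth" and "lie"), and the signal recorded during one condition is compared against the other. The sketch below is purely illustrative — the signal values, block layout, and activation size are invented for demonstration, and nothing here reflects Langleben's actual analysis pipeline.

```python
import random
from statistics import mean

# Toy block-design contrast: simulate one voxel's BOLD signal across
# alternating "truth" and "lie" blocks, then compare condition means.
# All numbers are made up for illustration.
random.seed(0)

# Four cycles of 5 "truth" scans followed by 5 "lie" scans (40 scans total)
blocks = (["truth"] * 5 + ["lie"] * 5) * 4

# Simulated signal: baseline around 100 with noise, plus a small
# extra activation (+2.0) during "lie" blocks
signal = [random.gauss(100.0, 1.0) + (2.0 if c == "lie" else 0.0)
          for c in blocks]

# The core of a block-design analysis: average each condition and subtract
truth_mean = mean(s for s, c in zip(signal, blocks) if c == "truth")
lie_mean = mean(s for s, c in zip(signal, blocks) if c == "lie")
contrast = lie_mean - truth_mean

print(f"truth mean: {truth_mean:.2f}")
print(f"lie mean:   {lie_mean:.2f}")
print(f"contrast (lie - truth): {contrast:.2f}")
```

In a real study, this comparison is run for every voxel in the brain and thresholded statistically, which is how maps like the blue-and-red image above are produced.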

It is by studying the fMRI pictures that Langleben has come to the conclusion that lying increases blood flow in key areas of the brain. As he sees it, lies aren't created out of thin air. Instead, he believes your brain has to think of the truth and then make a decision, in a sense, to do the opposite. If you are instructed to say "the sky is green," Langleben believes your brain first thinks about the sky's true color, blue, before going with the falsehood. That process shows up on the fMRI scan.

No Lie

If Langleben is all about the science of lie detection, Joel Huizenga, the president of a company called No Lie, is all about the business of doing so. His sales pitch is simple: "What we are able to do is look inside people's brains and verify that they are telling the truth."

Huizenga's hyperbole is that of a salesman. If what he claims is true, imagine what it would mean for law enforcement. Why would they need to turn the screws on a suspect if they could just put him into an fMRI machine and find out for sure whether he is lying or telling the truth? Some scientists say it isn't that easy. The technology isn't there. The tests done so far with the functional MRI are carefully scripted. The technology requires a willing subject who will lie still and, essentially, who wants to be imaged. And, at this point, Huizenga said people who fall into that category seem to be interested in just one thing: sex.

"We have had a huge number of people contact us with regard to sexuality," he said. "In other words: 'I am being faithful to my partner, but he doesn't believe me.' That's a common complaint. Interestingly, it is mostly women who are calling and asking to do this."

Huizenga says No Lie has received hundreds of requests from people in relationships who want to pay $10,000 for an fMRI scan that proves their fidelity. And Huizenga is the first to admit this wasn't what he had in mind when he licensed Langleben's fMRI technology from the University of Pennsylvania.

He thought No Lie would attract more people like Harvey Nathan, a South Carolina businessman who was accused of burning down his own deli to collect the insurance money.

Nathan was accused of setting fire to his restaurant in 2003. A judge eventually dismissed the charges, but Nathan's insurance company wouldn't pay up. Nathan read about the fMRI in the newspaper and thought it could provide proof positive that he had nothing to do with the deli fire.

Nathan had thought about subjecting himself to a polygraph, but experts he talked to said the machine was too unpredictable and could end up indicating he was lying even when telling the truth. Most courts don't view the polygraph as a scientific test, so taking one — and getting some inconclusive result — wasn't worth the risk, Nathan concluded.

Because the fMRI seemed more scientific, last year Nathan flew to California, climbed into the No Lie fMRI machine and answered questions about the fire. The scan indicated that he wasn't lying, and he passed the test.

"If I hadn't passed, I would have jumped from the 17th floor window of the hotel where I was staying," Nathan said. "How could I have gone back to South Carolina if I hadn't passed?"

Buoyed by the results of the brain scan, Nathan said he is planning to do two more with other companies. He also plans to go for a polygraph test.

"Once that is all done, then I would have no problems walking into a court and saying with all of this, how can it be doubted?" he said.

Ethical Issues

It should come as no surprise that with all this talk of looking into the brain to see lies, government agencies are interested in this fMRI technology as well. Imagine if the FBI or the CIA had a test that could say definitively if someone were lying. Imagine not just how that could help uncover spies (the polygraph is still in use for that), but also how it could be used with terrorists and common criminals. Why would you need coercive interrogation techniques if you could watch a lie forming in the brain?

No Lie's Huizenga said he wants to stick to the exoneration business.

"We just want to test individuals who want to be tested, and test them in areas they want to be tested," Huizenga said. "If government agencies want to play that game, we're more than willing to do testing for them. But what we see is that the government agencies would rather be in complete control of the things they do."

For all its early promise, the fMRI isn't perfect. For one, the machine is expensive. And it can be squirrelly. One swallow or just one tightening of the muscles in the neck can throw off the whole test. Nathan, knowing this, said he had never been so still in his life as he was during his fMRI test.

The point for those who see a broader application, though, is that someone who doesn't want to be tested, such as a terrorist, isn't a candidate for this technology as it now stands. Such a subject would only have to swallow to make the test null and void. And what about people who actually believe the lies they are telling? The fMRI wouldn't be able to identify them.

And, as this test becomes better known, ethical questions are surfacing. Companies like No Lie are selling a service, not medicine. So what happens if a medical problem presents itself?

"What happens if No Lie or one of those companies images someone and finds a brain tumor?" said Paul Root Wolpe, a bioethicist at the University of Pennsylvania who has been working with Langleben as he has developed the technology. "No Lie is not a medical institution. They don't have the resources to take care of that. That is something that needs to be carefully thought out before any of these technologies are used in the public."

And public use is the next natural step. But so far, the studies have been limited. The functional MRI has largely been used on undergraduates in research settings. It hasn't been tested on criminals, con men or good poker players, so it is unclear whether the technology would work on everyone and, more fundamentally, whether it is the lie detector everyone is waiting for.