"We are not an emergent property of a mechanical universe but the seasonal activity of a living cosmos."
"Hidden meaning transforms unparalleled abstract beauty."
Which is more profound?
The first statement comes from Deepak Chopra's Twitter feed. The second statement was generated through a website that takes words from Chopra's Twitter feed and arranges them more or less at random, preserving a grammatical sentence structure. But, frankly, they might both strike you as nonsense — or what a new paper, authored by Gordon Pennycook and colleagues and published in the journal Judgment and Decision Making, calls "pseudo-profound bullsh*t."
The paper reports four studies in which over 800 participants recruited on a college campus or through the web were asked to rate how profound they found a variety of statements, including those above. Participants also completed additional tests to find out what makes some people more susceptible to pseudo-profound nonsense than others.
So, what do these studies reveal about individual differences in receptivity to the pseudo-profound? Or, put differently, what makes someone a good nonsense detector?
In the most compelling study, participants rated statements of pseudo-profound nonsense as well as motivational quotes, which were arguably at least a little profound. For example, one of the motivational quotes was:
"A river cuts through a rock, not because of its power but its persistence."
Participants who were good at differentiating these two kinds of statements — that is, those who tended to rate the real motivational quotes as more profound than the pseudo-profound statements — were also more likely to be analytic and reflective thinkers, and to be skeptical of paranormal and superstitious claims, like "astrology is a way to accurately predict the future," or "black cats can bring bad luck." This makes sense — the ability to differentiate the profound from the pseudo-profound, and scientific from pseudo-scientific claims, requires critical evaluation, which itself depends on analytic thinking.
But the studies by Pennycook and colleagues have been covered by the media in ways that go well beyond the actual findings. For instance, the Daily Mail's headline reads: "People who post inspirational quotes on Facebook and Twitter 'have lower levels of intelligence.'" In response to an article in the Cornish Guardian, Chopra himself tweeted, "Dear friends a 'scientific' study shows we ( I and you ) have lower intelligence."
In fact, the studies didn't measure whether people posted inspirational quotes on social media. And while they did include measures of intelligence, only a measure of "cognitive style" — the tendency to think more analytically versus heuristically — predicted people's ability to differentiate meaningfully profound statements from the pseudo-profound foils. More canonical measures of intelligence — including measures of verbal and fluid intelligence — did not predict this ability to differentiate the profound from the pseudo-profound. Instead, two of the studies found that these measures of intelligence predicted an overall tendency to rate the provided pseudo-profound statements as profound, which could reflect a general tendency to find everything profound, or just to rate things highly, without reflecting any special receptivity to pseudo-profound nonsense.
Overblown headlines aside, these new findings do highlight the importance of developing good "baloney detection" skills (to use Carl Sagan's phrase), and suggest some characteristics that make people better and worse baloney detectors. In particular, analytic thinking and some healthy skepticism could go a long way towards helping people detect and reject the pseudo-profound. But those who fall prey to it don't necessarily have "lower intelligence." Differentiating the profound from the pseudo-profound can require more than raw smarts — it can also require special expertise and some confidence in one's own understanding.
For instance, what's the "seasonal activity of a living cosmos"? It sounds like nonsense to me, but so do some claims from quantum mechanics. I'm willing to dismiss the former as "mere baloney" and not the latter because I make different attributions in each case. In the former case, I assume there's a problem with the statement or its source: It's nonsense. When it comes to quantum mechanics, though, I attribute my lack of understanding to personal ignorance — to insufficient expertise. Some people may be susceptible to the pseudo-profound because they assume the superficial signs of profundity — big, abstract words, vague claims — are a good reason to attribute any lack of understanding to personal ignorance — not to a problem with the source.
Consider two relevant examples from past research. In one study, participants judged scientific abstracts to be of higher quality when they contained totally irrelevant math. In other studies, people were more compelled by scientific explanations that contained neuroscientific jargon or irrelevant neuroscience than by those that left it out. In both cases, experts were not so fooled. These findings don't necessarily imply that non-experts are stupid. They simply aren't experts and, therefore, aren't in a good position to evaluate whether the math or neuroscience is an important part of the story. They rely on superficial cues — it looks sciencey! — in the absence of the background knowledge required for deeper evaluation.
In the case of pseudo-profound nonsense, something similar could be at work. Some people may be wooed by fancy-sounding words about deep and abstract stuff, whether or not they amount to a coherent claim, because they rely on looking profound as a cue to being profound. They don't have the know-how or motivation to engage in deeper evaluation. That's a real failing, but it shouldn't be dismissed as a mere quirk of "lower intelligence."
If you're not convinced, consider one final statement:
"Bullsh*t is a consequential aspect of the human condition."
Profound insight? Or pseudo-profound bullsh*t? In fact, this statement comes from the paper by Pennycook and colleagues — it's the first sentence of the paper's conclusion, where they try to make the case that bullsh*t is ubiquitous, and that the ability to detect it is important.
Knowing whether or not they're onto something requires some intelligence — but also some basic knowledge of the scientific context for the claim. It wouldn't be hard for outsiders to mistake a substantive claim for nonsense.
Tania Lombrozo is a psychology professor at the University of California, Berkeley. She writes about psychology, cognitive science and philosophy, with occasional forays into parenting and veganism. You can keep up with more of what she is thinking on Twitter: @TaniaLombrozo