This is the interview with Gordon Pennycook from episode 066 of the You Are Not So Smart Podcast.
Gordon Pennycook and his team at the University of Waterloo set out to discover if there was a spectrum of receptivity to a certain kind of humbug they call pseudo-profound bullshit – the kind that sounds deep and meaningful at first glance, but upon closer inspection means nothing at all. They wondered, is there a “type” of person who is more susceptible to that kind of language, and if so, what other things about personalities and thinking styles correlate with that tolerance and lack of skepticism, and why?
David: So at this point, since we’re talking about science, and we’re talking about bullshit – you might be wondering, what is the scientific definition of bullshit? Well, Gordon Pennycook and his team deferred to the lengthy definition put forth by the philosopher Harry Frankfurt at Princeton. It contains lines like, “The phenomenon itself is so vast and amorphous that no crisp and perspicuous analysis of its concept can avoid being procrustean.” It’s actually fantastic, and it’s free online. I’ll have a link for you at the website. So why would a world-renowned Princeton analytic philosopher write a lengthy essay attempting to define precisely what bullshit actually is? Well, here’s Frankfurt’s explanation from an interview he gave to Princeton University Press a few years ago.
Harry: It was largely because I think that respect for the truth and a concern for the truth – these are among the foundations of civilization.
David: And according to Frankfurt, a lack of concern for what bullshit is, and how to avoid it, is something that can really threaten a society. Frankfurt’s 1986 essay was published as a tiny book in 2005. And in it, Frankfurt laid out exactly what bullshit is, and what it is not.
Gordon: And it became a best-seller actually. So we actually just took his definition, so we didn’t have to do it ourselves.
David: And that definition is very clear on one thing. Bullshit is not the same as lying. Though it may contain lies mixed with truths.
Harry: Well, it consists in a lack of concern for the difference between truth and falsity. The motivation of the bullshitter is not to say things that are true, or even to say things that are false, but to serve some other purpose. And the question of whether what he says is true or false is really irrelevant to his pursuit of that ambition.
Gordon: Bullshit is something constructed without any concern for the truth. Okay, so that’s different from lying. Lying is actually very concerned with the truth – it’s only concerned with subverting it. But in order for someone to lie, they have to think they know what the truth is. For a bullshitter, it’s irrelevant. I think it’s just not a part of the goals of the communication.
David: Frankfurt wrote, “It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction.” So the goal of the bullshitter, according to Frankfurt and Pennycook, is to impress rather than inform. To elevate one’s status in the eyes of others, without any concern for the accuracy or validity of the statements made toward that goal.
Gordon: The bullshitter wants to convince you of something or – in the context of Twitter to get retweets and likes and follows and stuff.
David: So if you want to study the psychology of people and the presence of bullshit, there is a hurdle you must first overcome. And that is the fact that bullshit comes in many flavors. I could’ve worded that better. So not flavors, let’s say – varieties. Bullshit comes in many varieties. And the kind of bullshitting that you do commonly and routinely with your friends – it’s just too ambiguous, too lacking in vividness, and too open to interpretation for scientific testing. So Pennycook and his team decided to focus on one specific kind of bullshit: pseudo-profound bullshit. Which is why they turned to the program created by Seb Pearce – the one that can generate an endless supply of randomly generated statements that seem profound on the surface, but are actually nonsense.
Gordon: I mean, because bullshit hasn’t been investigated psychologically, we wanted to find what might be the most extreme example. The sentences are just random words put together. You might be able to discern some sort of meaning if you tried hard, but there’s obviously no intended meaning. So if we can tell you that anything is definitively bullshit, those sentences are – which is why we stuck with the pseudo-profound kind. It implies that there’s something profound when it’s the complete opposite. There’s nothing at all in the sentence.
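The generator Gordon describes works by slotting random vocabulary into grammatical templates, so every sentence parses but none of them means anything. As a rough illustration only – the word lists and templates below are invented for this sketch, not taken from Seb Pearce’s actual generator – the idea fits in a few lines of Python:

```python
import random

# A minimal sketch of a pseudo-profound sentence generator: random words
# dropped into grammatical templates. Vocabulary and templates here are
# illustrative stand-ins, not Seb Pearce's actual data.

NOUNS = ["consciousness", "intention", "the cosmos", "potentiality", "stillness"]
ADJECTIVES = ["hidden", "infinite", "quantum", "unbridled", "sacred"]
VERBS = ["transforms", "illuminates", "transcends", "unfolds", "awakens"]

TEMPLATES = [
    "{adj} meaning {verb} {noun}.",
    "we exist as a projection of {adj} {noun}.",
    "{noun} {verb} the {adj} truth within us.",
]

def generate(rng=None):
    """Return one grammatically plausible but meaningless sentence."""
    rng = rng or random.Random()
    sentence = rng.choice(TEMPLATES).format(
        adj=rng.choice(ADJECTIVES),
        noun=rng.choice(NOUNS),
        verb=rng.choice(VERBS),
    )
    return sentence[0].upper() + sentence[1:]

print(generate())
```

Each output is syntactically well-formed, which is exactly what gives the sentences their surface “appearance of meaning” that Gordon’s subjects had to see through.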
David: With a tool in hand for creating pseudo-profound bullshit, Pennycook and his team set out to create a sort of scale of receptivity – a spectrum of personality traits, test scores, responses to other kinds of information, judgements in other realms, and so on – that together could be used to predict a person’s ability to detect bullshit, and determine one’s propensity to view nonsensical pseudo-profound statements as being deep and meaningful. Now, you might be thinking that this is just another way to measure the relative stupidity of different individuals. But raw intelligence actually isn’t as much of a factor as you might think when it comes to falling for woo-woo and conspiracy theories, the supernatural, cryptozoology, and all the rest.
Gordon: So in terms of intelligence, you can have a lot of computational ability – you could be a brilliant physicist. But if you’re not willing to think in an analytic way about problems when you come across them in your everyday life, then you won’t, practically speaking, be very intelligent. So the thing about these bullshit statements is that they have the appearance of meaning. Which means, in order to realize that they’re vacuous, you have to stop and think about what the words actually mean in relation to each other.
David: So let’s go back to that question from before the break. If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets? This is one of the questions that Pennycook’s team asked of the research subjects, and it’s part of a test used in psychology called the cognitive reflection test. So if you thought about this, what did you come up with? Well, if you have a normal human brain, the first thing that popped into your mind was probably 100 minutes. That’s the first thing that pops into my mind. It’s an intuitive reaction, you see, to this pattern of x, x, x and y, y, y. The 5, 5, 5 makes you think 100, 100, and you just sort of want to plug in that last 100. It appears in your mind without any conscious effort. It feels right, because the human brain often confuses ease of thought with probability of accuracy. By the way, that’s called the availability heuristic. We can talk about that another time. So if you think about this – if you don’t trust your gut and you actually think about it – you can easily work out that 5 machines, 5 minutes, 5 widgets means that each machine is just making 1 widget every 5 minutes. Which means if you had 10 machines, then every 5 minutes you’d get 10 widgets. And if you had 50 machines, every 5 minutes you would get 50 widgets. And that means the answer to how long it would take 100 machines to make 100 widgets is 5 minutes. Not 100 minutes. But you would have to not trust your gut to get to that answer. All the questions on the cognitive reflection test are like this. And that’s why Gordon and his team used it as a way of establishing just how analytical his subjects were before he presented them with the statements from the random generator.
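The gut-versus-math contrast above can be checked mechanically. Here is a small sketch (the function name and parameters are mine, not from the episode) that computes parallel production time from the per-machine rate – the quantity the intuitive “100 minutes” answer overlooks:

```python
import math

# Each machine makes 1 widget every 5 minutes, so the per-machine rate is
# what matters -- not the raw counts that the intuitive "100 minutes" answer
# pattern-matches on.

def minutes_needed(machines, widgets, minutes_per_widget=5):
    """Minutes for `machines` machines, working in parallel, to finish `widgets`."""
    rounds = math.ceil(widgets / machines)  # full rounds of parallel production
    return rounds * minutes_per_widget

print(minutes_needed(5, 5))      # 5 minutes
print(minutes_needed(100, 100))  # still 5 minutes, not 100
print(minutes_needed(10, 50))    # 25 minutes
```

As long as the number of machines keeps pace with the number of widgets, the answer never changes: one round, five minutes.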
Gordon: Basically, they’re just math problems that have an intuitive response that’s incorrect. So to get the correct answer, you have to question your initial intuition on the problem – which feels as though it’s right – and think more about the problem. And it’s also a sort of measure of numeracy, because you have to be able to do the math. But the thing that’s interesting with the problems is that the math is actually relatively simple.
David: The scientists also tested the subjects’ verbal abilities, their need for cognition, their numeracy skills, their verbal intelligence, and several other traits – including something called ontological confusion.
Gordon: So basically, an ontological confusion is a confusion of any 2 ontological categories. One example would be extrasensory perception – that’s, I can control people’s behavior with my thoughts, or I have access to their thoughts via my thoughts, something like that. That’s mistaking the physical for the mental, and those are 2 ontological categories. It’s something that underlies most – or I guess all – supernatural beliefs. Every supernatural belief involves 2 categories being conflated, I guess. So the way that we measure it – you give people statements like, “The house has memories,” and they rate how literal or metaphorical it is.
David: So participants would rate on a scale of 1 to 5 – 1 being, “This is a fully metaphorical statement,” and 5 being, “This is a fully literal statement” – things like, “Friends are the salt of life.” Or, “Wayne Gretzky was a hockey player.” Or, “A rock lives for a long time.” And after testing many, many different people on all these different scales of personality and scales of cognitive ability, a spectrum started to emerge. On one side you had people who were reflective thinkers, who engaged in what you would call metacognition – they thought about their own thoughts a lot. And on the other end of the spectrum, you have people who just sort of went with it. Who didn’t really think too much about how they thought about the stuff they were thinking about.
Gordon: The grouping of variables was a re-run of prior studies. So we knew that people who are more analytical and more intelligent tend to be less religious, and they aren’t as ontologically confused. They may have lower paranormal beliefs, and stuff like that. So it was kind of like – they all tell the same sort of story, in the sense that people who are more skeptical, and who do better on those kinds of reasoning problems, are more skeptical about things like religious beliefs. And they’ll be more skeptical about the bullshit we present them.
David: So a question comes up here for me. These people who tend to be more reflective thinkers, who tend to be less prone to jumping to intuitive estimations of answers, who tend to be more likely to think about their own thinking, less ontologically confused – all that stuff – does that seem to be something that a person develops as a skill? I guess I’m asking, is that more of a nature or a nurture thing? Are we bad at this stuff by default and learn to be not bad at it? Or do we learn to be bad, and then learn to be good?
Gordon: Almost certainly both. But there’s no real good answer to that, because there isn’t enough research on it. So we don’t really know, for example, what the genetic basis of thinking dispositions and stuff like that is. There’s work on the genetic basis of intelligence, and there’s work on the genetic basis of religious belief, and all that. But because these are all high-level categories, it gets very complicated to untangle the connections between them. And then there’s always – the rules are violated more often than they are not. You know what I mean?
Gordon: People who are more analytical are less religious, for example. But if I just took religious belief by itself, and used that to predict how smart people are, I wouldn’t be doing very well, right? It’s not that strong of a correlation, right?
David: Yeah, that’s what I was wondering. ’Cause there’s a tendency to think there’s a kind of person that’s going to end up being very superstitious, a kind of person that’s going to end up being very religious, and a kind of person who’s going to be very prone to conspiratorial thinking and that kind of stuff. And then there’s another way of looking at it, which is thinking that these are thinking habits – these are tendencies, the ways that you have learned to make sense of the world. And I’m just wondering, from your perspective – and I understand this is going to be speculative – do you, over time, get worse at reflective thinking because of the habits that you’re building up? Or, over time, do you get better at reflective thinking because of other habits that you build up? What do you think?
Gordon: That’s my position, at least – but like I said, I don’t really have the evidence for this. It seems to be more of a learned tendency to me. And it’s probably something that’s picked up more in adolescence, you know what I mean? I mean, we had undergrad students, and then we also had an older sample of Americans – but they’re not appreciably old. And I’m pretty sure we pick up certain tendencies as we first learn to think, and then they modify slightly, depending on a bunch of different factors that I have no idea about. But most of this stuff happens in adolescence, I think.
David: Right. Like almost all Nobel Prize winners have average intelligence. Most of the people who accomplish great things aren’t super geniuses – they’re just people who work really hard, and have good self-control and work ethic, and so on and so forth. There’s a tendency to think that people are smart or stupid, and even if we were to judge people that way, it doesn’t seem to matter all that much. And this research in particular made me feel like it’s more about a style of thinking, one that I would suspect is culturally produced and is learned. You learn to think in a certain way. It’s almost the story of science itself – we had to learn how to think in a certain way. Before we did that, we were burning witches, and after, we went to the moon. It was a way of thinking. I want to say, the kind of people that went to the moon were – in my mind – equally intelligent to the kind of people who were burning witches. It was just a different way of thinking about the world.
Gordon: Yeah, I mean, it’s not as if our brains are totally rewired compared to the man of the Middle Ages, right? So I think you’re onto something for sure. And that’s the way I’ve been thinking since I’ve been doing this work, sure.
David: Pennycook told me that his study into what sort of people are most receptive to nonsense masquerading as deep, intellectually nutritious profundity was meant to establish the groundwork for future study. It was meant to be a way to begin working on understanding this aspect of belief itself. And belief – strange as this may sound – isn’t all that well understood, scientifically speaking. So he urged me to remind you that this is just one paper at the beginning of a long investigation. In other words, don’t fall for that trap in science journalism that leads to headlines like, “Study finds people who fall for nonsense inspirational quotes are less intelligent.” Which is a real headline that came out right after he published his research. And which is silly, because the very person who created the program that Pennycook’s team used to measure whether or not people were susceptible to bullshit was someone who used to be susceptible to bullshit. The reason he made the program at all was because he was once a person who was really, really, really into that sort of language and that sort of book and that sort of talk. And he became fascinated with how it actually worked, and what was behind it. So what did Pennycook and his team find? What was the result of all this work? Well, what they discovered was that – with the random sentences generated by the computer program – a little more than a quarter of the people they studied rated those sentences as being actually, truly profound. Subjects were also tested to see how well they did on cognitive reflection, verbal skills, lots of other stuff – how they rated in overall religious beliefs and ontological confusion, which is the fancy way of saying that they checked to see if they believed in the supernatural, things like ESP and ghosts. And the researchers found a big correlation.
That people who were most likely to consider these random, nonsensical phrases as being deep and meaningful were also the people who were the most religious, held the deepest belief in the supernatural, and scored the lowest on the cognitive reflection and verbal intelligence tests. Just to be sure, subjects were also presented with statements that were merely factual, like, “Most people enjoy some kind of music.” And, as a counterpoint, statements that were straightforward but widely considered to actually be profound, like, “A river cuts through a rock not because of its strength, but because of its persistence.” And they found that most people correctly identified the mundane statements as being mundane, and the profound statements as being profound. So what this seems to be showing us is that most people can tell the difference between truly profound statements and truly mundane statements. When it comes to bullshit, we each fall somewhere along a spectrum. And on one end sit people who are intuitive thinkers, who, when confronted with just the right amount of vagueness and implied deepness, will – as neurologist Steven Novella wrote in commenting on this research – quote, “Reflect their own belief onto the statements,” end quote. It’s somewhat similar to how we all kind of make up the meaning of certain ambiguous songs or poems or David Lynch movies. Except, in the case of bullshit, there’s usually someone benefiting from this kind of behavior, and often betting on it. So what is the big takeaway? As we come in for a landing, as we sum this all up, I thought it would be great to bring in someone I deeply respect – psychologist Barbara Drescher – and ask her what she thought about the research, and how we should digest it.
Barbara: So what they found is that people who are lower in cognitive ability are more prone to what they call ontological confusions and conspiratorial ideation. They’re more likely to hold religious and paranormal beliefs, and they’re more likely to endorse complementary and alternative medicine. And those people who are more receptive are less reflective. But those are just correlations – and they’re just correlations to ratings of profundity, to how profound they’re rating these things. It’s about analytical cognitive style – how reflective people are. It’s not so black and white. It’s not so simple as saying, “Well, smart people don’t fall for it.” Yeah, they do.
David: Obviously so.
Barbara: They obviously do. There’s also – and this is what I found the most interesting of their findings – there’s no association between bullshit sensitivity and conspiratorial ideation, except for the acceptance of complementary and alternative medicine. These findings are very much in line with some of the research that looks at personality factors, which finds that people who believe in the paranormal – people who are religious, who believe in ghosts, who believe in psychic phenomena – are somewhat different in their psychological makeup, or in their personality makeup, than people who believe in, say, cryptozoology, conspiracies, alternative medicine, and things like that. And if you think about it, it makes sense. One is about the supernatural, and the other is really about how we evaluate evidence and information that’s real – how we integrate that and make sense of it. So these are completely different types of beliefs, or they stem from different types of processes, so it would make sense that the people are different. What this tells me is that people who believe in ghosts and paranormal, supernatural phenomena, on average – and it’s a loose correlation in some cases, though some of these are actually pretty high correlations, so I should qualify that – differ in the way they separate regular statements from things that sound profound. And that may be a desire to see the world as being more meaningful and more profound than people like me want to see the world.
David: It’s really, really great to – if nothing else – see that nuance. I think it’s very easy to lump conspiracy theorists and homeopaths and all that kind of stuff in with people who believe in ghosts or unicorns or whatever – to lump it all together under that big umbrella of woo. But as with everything, once you really dig in, you go, “Oh no, this is very nuanced, and not everybody’s the same – it’s more complicated than that.” Just like you’re saying, that was a very interesting thing that came out of this research. And I understand it’s one study, with all the caveats that go along with that. But there was a lot in here about metacognitive beard-stroking and how that correlates. And it seems like they did identify a spectrum of receptivity. But the spectrum is not a spectrum of smart and stupid.
Barbara: No, not at all. It’s more of a spiritualness, I would think. That’s the way I think about it. I think of profoundness as being kind of this spiritual feeling, or a view of the world – the way you want to view the world, anyway. And it would make sense that it would correlate with those things.
David: My thought on this is – it’s like people who are really into the pyramids. Some people are into the pyramids because they think they have energies and powers that transcend dimensions or something. And then other people are really into the pyramids because maybe aliens built them, and there was a conspiracy covering it all up. It’s one of those things where you can say, “Yeah, people who believe pyramid stuff are wacky.” But there’s 2 different kinds of belief. There’s 2 different kinds of thinking.
Barbara: Yes, yeah. They’re very different, and the way that people analyze the information is very different. One is about interpreting sensory input in a vague, confirmatory way. And the other is – I don’t know – it’s very numbers-based, in my opinion. It feels very numbers-based and more cognitive. But they all think they’re being smart. The people who believe in conspiracies and aliens and things like that think they’re being academically smart.
David: Yeah, yeah. They think they’re like – they’re just being, they’re just–
Barbara: Science actually says this.
David: Yeah, yeah right.
Barbara: And the other ones are saying, “Screw the science.”
David: “Yeah, yeah what does science know? Science – every 5 years, they say something else is bad for you.”