In this episode, psychologist Per Espen Stoknes discusses his book: What We Think About When We Try Not to Think About Global Warming.
In it, he describes the method he has developed for science communicators who find themselves confronted with climate change deniers who aren’t swayed by facts and charts. His book presents a series of psychology-based steps designed to painlessly change people’s minds and avoid the common mistakes scientists tend to make when explaining climate change to laypeople.
When you think about your own future health, career, finances, and even longevity, you imagine a rosy, hopeful future. For everyone else, though, your predictions tend to be far more realistic.
In other words, if you are a smoker, everyone else is going to get cancer; you'll probably be in that lucky portion who smoke into their 90s, or so you think. Similarly, the odds of success for a new restaurant change depending on who starts the venture. If it's you, the odds look pretty good. If it's someone else, they look pretty bad.
In moments of ambiguity, we think in terms of frames, narratives, and categories.
These constructs are charged with meaning, and thanks to the associative networks they ping in our brains, labels and symbols, even colors, change the ways in which we think, feel and behave without us realizing it.
As a cognitive process, leaning on psychological metaphors to make sense of the world is invisible, involuntary, and unconscious, which is why psychology is working so hard to understand it.
Confirmation bias is our tendency to seek evidence that supports our beliefs and that confirms our assumptions — when we could just as well seek disconfirmation of those beliefs and assumptions instead.
It feels like we are doing the hard work, the research required to build sound beliefs, but because confirmation is so easy to find, we stop searching the moment we think we have made sense of the world, and we can grow ever more wrong over time.
This is such a prevalent feature of human cognition that until recently a second phenomenon remained hidden in plain sight. New research suggests that something called desirability bias may be just as prevalent in our thinking.
Since our past beliefs and future desires usually match up, the desirability of an outcome is often twisted into our pursuit of confirmation like a single psychological braid — and here’s the thing: When future desires and past beliefs are incongruent, desire usually wins out.
Is psychology too WEIRD? That’s what this episode’s guest, psychologist Steven J. Heine suggested when he and his colleagues published a paper showing that psychology wasn’t the study of the human mind, but the study of one kind of human mind, the sort generated by the brains that happen to be conveniently located near the places where research is usually conducted — those of North American college undergraduates.
They called them the WEIRDest people in the world, short for Western, Educated, Industrialized, Rich, and Democratic, the kind of people who make up less than 15 percent of the world's population.
In this episode, you’ll learn why it took psychology so long to figure out it was studying outliers, and what it means for the future of the science.
In psychology, they call it naive realism, the tendency to believe that the other side is wrong because they are misinformed, that if they knew what you knew, they would change their minds to match yours.
According to Lee Ross, co-author of the new book The Wisest One in the Room, this is the default position most humans take when processing a political opinion. When confronted with people who disagree, we tend to assume they must be misinformed. What we don't consider, however, is that maybe WE are the ones who are wrong. We never enter a debate hoping to be enlightened, only to crush our opponents.
Listen in this episode as legendary psychologist Lee Ross explains how to identify, avoid, and combat this most pernicious of cognitive mistakes.
Psychology is working on some of the hardest problems in all of science. Physics, astronomy, geology: those are easy by comparison. Understanding consciousness, willpower, ideology, and social change carries a larger-than-Large-Hadron-Collider level of difficulty, but because these ideas are more relatable than quarks, bosons, and coronal mass ejections (this is a science about our minds and selves), it's easier to create eye-catching headlines and, well, to make podcasts about them.
This is the problem. Because the system for distributing scientific findings is based on publication in journals, which themselves often depend on the interest of the general media, all the biases that come with that system and with media consumption in general now cause the sciences most interesting to the public to become tainted by that interest.