This is the transcript for episode 122 of the You Are Not So Smart Podcast.
In this episode, we spend time with political scientist Lilliana Mason and psychologist Dan Kahan, two researchers exploring how our tribal tendencies are scrambling public discourse and derailing so many of our best efforts at progress — from science communication, to elections, to our ability to converge on the truth and go about the grind of building a better democracy.
Lilliana Mason is a professor of Government and Politics at the University of Maryland, where she researches partisan identity, partisan bias, social sorting, and American social polarization. She is the author of Uncivil Agreement: How Politics Became Our Identity, and her work has been featured in the New York Times, the Washington Post, CNN, and National Public Radio.
Dan Kahan is a professor of law and psychology at Yale Law School, where he studies risk perception, criminal law, science communication, and the application of decision science to law and policymaking. Today he is a member of the Cultural Cognition Project, a team of scholars “who use empirical methods to examine the impact of group values on perceptions of risk and related facts.”
David McRaney: In the 1970s, a psychologist named Henri Tajfel developed something called social identity theory, which basically said that when we define ourselves, we do so in large part by asserting our loyalty to the groups to which we belong. Tajfel developed this theory when, in his research, he discovered it didn’t take very much for humans to organize themselves into groups, and once they did, they immediately began to act like assholes to people who were in groups that they were not. Tajfel’s experiments showed that humans can enter into us-versus-them thinking in seconds, and they will do so over just about anything.
Lilliana Mason: “Through a number of experiments and decades of research, what he eventually discovered was that, first of all, the more intense the conflict is, the more you think of your competitor as an outgroup member, not as an individual.”
David McRaney: That’s political scientist Lilliana Mason.
Lilliana Mason: I’m Lilliana Mason. I am a political scientist in the Department of Government and Politics at the University of Maryland, College Park, and I specialize in political psychology and American political behavior.
David McRaney: Well you must be very busy then.
Lilliana Mason: Yeah, yeah, I know. Some people keep saying, “Isn’t this an exciting time for you? It must be such an exciting time for you.” And someone recently used the analogy: it’s like when there’s an Ebola outbreak and people tell medical doctors, “Wow, oh my god, what an exciting time for you!”
David McRaney: As Mason mentioned, Tajfel’s social identity theory posits that not only do people form groups very easily, but once they do, once their individual identity is tied up with their group membership, they tend to see people who aren’t in those groups as lesser-than in some way.
Lilliana Mason: So this is a crucial part of the theory. As conflict increases between groups, it’s much harder for you as an individual to think of anyone who you’re competing against as an individual person who may have good characteristics and qualities and good things about them, and flaws. Instead you just think, “You’re them, and therefore our conversation is over.”
David McRaney: Them. It’s a powerful word. And the research in both psychology and neuroscience says that because our identities have so much to do with group loyalty, thinking in terms of us versus them is an essential property of the human brain, and once activated, once stimulated, we can’t help but think in a tribal way.
Lilliana Mason: It’s a very natural and kind of primal psychological response. It’s not anybody’s fault; it’s just the way that our brains work.
David McRaney: Tajfel grew up in Poland, and as a Jewish man he had become fascinated with answering the question of how one group could hate another group so much that perpetrating genocide could come to seem reasonable, could become reality. He studied prejudice throughout the 1950s, and at the time the assumption across most of social psychology was that war-like animosity between groups was based on aggressive personalities rising to power, or a history of conflict over land, or grievances over a perceived slight. Tajfel was skeptical. He looked across many different examples of conflict and genocide, and he thought the differences that people claimed as the source of their hatred seemed arbitrary, rooted in meaningless categorical differences, not world-defining, ideological differences. Tajfel wondered what would happen if you could strip away every single salient difference between two groups of people so that there were no biases or prejudices between them. That is, if you made people into nothing more than a Group A and a Group B, or even stripped that out — made people just Group Red and Group Blue. And then add one small, meaningless difference at a time, like one group wears hats and the other doesn’t. At what point would people start showing animosity for the other — for them? When would they show favoritism for their ingroup and bias against their outgroup? If he could find that point, he thought, it would establish a baseline for prejudice and discrimination. But what he discovered was that there is no baseline. Any noticeable difference of any kind will reliably stimulate the behavior that flows from tribal psychology. For instance, in one of his studies, he gathered subjects from a school in Bristol, people with identical backgrounds, many of them friends, and then sorted them randomly into two groups, but he didn’t tell them that the sorting was random.
Lilliana Mason: And what they did, and this has been replicated hundreds and hundreds of times, what they did is actually just tell people in a lab… one of them was this page full of dots, and they asked them how many dots there are, and then randomly people were told: either they’re overestimators or underestimators.
David McRaney: Each person looked at a page of 40 dots for half a second and then estimated how many they had seen. Now, Tajfel didn’t really record their answers; it was a coin toss that determined whether you were told you were an overestimator or an underestimator. But with that, you became a member of the overestimator group or the underestimator group.
Lilliana Mason: The person is alone at this point. There are no other people around.
David McRaney: So, alone and now sorted into a group, each subject was then asked by Tajfel, since they were already there, if they could help his team with another experiment. He said he wanted to see how people made difficult choices, and among the choices he presented were these allocations, splits of a given amount of money that would go to other participants in other experiments. For each amount, the money would not be split evenly, but a little bit more would go to one group than the other. The twist was that the groups in these other experiments completing tasks to earn rewards were labeled as overestimators or underestimators. What Tajfel found was that even though there was nothing binding overestimators or underestimators to those groups, nothing binding those people to any sort of group identity other than those labels — I mean, they had just been told they were in one or the other moments ago — and even though they weren’t getting any money themselves, subjects tended to split the money in favor of their own group every time, even when that meant the people in the other experiment would receive less money overall, including members of their own group.
Lilliana Mason: Essentially, what people did was, not only did they give more money to their own group than they did to the other group, but they were given a choice in some of these experiments between, let’s say, everybody gets five dollars — the underestimators and the overestimators, everybody gets five dollars — or your group gets four but the other group gets three. So essentially they’re choosing between the greater good of everybody and their group’s victory at a cost.
David McRaney: This was a shocking finding for Tajfel, and he repeated this experiment in a number of different ways, and he found the same thing every time. For instance, if people looked at paintings by two abstract artists who had similar styles, and then subjects were told they were in a group of people who liked Artist A or in a group that liked Artist B, people would do all sorts of things in follow-up experiments that showed favoritism for people who liked the same paintings they did, and they showed bias against people who didn’t like those paintings. Going even simpler, Tajfel told people ahead of time they would be sorted randomly into groups with names like Group 40 and Group 15, and still, knowing they had been randomly sorted, people exhibited favoritism toward their imaginary ingroup and bias against their imaginary outgroup.
After many experiments built on these foundations, social psychologists found that there is simply no salient, shared quality around which opposing groups will not form. And once they do form, people in those groups immediately begin exhibiting tribal favoritism, tribal signaling, tribal bias, and so on. And that’s why this is called the minimal group paradigm. Humans not only instinctively form groups, they will form them over anything — no matter how arbitrary or minimal or meaningless.
One of the amazing takeaways from this work is that the origin of opposing tribes and groups and parties and any clustering that leads to partisanship is often something random and out of the individual member’s control: where they were born, the religions they inherited, the schools they attended, the food they eat or don’t eat. And these starting conditions are like specks of dirt around which cultural pearls form.
Whatever political disagreements two groups may have, whatever tribal disputes they may have, those battles might not really be about the issue in question. Instead, the opposing positions are often merely justifications for the us-versus-them emotions that naturally flow from group formation. “Us” is a kind of cognitive glue, a biological adaptation to encourage people to work together, but as soon as we group up we begin expressing our baked-in tribal psychology. And so the counterpart to “us” begins to become inflamed — “them.” And as Tajfel’s research found, the first kernel of us-versus-them is ensuring that your group is better off than outsiders, even if that diminishes your own potential well-being.
Lilliana Mason: In order to have society, you have to have some reason for membership. There have to be some rules for being a part of a group of people, and if you’re part of a group of people, that group has to have boundaries. If it doesn’t have boundaries, then you’re not really a group; you’re just everybody. And there’s a scholar, Marilynn Brewer, who basically defined this by saying we have a psychological need for both inclusion and exclusion. So, we need to feel that we are part of something, but we also need to feel that not just anybody can be part of it, in order for us to feel important ourselves. We need to feel like we’re included in some group, and that there are outsiders, that there are some people who don’t get in. So there’s something that makes you special, and the underlying idea behind social identity is really self-esteem based. So we need to be part of groups to enhance our own self-esteem, to feel good about ourselves as individuals. We have to feel that we’re being accepted by the group, and that the group has status. And as long as that group has status, we feel good about ourselves. When the status of that group is threatened, we start feeling bad, and then we have to do things to improve the status of the group. That’s why, when there’s a conflict between groups, we start fighting a lot harder, because the conflict between groups is really a fight for your own self-esteem and sense of worth.
David McRaney: If you need an evolutionary explanation for this there are several to choose from. All of them have their champions and detractors. The one that I like goes like this:
Our ancestors evolved among the trees. We ate leaves, and since many mature leaves are toxic, and primates hadn’t developed the digestive power to deal with them, we developed a behavioral solution instead, which was to stick to the trees with the youngest leaves. Now, depending on young leaves as a food source requires a large territory, and so leaf-eating primates became very territorial. Since we already lived in troops, and since there aren’t enough young leaves to go around in a single territory, that territorial disposition led to tribal behavior being favored by natural selection. It gave our genes an advantage — first over the other animals and then over the other groups of our own species — and defending against your own kind requires kinship, and that requires identification. And so we evolved the ability to tell our group members from others through what back in the day might have been smells and hoots and barks. Today, well, we signal tribal identity in all sorts of ways: on Facebook with memes, on Twitter with outrage retweets, on Medium with hot takes about wedge issues, and so on.
The weird thing about this, and the reason we’re talking about it in this episode, is that Tajfel’s work shows that literally anything can become a tribal badge. And that means things that should be evidence-based — like, does tighter gun control lead to fewer gun deaths? — often leave the realm of evidence and enter the realm of tribal signaling. The power of modern media and modern social media has allowed humans to signal their tribal loyalties on a scale that has never before been possible, and this one thing might just be what is driving polarization.
Lilliana Mason: There is a traditional version of political polarization which is focused entirely on the extremity of our attitudes and how much we disagree. That has been, not replaced, but there is sort of a new version of polarization that political scientists have been talking about, which is variously called affective polarization or social polarization, and which basically doesn’t take into account how much we disagree at all. It focuses entirely on whether or not we like each other as partisans, or whether or not we can get along with each other as partisans. So that’s the type of polarization that I study.
David McRaney: Just in case you’re not aware, according to the Pew Research Center, Republicans and Democrats in the United States are more divided along ideological lines than at any point in the last few decades. Many Western countries are abandoning their unified cultural identities and dividing themselves into two mental tribes.
In the United States, after the Civil War, if you self-identified as a conservative, you still tended to hold a lot of liberal values. And if you were a liberal, you still tended to hold some conservative values. But the trend since the 1970s has been a steady clustering of our personal realities into two homogenized camps. In 1994, on a graph, the median positions on the left and the right were almost side by side. But by 2014, the medians had split and moved away from each other like a cell dividing. Today, 92 percent of Republicans are to the right of the median Democrat, and 94 percent of Democrats are to the left of the median Republican.
And this has led to a debate in political science over what exactly is happening here. Is this skinny strip in the middle between left and right holding an ever-smaller group of moderates, or are we seeing this whole thing through the wrong sort of lens? Some scientists, like Mason, are skeptical that the population of moderates is dwindling. Instead of there being fewer people who wait to see what elites and neighbors have to say about complex issues before they commit, instead of people becoming increasingly extreme in their opinions to the point that they are disagreeing all the time, they say there’s something else at play.
Lilliana Mason: What my research has essentially done is jump into that debate and say we’re not talking about the right thing. We can be polarized, but it doesn’t have to be because we disagree. There are always psychological reasons that people hate each other that have nothing to do with disagreement.
David McRaney: That’s right. Mason is saying that our disagreements over the issues often have nothing to do with the issues themselves. Since we tend to form tribes very easily, and often around differences that are arbitrary, and since we are usually more motivated by tribal psychology than by anything else, what is happening is that more and more issues are simply leaving the realm of compromise and debate, of evidence and rational analysis, and becoming mutated by politicization, by tribal signaling. And once an issue becomes politicized, it just leaves the realm of facts and figures — it just becomes another way to tell us from them.
Lilliana Mason: So the sort-of long story short is: that’s the psychological motivation. There is just this natural thing that’s driving people to discriminate against their outgroup, regardless of content — even when it is a meaningless group. Which means that technically it’s possible for people to not disagree about anything and still discriminate against the people that they are in competition with. That’s what I started with when I started my research — that assumption — that it should be possible for people to agree on stuff and still not like each other based on identity alone, and I actually did find evidence for that.
David McRaney: We are living in an age in which it has become easier than ever to transmute an evidence-based issue into a political one, and once that happens, the desire to be correct becomes far less important than the desire to be a good member of your tribe.
Research shows that the more intense conflict becomes between two groups, the more people begin to see members of the opposition as members of a group, instead of as individuals. And Pew says that about 20 percent of the public is so extremely hardcore liberal or so insanely staunchly conservative that they say they would never, ever, ever vote for anyone supporting an idea from the other side.
Pew’s research found that people in those groups wouldn’t even welcome someone from the other side into their families, and frankly they don’t want to live near them either. Twenty-seven percent of Democrats said they see the Republican Party as a threat to the nation’s well-being, and 36 percent of Republicans said the same thing about Democrats.
And there are some good explanations for this in the social science. But the strange downstream consequence of forming groups so easily and then immediately engaging in favoritism and bias based on group membership is that once one’s identity is defined by the tribe to which he or she belongs, compromise and agreement on policies and laws and decisions and judgments and notions of what is and is not true will naturally become more and more difficult as our ability to signal to others which tribes we belong to increases — and that is what we’re going to talk about after this break. I’m David McRaney, and this is the You Are Not So Smart podcast.
SEGMENT TWO
David McRaney: In June of 2017, former FBI Director James B. Comey sat down to testify before the Senate Intelligence Committee in one of the most anticipated congressional hearings in decades.
AUDIO OF HEARING
David McRaney: This testimony was important for people aligned with both political parties, because Comey had been fired a month earlier by President Donald Trump. So people on the left, many of them, believed it was because Comey was investigating possible connections between the Trump campaign and Russian attempts to interfere in the U.S. presidential election. But on the right, people thought that it was an unjust witch hunt by sore losers.
AUDIO OF HEARING
David McRaney: The buildup to this testimony was intense. There were months of vitriol between Trump supporters and Clinton supporters, and it had turned the whole thing into a sort of warped, shared fever dream of partisan acrimony — and it seemed to people on both sides that the Comey hearing would be the moment that the fever finally broke and reality rushed back into American life, like a cool, damp towel on our collective forehead. All across the world, people tuned in to watch this live. All across the Internet, social media burned white hot in anticipation, and on every timeline people posted a running commentary of his every answer, every twitch and gesture. Every word he uttered went out over live video on television and the Internet, and the moment he was done speaking, those videos went up online for everyone to rewatch, and the transcripts went up shortly after, and soon the nation was scrutinizing every nuance of his time before the Senate. So there could be no doubt about what he actually said — except there was.
AUDIO OF PUNDITS DISAGREEING OVER WHAT THE HEARING PROVED.
David McRaney: A few days out, Reddit user “it’s not the network” summed up the sentiment of everyone watching all this coverage. This user wrote: “I learned three things from watching all the TV coverage, Twitter coverage, Facebook pages, and Reddit coverage. One — the Republicans think Comey’s testimony exonerates the president. Two — the Democrats think Comey’s testimony condemns the president. And three — politics makes me want to shoot myself in the face with a bazooka.” Everyone heard what Comey said. Everyone had the same facts, and yet everyone heard him say something different.
And it’s moments like these that have led to that feeling of epistemic dread all across American politics and world politics — the idea that facts just don’t work on people. Politicians are openly lying about things that people can confirm with one Google search on the phone in their pocket, and yet their supporters don’t seem to care. Once Facebook realized that Russian trolls were seeding conservative news feeds with weaponized clickbait, Apple CEO Tim Cook told the world that fake news was, quote, “killing people’s minds,” and then that term — fake news — mutated from a rebranded way of talking about propaganda into a label for anything that people wished wasn’t true. And then came this Comey hearing, and every news story for or against one side or the other led to a similar outcome: same facts, different realities. And to top off the hysteria, in late 2016 the Oxford University Press dictionary named “post-truth” its international word of the year, citing a 2,000 percent increase in its usage during coverage of Brexit and the U.S. presidential election. Commenting on that announcement, the Washington Post lamented, “It’s official: Truth is dead. Facts are passé.”
This idea that facts alone can save the world is a very old misconception. We’ve talked about this a few times on the show. The 19th-century rationalist philosophers thought that public education would enhance democracy by eliminating all superstitions. Benjamin Franklin thought public libraries would make the common man as educated as the aristocracy and thus empower the public to vote for its best interests. Cyberpunk psychologist Timothy Leary thought that computers and the Internet would give people “power to the pupil” and remove the need for information gatekeepers. In each case the dream was that one day we would all have access to all the same facts, and then, naturally, we would all agree on what those facts meant. In science communication this is called the information deficit model, and it’s often argued as a solution to just about everything wrong with democracy. The model is simple: the reason people are wrong is because they don’t have all the facts; give them the facts, and they will change their minds. It springs from the misconception that anyone who reads the things that you have read or sees the things that you have seen will interpret them the way you interpret them. In psychology that idea is called naive realism, and it’s the reason so many fact-based professionals assumed that to end partisanship and political polarization all we need to do is pump as many facts as possible into the public square. Ironically, the fact that this has never worked has yet to alter the belief that one day it will.
That’s not to say that facts never change people’s minds. Some incorrect beliefs can be changed with facts alone, with evidence. For those kinds of beliefs, like “It’s going to rain on Sunday,” when we learn new information, we update our priors using something called Bayesian reasoning. Basically, to think like a Bayesian is to imagine your beliefs as a percentage of confidence instead of as simply true or false. So instead of saying, “I believe my hamster is alive and well,” you would say, “I am 70 percent sure that my hamster is alive and well based on the evidence available to me at this time.” If we were motivated by the pursuit of accuracy above all else, Bayesian reasoning would be how we updated all of our beliefs, but we aren’t, and it isn’t.
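To make that updating step concrete, here is a minimal sketch of a single Bayesian update in Python. The hamster scenario and every number in it are hypothetical, chosen only to illustrate Bayes’ rule; none of the figures come from the episode.

```python
# A minimal Bayesian update: a belief held as a probability, revised by
# evidence. All numbers here are hypothetical, picked for illustration.

def bayesian_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(belief | evidence) via Bayes' rule.

    prior:               P(belief) before seeing the evidence
    p_evidence_if_true:  P(evidence | belief is true)
    p_evidence_if_false: P(evidence | belief is false)
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# "I am 70 percent sure my hamster is alive and well."
prior = 0.70

# New evidence: the food bowl is untouched this morning. Suppose a
# healthy hamster leaves the bowl untouched 20 percent of the time,
# versus 90 percent of the time if something is wrong.
posterior = bayesian_update(prior, p_evidence_if_true=0.20,
                            p_evidence_if_false=0.90)

print(f"Updated confidence the hamster is fine: {posterior:.0%}")  # ~34%
```

Run forward, the same rule applies to each new piece of evidence: yesterday’s posterior becomes today’s prior, which is all “updating your priors” means.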
When psychologist Brian J. Gaines and his colleagues tracked the fact-based beliefs of both Democrats and Republicans during the Iraq war, they found that over the course of that war both sides did indeed update their beliefs as new evidence emerged — so much so that within the first few years, Democrats and Republicans had converged on what was and what was not factually accurate. For instance, when weapons of mass destruction weren’t found, there was no debate over whether that was or was not true, not after, sort of, the craziness of all of it died down. When troop and civilian deaths began to climb — again, no disagreement on those numbers. But as the groups converged on the facts, they began to diverge on their interpretations. Republicans inferred that the weapons must have been moved or destroyed. Democrats became convinced they never existed in the first place. Republicans said the troop casualties were low for this kind of conflict, and Democrats said they were shockingly high. Their beliefs updated, no problem, taking into account new facts, replacing misconceptions, plugging holes in their knowledge. But their attitudes and allegiances remained fixed. Their opinions for or against the war remained unchanged. And so their interpretations strongly differed.
Now, this was not a new finding in psychology, because one of the first studies into perception had people from Dartmouth and Princeton rewatch a violent and contentious football game in which both sides racked up injuries. Psychologists at both universities noted that the college newspapers at each institution claimed that the other school was to blame for the overall unsportsmanlike conduct and brutality. Psychologists Albert Hastorf at Dartmouth and Hadley Cantril at Princeton wondered if people had rewritten history in their memories, and so they brought students from both schools into the lab and had them watch the game again. Cantril and Hastorf were surprised to find that memory had nothing to do with it. Immediately after watching the exact same film of the exact same events, people reported that they had seen two different realities unfold. Ninety percent of Princeton students reported that Dartmouth had started the violence, and the Dartmouth students disagreed and marked half as many infractions as the Princeton students did. Same film, same game, same facts, different beliefs.
The Comey hearing was held up by many as an example of how we are now living in a post-truth world. But to suggest this implies there is some truth-filled paradise from which we’ve strayed. The truth is that we have always been motivated reasoners, interpreting facts in a way that best meets our needs and our goals, and our goals are not always the pursuit of the truth. In professional domains like medicine, science, academia, and journalism, people are trained to pursue accuracy, to operate within a framework that helps them overcome other motivations. But we are not always motivated by such empirically lofty goals. Outside of fact-based professions, we are often more motivated by other goals, like being a good member of our tribe, or maintaining a cohesive identity, or keeping our jobs or our bonds with our family or our church. And being wrong about climate change or the moon landing, or having a skewed interpretation of a political concept — well, to reach those kinds of goals, that is an acceptable price to pay.
Dan Kahan: But as far as whether we’re in the post-truth era, I’d like to know when we were in the truth era.
David McRaney: That’s Yale psychology and law professor Dan Kahan. He studies how our perceptions are affected by the groups to which we belong.
Dan Kahan: There are these issues that become fused with identity, so that the positions actually are like badges of membership in, and vouchers of loyalty to, these identity-defining affinity groups. Where you have an issue like that, the individual interest in conforming to the view of the group is going to dominate by far the interest the person has in getting the right answer.
David McRaney: In other words, he studies tribal psychology, also called cultural cognition — the well-documented, much-researched tendency of humans to make sense of what should be empirical, fact-based matters, from global warming to the death penalty to whether Donald Trump obstructed justice, not by carefully considering the evidence and coming to logical conclusions, but by “conforming their perceptions and interpretations to the values that define their cultural identities.”
Dan Kahan: What someone believes about climate change, for example, doesn’t affect the climate at all, and nothing they do affects it either, because they’re just not consequential enough as an individual. Their carbon footprint is going to be minuscule. They’re not going to be the voter who breaks what would otherwise be a tie in some national referendum on whether climate change exists and whether we should do something about it, or even in an election for one of their representatives. If they try to argue with people, they’ll probably have the experience that all of us have had, where nothing happens. But even if they were convincing people, it wouldn’t be enough to have a big effect on the issue.
David McRaney: Though Kahan says he’s sure there’s some evolutionary explanation for this, he doesn’t feel like there’s any need to dig that deeply into human psychology to understand why a person would be more motivated by tribal loyalty than by avoiding incorrect assumptions.
Dan Kahan: Anything a person believes, or any mistake a person makes about the science, in any of those domains — it’s not going to increase the risk that that person or anybody else that she cares about faces, just because, as an individual, an ordinary member of the public, their views are too inconsequential. But given that these issues have assumed this kind of status as a badge of membership and a kind of voucher of loyalty to an identity-defining group, if you make a mistake inside your peer group when people are actually attending to that issue — Can I trust you? Do you have the right values? — then you could really suffer serious material and emotional harm.
We all know this story about Bob Inglis, the most conservative member of Congress, who got turned out in a primary after he said he believed in climate change and wanted to do something about it to protect his constituents. Well, if you’re, you know, Floyd the Barber in the 5th District of South Carolina, where he was the representative, and after you get done giving somebody a shave with a straight razor or whatever it is, you say, “Sign my petition on saving the polar bears from climate change,” you’ll be out of a job as quickly as he was. And people face that kind of pressure all the time. And it’s something that people recognize — that they ought to be conveying to their peers the kind of signal about who they are and whose side they’re on that helps their lives go better.
David McRaney: Dan and his team at Yale call this form of cultural cognition “politically motivated reasoning.” In one of his studies into this phenomenon, he had people who self-identified as Democrats or Republicans take a look at a climate expert’s credentials. They saw the image of a man named Robert Linden, a very professional, academic-looking older gentleman, and they read that he was a professor of meteorology at MIT, that he had earned his doctorate at Harvard University, and they then learned he was a member of the American Meteorological Society and the National Academy of Sciences.
When Kahan asked these subjects if this gentleman was indeed an expert on global warming, everyone agreed that he was. Then people on both sides received one of two statements supposedly made by this expert which, in truth, were manipulations created by the scientists. For some people, he said that the research had led him to conclude that global warming was real and human-caused. For the others, he said his research led him to conclude that global warming was not real and humans were not causing anything bad to happen to the environment.
What Kahan found was that when Republicans heard that Robert Linden, professor of meteorology at MIT, believed that climate change was real and human-caused, they no longer saw him as an expert. Now they said it was clear he was a quack. Likewise, if Democrats learned Robert Linden, professor of meteorology at MIT, did not believe global warming was real, they too no longer saw him as an expert. They too saw him as a quack. Only when Robert Linden’s position matched that of their affinity group, their tribe, did Robert Linden remain, in their minds, an expert.
After conducting many studies like this, Kahan and his team concluded that if individuals are members of groups that have become polarized about a particular issue, and that polarization puts the group’s opinions at odds with scientific consensus, people will almost always go with what their group believes over what the preponderance of the evidence suggests. In another of Kahan’s studies, people were told they would be making sense of the raw numerical results of a separate bit of research that tested the effectiveness of a skin cream. Now, the results were fake, and for half the subjects the cream was shown to be effective, and for the other half it was shown that it wasn’t. Kahan found that the better subjects were at math, no matter their politics, the better they performed when it came to determining the effectiveness of the cream. But when those exact same numerical results were relabeled, and subjects were told the research tested the effectiveness of gun control, the better subjects were at math, the worse they performed — but only if the political party they belonged to was openly opposed to what the numbers suggested. If the results suggested that gun control was effective, Republicans who were good at math became bad at math. If the results showed gun control was ineffective, Democrats who were good at math became bad at math. If their party favored the results, then once again math skills alone determined the subjects’ performance, the same as when the exact same results supposedly measured the effectiveness of skin cream. Kahan says that the better you are with numbers, the better you are at manipulating them to protect your identity-connected, and in this case politically motivated, beliefs. Of course, in the study none of the subjects had any idea they were doing this. They didn’t think their tribal loyalty was affecting their math ability. They all felt they were doing their best.
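To see what the task itself demands, here is a small sketch of the 2×2 reasoning involved. The counts below are illustrative stand-ins in the spirit of Kahan’s covariance problem, not the figures from the actual study.

```python
# Hypothetical counts for a skin-cream trial, in the style of the 2x2
# covariance task described above (illustrative, not the study's data).
#
#                       rash improved   rash got worse
#   used skin cream          223              75
#   did not use cream        107              21

improved_treated, worsened_treated = 223, 75
improved_untreated, worsened_untreated = 107, 21

# The intuitive (wrong) move: compare raw counts in the top row.
# 223 > 75, so the cream "works."
# The correct move: compare the rate of improvement in each row.
rate_treated = improved_treated / (improved_treated + worsened_treated)
rate_untreated = improved_untreated / (improved_untreated + worsened_untreated)

print(f"Improved with cream:    {rate_treated:.0%}")    # ~75%
print(f"Improved without cream: {rate_untreated:.0%}")  # ~84%

# The untreated group actually did better. Spotting that takes ratio
# reasoning -- the very skill high-numeracy subjects abandoned when the
# same table was relabeled as gun-control data that cut against their party.
```

With these numbers, the cream looks helpful if you only scan the biggest cell, and harmful once you compare rates, which is exactly the trap the task is built around.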
Now here’s one more example. Remember all that handwringing about economic insecurity in red states as a political motivation to vote one way or the other? A recent analysis by behavioral economist Peter Atwater found that almost all of that economic insecurity has evaporated since Trump became president, despite the fact that nothing has changed economically in those places where it was once a supposedly major concern. This suggests that people’s political behavior was driven by tribal psychology, like it usually is, but justified by whatever seemed salient at the time, like it usually is. Once their “tribal mood,” as he put it, improved, so did their feelings about the economy. And all of this points to one essential truth: Kahan says that the evidence is clear that humans value being a good member of their tribe much more than they value being correct. We will choose to be wrong if it keeps us in good standing with our peers.
Dan Kahan: There’s no contest between forming the identity-affirming belief and going with what science knows. So, we shouldn’t put people in the position of having to choose between those things. If everybody’s forming their views that way, we’re really screwed, because then, as a diverse society, we’re less likely to converge on the best evidence about the threats that we face. So, that’s a kind of pathology, both in being rare and in being dangerous.
David McRaney: Lilliana Mason’s research into the causes of political polarization has also revealed something similar.
Lilliana Mason: We gave people threats to their party’s cherished positions, so we didn’t mention the party at all. For Democrats, they read something like, abortion is no longer going to be legal anywhere in the country. For Republicans, people are going to be able to marry whomever they want to marry and get health care. So we threatened them with their cherished positions not being done the way they wanted them to be done. And when we threatened people’s issue positions, they got much less angry. Their attachment to their partisan groups did not increase. And their dislike for their political outgroups didn’t increase as much as it did when we threatened their party. So we actually tested: What’s actually making you mad? What is actually getting you all worked up in politics? Is it really about these issues that you say it is, that you actually think it is? We all think we have these very logical thought processes, and that we have these very thoughtful attitudes that we’ve come to through a long period of reasoning. But in fact, when you threaten their policy positions, people don’t get as mad or worked up or ready to participate in politics as they do when you threaten just their party itself. So it’s really the group that’s driving people’s emotional responses and activism.
David McRaney: Now, this is something we talk about a lot on this show. In fact, it’s the foundation of everything you are not so smart about. We are unaware of how unaware we are, yet we proceed with confidence in the false assumption that we are fully aware of our motivations and the sources of our thoughts, feelings, and emotions. In fact, much of the time, if not most of the time, the true source of those things, the true motivation behind our behaviors, is invisible or unknowable, or, in the case of tribal psychology, something we’d rather not believe about ourselves. None of us wants to think that we are simply parroting the perspectives of elites or going along with the attitudes of our tribes, but the work of Dan Kahan and Lilliana Mason and many others suggests that for many issues that is exactly what is happening.
Lilliana Mason: There are a lot of scientists, and Geoffrey Cohen — this is my favorite experiment that was ever done — he gave people a position on welfare and experimentally altered it so that either the Republicans or the Democrats were said to be saying basically the same thing on welfare. It’s an unknown issue, right? It’s welfare. And what he found was that he could get people to change their position on welfare, 100 percent, all the way to the other side of the policy spectrum, just based on what party they were told supported that position. And the crazy thing is that after they said they supported that position, he asked them why they supported it, and they didn’t say, “Because my party does.” They came up with other reasons. So, after being experimentally induced into holding a position that they actually didn’t agree with, they then came up with reasons why they thought they agreed with it.
David McRaney: Once an evidence-based issue has become politicized, we will choose to be wrong if it keeps us in good standing with our cultural peers. In Kahan’s studies where people are asked to demonstrate their knowledge about evolution, if they are incentivized to get the correct answers as best they can, for instance with monetary rewards, their scores correlate with their level of education. But without those incentives, the scores correlate with their religiosity instead.
Dan Kahan: When they’re being asked about those things, they’re not telling you what they know. They are telling you who they are.
David McRaney: Now, Kahan said there is something that his research tells all of us, something that we should all know, that maybe we haven’t accepted yet, and it’s that literally any evidence-based issue can become politicized.
Dan Kahan: We need more information about how that happens. If I were going to explain it to you, I might start with some other issue besides climate change, just because I think climate change is one of those things where the only model that explains the opinion is as big as the opinion itself. But we can take other issues where I think I could show you how it was that the issue became transformed like that.
The one I think is the most instructive as a case study is the HPV vaccine, for the human papillomavirus. It’s an extremely common sexually transmitted disease. Upwards of 75 percent of sexually active people in their 20s or early 30s are going to have been exposed to the virus. And it’s not only the leading cause, but probably the only cause, of cervical cancer. It kills, in the United States, about 3,000 women a year. The vaccine was introduced as one to be given to middle-school girls. People know the story. Somebody comes knocking on the door and they say, “Hey, you know your daughter in the backyard over there on the swings, the 12-year-old who is going to be having sex next year? She needs to get an STD shot, or don’t bring her to school.” And they think it’s just obvious there’s going to be a culture conflict on that. The reason to doubt that, though, is that at the same time that we were fighting about the HPV vaccine, the acceptance — the vaccination rate — for the HBV vaccine, hepatitis B, which is also a sexually transmitted disease, was at 95 percent. Nobody was arguing about that one, even though it came just a couple of years before. The difference is that people learned about the HBV vaccine from their doctors. It wasn’t politicized. The HPV vaccine, however, they learned about probably by watching MSNBC and Fox News, where the message was, it’s us versus them again. That occurred because the manufacturer took a very unorthodox route to try to introduce the vaccine.
David McRaney: The makers of the HPV vaccine sought early approval, and they also sought to make it mandatory. Now, early approval means debate in Congress. Mandatory means debate in state legislatures. Both meant that people with zero scientific knowledge raised questions about why this was a mandatory vaccine for girls instead of boys. The public then first learned about the HPV vaccine by watching reports on MSNBC and Fox News, where the message was framed as a moral issue, which made it an us-versus-them issue, which made it a tribal issue.
Dan Kahan: And anything that’s before the legislature is just kind of raw meat for the conflict-entrepreneur groups on both sides of these issues, the right and left. And it turned into a question of: Whose side are you on, and who are you? And it just blew up in everybody’s face. So that was a decision to take an issue that normally travels down this path where people are able to recognize what science knows regardless of their identities, and put it right on the track to become one of those sad issues where we have this tension between being who you are and knowing what’s known by science.
David McRaney: Now people who are gladly allowing their children to get the HBV vaccine are opposed, completely opposed, to the nearly identically administered HPV vaccine. Now, it seems nonsensical, but again, being a good member of your tribe is more important than holding correct views, and Kahan says that the very same thing can happen to anything. Dark matter, volcanoes, Net Neutrality, self-driving cars. So, it’s in our best interest to keep every single scientific concept as neutral and bipartisan as possible, because once evidence is polluted by tribal loyalty, people can’t help but be wrong and stay wrong, even if 98 percent of scientists are telling them they should change their minds.
Dan Kahan: You know, there’s a real cost to having these debates about issues like climate change or evolution or the HPV vaccine conducted by people who are kind of symbols of group identity, and not only that, but symbols of contempt for the other. You know, even Bill Nye the Science Guy — he’s not convincing anybody with his arguments. All he’s doing is eliciting responses and injecting this kind of rhetoric himself into the discourse that makes it us-versus-them — and says of them, they are stupid and they’re evil. People see that, and they’re not processing the content of the arguments; they’re processing the signal that this is one of those issues where being out of line with your group could get you in a lot of trouble. We don’t need more amplification of that. We need science communication that shows people that people like them, just like them, find the science to be convincing, or are using it when they can to try to improve their lives.
David McRaney: When an evidence-based issue becomes politicized, people will cherry-pick the evidence to best argue their tribe’s position, and even if both sides agree on what the facts say, they won’t agree on what those facts mean. Tribal loyalty changes how we interpret the facts that we accept as true. And we aren’t treating tribalism as a basic human drive, but that’s what it is. You know, fast food lowered the cost of satisfying a drive, and we grew fat, and then we figured that out. Social media lowered the cost of exhibiting tribal behaviors, and so we are growing apart. But we can figure this out too. Both Lilliana Mason and Dan Kahan have suggestions as to what we ought to be doing with our current level of understanding. We’re only just getting to know how all this works, but they have some advice.
Lilliana Mason: It’s not all doom and gloom. People do have values. It’s the actual policy positions or implementations of policy that are much more manipulable. But we do have values, and we tend to use those values, attach them to a policy position, and that’s how we get into agreement with our parties — by finding a way to get our values in line with that. So we do have some guiding principles. We’re not just robots following the party around.
And I would also say that, you know, if we recognize… I remember there was a moment during the debates, the presidential debates, when Hillary Clinton said something about ingroup bias. And I think it was Mike Pence who said something like, “How dare you accuse us of being biased,” and it just blew my mind. Obviously she wasn’t saying that Republicans are biased; what she was saying is that every single human being has this in them. It’s not offensive to say it; it shouldn’t be offensive to say. It’s just natural psychology. What psychologists have found, actually, with something like racial prejudice, for instance, is that we can fight it if we make ourselves aware of it, and if we don’t pretend to be insulted when we hear about it. If we think about how everybody has racial prejudice — it’s in my mind, I know it is, it’s in everyone’s mind, but I’m going to actively fight against it, I’m going to keep the knowledge that it exists in the front of my mind, and try very hard in every interaction to make sure that I am not acting on behalf of that bias — you can fight it.
And so, one of the things that I think could be done here is, if partisans admit that they have an implicit bias against their political opponents, it’s possible, every single time you see one of your political opponents, to say to yourself, “Hold on a second. I know that I am automatically reacting against this person because of their party. I’m going to remind myself that this is a human being who has a family and a favorite recipe and likes to go roller skating, or whatever it is” — give some humanizing detail to that person and actively fight it in your own mind.
Dan Kahan: Well, there are people who are genuinely interested in what science knows and find it to be fascinating. A research study that we did recently showed that people who are high in science curiosity aren’t as polarized on these issues, and they don’t display this really kind of perverse effect of becoming more and more consistent with their group’s position as their science comprehension increases. I think that’s an important thing for us to learn, that most groups, across diverse groups, have some segment of their members who actually just find science to be inspiring, and they’re kind of resources for us, in that they can give a genuine account of why they find these issues and the knowledge we have interesting, in a way that hopefully can spread to other people who are like them. Science curiosity, as a kind of potential remedy, I think, is something that we should be exploiting. But the truth is that most people on both sides of these issues don’t know anything about the underlying science. That’s not really a cause for embarrassment, because there’s way more science that you have to accept in order to live well, or just live, than you could possibly become an expert on. So you become an expert on something else, and that’s picking out what science knows — who knows what about what. And you’re mainly doing it within a group of people who are like you, because those are the people you hang out with. They’re the ones you can understand. You can separate out the people who are kind of full of it from the people who are giving you the actual right information. That very process, though, is vulnerable to the kind of political dynamics that we see. But when we use that technique of sampling from within our group to figure out the truth on these issues that have fused positions with identity, then it is going to be the case that the people on both sides are going to be essentially equally ignorant of what’s going on.
If your group has it right on what scientific consensus is, then just count yourself lucky, because you don’t understand what the scientist does in his or her own terms. It just happens that your intermediary groups managed to get you the right answer despite the assault that they’re under, and all groups have embarrassing instances where the message they got from their intermediaries was false. And I think a little humility in recognizing that might well be something that can help us try to solve these problems.
I mean, if I told the parents of teenage girls that we had found that people like them, parents, were reacting to information on the HPV vaccine in a way that was akin to the way sports fans respond to controversial calls about their team’s players, they wouldn’t think that was funny. They would think that that’s terrible. Their identity as a parent comes first. But they were doing what they usually do and what usually works, and this happened. So we should recognize that science literacy for the public is not a matter of knowing certain kinds of facts or being able to give certain kinds of accounts of how the mechanisms work. It’s that people will absorb, through a clean science communication environment, the information that we have from science, and we’re all in bad shape when that normal process gets pushed aside.
David McRaney: Kahan told me that if we want a democracy based on policies, based on facts, we must work to depoliticize evidence-based issues before they go out into the public. There are all sorts of ways we might could do this. We could maybe produce press releases that are designed to appeal to one tribe or the other and then put them out at the same time — put two kinds of releases out for all the different languages that people speak with their tribes. You know, you really need some empathy here, because people can’t change their minds when they are trapped in tribes that believe one way or the other. They can’t accept the evidence even when they want to, even when they know in their hearts that they are incorrect. The main thing he suggested was that when you are engaged in an argument and you want to share facts, try your best to only share information from sources the other person considers friendly to their tribe. Any link you share from a source they consider friendly to the other side, unfortunately, might be rejected outright.
Dan Kahan: I think we should be cautious, too, about having as our objective changing Group X’s mind. For one thing, Group X is not sealed off in some kind of chamber where they’re not seeing what we’re doing. If you read an article that says, “Boy, the views of conservatives are really bad for us. How do we change their minds?” you’ll never change their minds, because you’re treating them as the problem. It’s also wrong, because I’m the problem. You’re the problem. We all do this. So the question shouldn’t be how can we change conservatives’ minds. It should be how can we restore the science communication environment so that it works on these issues in the same way it works on other issues. And that’s a stake we have in common with the people who disagree with us about the facts. I gave the example of the HPV vaccine — the idea that this is happening to me should scare me. And so there’s plenty of room for common cause in creating a science communication environment in which we can not only form views about what science knows but can be confident that we’re doing it in a way that ordinarily leads us to the truth. And I think that that way of framing it — and I don’t think it’s a frame; I think it’s the truth. This is the truth. The assault on reason that we’re experiencing now is something we can cure by using our reason: we are going to use the methods of science to understand these issues, to prevent them from arising, and to fix them once they arise. So rather than just tell you what to do, I’d rather tell you how. And it’s by using the signature methods of science: disciplined observation and inference.