In this episode of the You Are Not So Smart Podcast, we sit down with four experts on human behavior to try to understand how wearing masks during the COVID-19 pandemic became politicized.

In the show, we take a deep dive into tribal psychology, which, in essence, says that humans are motivated reasoners who alter how they think, feel, and behave when thinking, feeling, and behaving in certain ways might upset their peers.

At times, since belonging goals are so vital to our survival, we value signaling that we are good members of our tribes much more than we value being correct, and in those circumstances we will choose to be wrong, so long as signaling that we believe wrong things seems likely to keep us in good standing with our peers.

In this episode, we sit down with Dr. Gleb Tsipursky, a behavioral economist and disaster avoidance expert who works with businesses and organizations to help them de-bias themselves.

In the show, we will go through a few of the most-common cognitive biases and learn how to de-bias ourselves within our personal lives to improve relationships and our overall wellbeing.

In 1835, at a tavern in Bavaria, a group of 120 people met to drink from a randomized assortment of glass vials.

Before shuffling them, they divided the vials into two sets. One contained distilled water from a recent snowfall; the other contained a solution made by dissolving a grain of salt in 100 drops of that water, then diluting one drop of the result into another 100 drops, again and again, 30 times in all.
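The arithmetic behind that dilution is easy to check. A minimal sketch (the figures come from the description above; the variable names are illustrative):

```python
# One part in 100, repeated 30 times -- the "30C" dilution described above.
dilution_per_step = 1 / 100
steps = 30

final_concentration = dilution_per_step ** steps  # roughly 1e-60

# For scale: a mole of a substance contains about 6.022e23 molecules,
# so at one part in 10**60, a vial almost certainly holds no salt at all.
print(f"Final concentration: about {final_concentration:.0e}")
```

In other words, the "treated" vials were chemically indistinguishable from plain water, which is exactly why a blinded comparison could settle the question.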

They did this to test out a new idea in medicine called homeopathy, but it was the way they did it that changed things forever. By testing options A and B at the same time, but without telling sick people which option they would be getting, they not only debunked a questionable medical practice, they invented modern science and medicine.

About 200 years later a company in California tried something similar. A group of 700,000 people gathered inside a virtual tavern to share news and photos and stories both happy and sad. The company then used some trickery so that some people randomly encountered more happy things and others more sad things.

They did this to test out a new idea in networking called emotional contagion, but it was the way they did it that changed how many people felt about gathering online. By testing options A and B at the same time, but without telling people which option they would be getting, they not only learned whether a computer program could make its users happier or sadder, they created a backlash that resulted in a large-scale, worldwide panic.

Though we always learn something new when we perform an A/B test, we don’t always support the pursuit of that knowledge, which is strange, because without A/B testing we have to live with whatever option the world delivers to us, be it through chance or design. Should we use cancer drug A or B? Should we try gun control policy A or B? Should we try education technique A or B? It seems like our reaction to these questions would be to support testing A on half the people and B on the other half, then looking at which one works best and going with that moving forward. But as you will learn in this episode of the You Are Not So Smart Podcast, new research shows that a significant portion of the public does not feel this way, enough to cause doctors and lawmakers and educators to avoid A/B testing altogether.
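The testing logic described above can be sketched in a few lines. This is a toy simulation, not data from the episode: the success rates and group sizes are invented purely to illustrate the random-assignment idea.

```python
# Toy sketch of an A/B test: randomly assign people to two groups,
# give each group a different option, and compare outcomes.
import random

random.seed(0)  # fixed seed so the sketch is reproducible

participants = list(range(100))
random.shuffle(participants)

# Random assignment: half get option A, half get option B.
group_a = participants[:50]
group_b = participants[50:]

def outcome(option):
    # Hypothetical success rates, chosen only for illustration.
    success_rate = 0.5 if option == "A" else 0.6
    return random.random() < success_rate

successes_a = sum(outcome("A") for _ in group_a)
successes_b = sum(outcome("B") for _ in group_b)
print(f"A: {successes_a}/50 succeeded, B: {successes_b}/50 succeeded")
```

Because assignment is random and neither group knows which option it received, any sizable difference in outcomes can be attributed to the options themselves rather than to who happened to be in each group.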

Have you ever been in a classroom or a business meeting or a conference and had a question or been confused by the presentation, and when the person running the show asked, “Does anyone have any questions?” or, “Does anyone not understand?” or, “Is anyone confused?” you looked around, saw no one else raising their hands, and then chose to pass on the opportunity to clear up your confusion?

If so, then, first of all, you are a normal, fully functioning human being with a normal, fully functioning brain, because not only is this common and predictable, there’s a psychological term for why most people don’t speak up in situations like these. It’s called pluralistic ignorance.

In this episode we sit down with Chris Clearfield, author of Meltdown: Why Our Systems Fail and What We Can Do About It.

He says about his book, “By understanding what lies behind these failures, we can design better systems, make our teams more productive, and transform how we make decisions at work and at home.”