Why do you try to drain the world of color when backed into a rhetorical corner?
Why do you have such a hard time realizing that, in the heat of an argument, you have suggested the world is devoid of nuance, reducing every wavelength to black and white and every choice to A or B?
When confronted with dogma-threatening, worldview-menacing ideas, your knee-jerk response is usually to lash out and try to bat them away, but thanks to a nearly unavoidable mistake in reasoning, you often end up doing battle with arguments of your own creation.
Your lazy brain is always trying to make sense of the world on ever-simpler terms. Just as you wouldn’t use a topographical map to navigate your way to Wendy’s, you tend to navigate reality using a sort of Google Maps interpretation of events and ideas. It’s less accurate, sure, but much easier to understand when details aren’t a priority. But thanks to this heuristic habit, you sometimes create mental men of straw that stand in for the propositions put forth by people who see the world a bit differently than you. In addition to being easy to grasp, they are easy to knock down and hack apart, which wouldn’t be a problem if only you noticed the switcheroo.
If you have ever shared an opinion on the internet, you have probably been in an internet argument, and if you have been in enough internet arguments, you have likely been called out for committing a logical fallacy, and if you’ve been called out on enough logical fallacies in enough internet arguments, you may have spent some time learning how logical fallacies work, and if you have been in enough internet arguments after having learned how logical fallacies work, then you have likely committed the fallacy fallacy.
How strong is your bullshit detector? And what exactly IS the scientific definition of bullshit?
In this episode we explore both of those concepts as well as what makes a person susceptible to bullshit, how to identify and defend against it, and what kind of people are the most and least likely to be bowled over by bullshit artists and other merchants of feel-good woo.
The problem with sorting out failures and successes is that failures are often muted, destroyed, or somehow removed from sight while successes are left behind, weighting your decisions and perceptions, tilting your view of the world.
That means to be successful you must learn how to seek out what is missing. You must learn what not to do. Unfortunately, survivorship bias stands between you and the epiphanies you seek.
To learn how to combat this pernicious bias, we explore the story of Abraham Wald and the Department of War Math founded during World War II.
Our guest in this episode of the You Are Not So Smart Podcast is psychologist Laurie Santos who heads the Comparative Cognition Laboratory at Yale University. In that lab, she and her colleagues are exploring the fact that when two species share a relative on the evolutionary family tree, not only do they share similar physical features, but they also share similar behaviors. Psychologists and other scientists have used animals to study humans for a very long time, but Santos and her colleagues have taken it a step further by choosing to focus on a closer relation, the capuchin monkey; that way they could investigate subtler, more complex aspects of human decision making – like cognitive biases.
You’ve likely wondered if the internet is having a negative effect on your brain. Perhaps you’ve thought this after realizing the world wide web now serves as a trusty resource when gaps in your knowledge appear, and over time, you’ve thought, it might be making you less knowledgeable overall because you habitually head to Google when you don’t know the answer to something, search, click, read a few lines, and then promptly forget the factoid until the next time you need it.