YANSS 038 – How the Halo Effect Turns Uncertainty into False Certainty

The Topic: The Halo Effect

The Episode: Download – iTunes – Stitcher – RSS – Soundcloud


This episode is brought to you by Stamps.com – where is the fun in living in the future if you still have to go to the post office? Click on the microphone and enter “smart” for a $110 special offer.

This episode is also brought to you by Lynda, an easy and affordable way to help individuals and organizations learn. Try Lynda free for 7 days.

This episode is also brought to you by Harry’s. Get $5 off the perfect holiday gift. Just go to Harrys.com and type in my coupon code SOSMART with your first purchase of quality shaving products.

It’s difficult to be certain of much in life.

Not only are you mostly uncertain of what will happen tomorrow, or next year, or in five years, but you often can’t be certain of the correct course of action, the best place for dinner, what kind of person you should be, or whether or not you should quit your job or move to a new city. At best, you are only truly certain of a handful of things at any given time, and aside from mathematical proofs – two apples plus two apples equals four apples (and even that, in some circles, can be debated) – you’ve become accustomed to living a life in a fog of maybes.

Most of what we now know about the world replaced something that we thought we knew about the world, but it turned out we had no idea what we were talking about. This is especially true in science, our best tool for getting to the truth. It’s a constantly churning sea of uncertainty. Maybe this, maybe that – but definitely not this, unless… Nothing raises a scientist’s brow more than a pocket of certainty because it’s usually a sign that someone is very wrong.

Being certain is a metacognition, a thought concerning another thought, and the way we often bungle that process is not exclusively human. When an octopus reaches out for a scallop, she does so because somewhere in the chaos of her nervous system a level of certainty crossed some sort of threshold, a threshold that the rock next to the scallop did not. Thanks to that certainty threshold, most of the time she bites into food instead of gravel. We too take the world into our brains through our senses, and in that brain we too are mostly successful at determining the difference between things that are food and things that are not food, but not always. There’s even a Japanese game show where people compete to determine whether household objects are real or are facsimiles made of chocolate. Seriously, check out the YouTube video of a man gleefully biting off a hunk of edible door handle. Right up until he smiles, he’s just rolling the dice, uncertain.

chocolate doorknob

Thanks to the sciences of the mind and brain we now know of several frames in which we might make judgments about the world. Of course, we already knew about this sort of thing in the days of our prescientific stupor. You don’t need a caliper and some Bayesian analysis to know that the same person might choose a different path when angry than she would when calm or that a person in love is likely to make decisions she may regret once released from that spell. You have a decently accurate intuition about those states of mind thanks to your exposure to many examples over the years, but behavioral sciences have dug much deeper. There are frames of mind your brain works to mask from the conscious portions of the self. One such frame of mind is uncertainty.

In psychology, uncertainty was made famous by the work of Daniel Kahneman and Amos Tversky. In their 1982 collection of research, “Judgment Under Uncertainty: Heuristics and Biases,” the psychologists explained that when you don’t have enough information to make a clear judgment, or when you are making a decision about something too complex to fully grasp, you tend to push forward with confidence instead of backing off and admitting your ignorance. The stasis of uncertainty never slows you down because human brains come equipped with anti-uncertainty mechanisms called heuristics.

In their original research they described how, while driving in a literal fog, it becomes difficult to judge the distance between your car and the other cars on the road. Landmarks, especially those deep in the mists, become more hazardous because they seem farther away than they actually are. This, they wrote, is because for your whole life you’ve noticed that things that are very far away appear a bit blurrier than things that are near. A lifetime of dealing with distance has reinforced a simple rule in your head: the closer an object, the greater its clarity. This blurriness heuristic is almost always true, except underwater, on a foggy morning, or on an especially clear day, when it becomes incorrect in the other direction, causing objects that are far away to seem much closer than normal.

Kahneman and Tversky originally identified three heuristics: representativeness, availability, and anchoring. Each one seems to help you estimate the likelihood of something being true, or the odds that one choice is better than another, without actually doing the work required to truly solve those problems. Here is an example of representativeness from their research. Imagine I tell you that a group of 30 engineers and 70 lawyers has applied for a job. I show you a single application that reveals a person who is great at math and bad with people, a person who loves Star Wars and hates public speaking, and then I ask whether it is more likely that this person is an engineer or a lawyer. What is your initial, gut reaction? What seems like the right answer? Statistically speaking, it is more likely the applicant is a lawyer. But if you are like most people in their research, you ignored the odds when checking your gut. You tossed the numbers out the window. So what if there is a 70 percent chance this person is a lawyer? That doesn’t feel like the right answer.
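The base-rate logic most people ignore here can be made explicit with Bayes’ rule. The sketch below is my own minimal illustration, not part of the original research, and the likelihood numbers in it are hypothetical, chosen only to show that even a description that strongly favors “engineer” can’t fully overcome a 70/30 prior:

```python
# Base rates: 30 engineers, 70 lawyers in the applicant pool.
# Bayes' rule: P(engineer | description) =
#   P(description | engineer) * P(engineer) /
#   [P(description | engineer) * P(engineer) + P(description | lawyer) * P(lawyer)]

def posterior_engineer(p_engineer, p_desc_given_eng, p_desc_given_law):
    """Probability the applicant is an engineer given the description."""
    p_lawyer = 1 - p_engineer
    numerator = p_desc_given_eng * p_engineer
    denominator = numerator + p_desc_given_law * p_lawyer
    return numerator / denominator

# Hypothetical likelihoods: suppose the "good at math, bad with people"
# sketch is four times as likely for an engineer as for a lawyer.
print(posterior_engineer(0.30, 0.8, 0.2))  # ≈ 0.63 – still far from certainty
```

Even a 4-to-1 likelihood ratio in favor of “engineer” only pushes the probability to about 63 percent, which is why a gut verdict of near-certainty amounts to throwing the base rate away.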

That’s what a heuristic is, a simple rule that in the currency of mental processes trades accuracy for speed. A heuristic can lead to a bias, and your biases, though often correct and harmless, can be dangerous when in error, resulting in a wide variety of bad outcomes from foggy morning car crashes to unconscious prejudices in job interviews.

For me, the most fascinating aspect of all of this is how it renders invisible the uncertainty that leads to the application of the heuristic. You don’t say to yourself, “Hmm, I’m not quite sure whether I am right or wrong, so let’s go with lawyer.” or, “Hmm, I don’t know how far away that car is, so let’s wait a second to hit the brake.” You just react, decide, judge, choose, etc. and move on, sometimes right, sometimes wrong, unaware – unconsciously crossing your fingers and hoping for the best.

These processes lead to a wonderful panoply of psychological phenomena. In this episode of the podcast we explore the halo effect, one of the ways this masking of uncertainty can really get you in trouble. When faced with a set of complex information, you tend to turn the volume down on the things that are difficult to quantify and evaluate and instead focus on the one or few things that are most tangible and concrete. You then use the way you feel about what is more salient to determine how you feel about the things that are less salient, even if the traits are unrelated.

Here’s an example. In a study headed by psychologist Barry Staw in 1974, 60 business school students were gathered into three-person groups. Each group received five years of financial reports, full of hard data, from a mid-sized company, along with a letter from the company’s president describing its prospects. The reports ran through 1969; the task for each group was to estimate the company’s sales and earnings per share for 1970. Since the researchers had the actual 1970 data on hand, it would be a good exercise to see how much the students had learned in business school. The scientists told the business students that they had already run this experiment once on groups of five people, and that they wanted to see how smaller groups would perform on the same task. Of course, most of this wasn’t true. No matter what the students turned in, the scientists tossed it all out. Instead, each group received a randomly assigned grade. Some were told they did extremely well, and others were told they did very, very poorly.

What Staw discovered was that when the students were told they performed in the top 20 percent of all subjects, the people in the groups attributed that success to things like great communication, overall cohesiveness, openness to change, competence, a lack of conflict, and so on. In groups told that they performed in the bottom 20 percent, the story was just the opposite. They said they performed poorly because of a lack of communication, differences in ability, close-mindedness, sparks of conflict, and a variety of other confounding variables. They believed they had gained knowledge about the hazy characteristics of the group, but in reality they were simply using a measure of performance as a guide for creating attributions from thin air.

In his book, “The Halo Effect,” Phil Rosenzweig described the Staw study like this, “…it’s hard to know in objective terms exactly what constitutes good communication or optimal cohesion…so people tend to make attributions based on other data that they believe are reliable.” That’s how the halo effect works – things like communication skills are weird, nebulous, abstract, and nuanced concepts that don’t translate well into quantifiable, concrete, and measurable aspects of reality. When you make a judgment under uncertainty your brain uses a heuristic and then covers up the evidence so that you never notice that you had no idea what you were doing. When asked to rate their communication skills, a not-so-salient trait, they looked for something more salient to go on. In this case it was the randomly assigned rating. That rating then became a halo whose light altered the way the students saw all the less-salient aspects of their experiences. The only problem was that the rating was a lie, and thus, so was each assessment.

Research into the halo effect suggests this sort of thing happens all the time. In one study a professor had a thick, Belgian accent. If that professor pretended to be mean and strict, American students said his accent was grating and horrendous. If he pretended to be nice and laid-back, similar students said his accent was beautiful and pleasant. In another study scientists wrote an essay and attached one of two photos to it, pretending that the photos were of the person who wrote the work. If the photo was of an attractive woman, people tended to rate the essay as being well-written and deep. If the photo was that of (according to the scientists) an unattractive woman, the essay received poorer scores and people tended to rate it as less insightful. In studies where teachers were told that a student had a learning disability, they rated that student’s performance as weaker than did other teachers who were told nothing at all about the student before the assessment began. In each example, people didn’t realize they were using a small, chewable bite of reality to make assumptions about a smorgasbord they couldn’t fully digest.

As an anti-uncertainty mechanism, the halo effect doesn’t just render invisible your lack of insight, but it encourages you to go a step further. It turns uncertainty into false certainty. And, sure, philosophically speaking, just about all certainty is false certainty, but research into the halo effect suggests that whether or not you accept this, as a concept, as a truth – you rarely notice it in the moment when it actually matters.

Fire up the latest episode of the You Are Not So Smart Podcast to learn more about the halo effect, and as an added bonus you’ll hear an additional two-and-a-half hours of excerpts from my book, You Are Now Less Dumb, which is now available in paperback.

Links and Sources

Download – iTunes – Stitcher – RSS – Soundcloud

Previous Episodes

Boing Boing Podcasts

Cookie Recipes

Business School Study

Judgment Under Uncertainty

You Are Now Less Dumb

The Halo Effect

The Halo Effect – Book

YANSS 037 – Drive, Motivation, and Crowd Control with Daniel Pink

The Topic: Motivation

The Guest: Daniel Pink

The Episode: Download – iTunes – Stitcher – RSS – Soundcloud

A scene from Office Space - 20th Century Fox


This episode is brought to you by Lynda, an easy and affordable way to help individuals and organizations learn. Try Lynda free for 7 days.

This episode is also brought to you by Squarespace. For a free trial and 10% off, enter offer code LESSDUMB at checkout.

Why do you work where you work? I mean, specifically, why do you do whatever it is that you do for a living?

I’m pretty sure that you can answer this question. The average person, according to the Bureau of Labor Statistics, spends between 11 and 15 years of his or her life at work. On the high end, that’s about a fifth of your time on Earth as a person capable of enjoying pumpkin pie and movies about robots. That’s a lot of time spent doing something for reasons unknown, so I doubt you would lift your shoulders and offer up open palms of confusion when it comes to this question. I’m just not so sure that the answer you come up with will be correct.

You probably know all about intrinsic versus extrinsic rewards and the other behavioral motivations like your basic drives for food, sex, and social acceptance as well as the pursuit of pleasure over pain and the quest for your other emotional needs. You know that intrinsic rewards satisfy these desires directly, while extrinsic rewards are usually tokens you can later trade for satisfaction. So, knowing all of this, it’s likely very easy for you to explain your motivations for attending all those meetings and answering all those emails before putting on all those shoes after shaving all those places before commuting all those miles. Still, I’m not sure I believe you.

Two of my favorite studies in psychology illustrate why I’m a bit skeptical about your justification for your actions – the story you tell yourself and others when wondering why you do what you do.

In one experiment, Leon Festinger and his colleagues brought college students into a room where those students, one at a time, sat across from a scientist in a lab coat who took notes while holding a stopwatch. The researcher asked these students to place wooden spools on a serving tray until it could hold no more, and then take them all back off again. After a while, the same people moved on to a second task in which they rotated a wooden peg round and round a quarter-turn at a time. Altogether, the students spent one incredibly boring hour doing mindless tasks, and after it was all over the scientist asked if the student in each run of the experiment would, before leaving, tell the next person waiting to do those same tasks that the experiment overall was fun and interesting. Every student did, and then, as a final task, each student was asked to write a brief essay explaining how he truly felt. The students didn’t know they had been divided into two groups. Some students had received the equivalent of about $8 before the experiment began, and the others received what would be about $150 in today’s money. Even though every other part of the experiment was identical, the difference in pay completely changed what the students wrote in those essays. The $150 group said the experiment was awful and tedious, something they would rather not do again. The $8 group said it was actually kind of neat, meditative and relaxing, and kind of fun when you think about it. Why the difference? Festinger said the two groups looked back on their actions and felt icky about lying. There was no congruence between what they had done and what they had said to the stranger, so to come into congruence they needed some sort of justification they could plug into their narratives. One group had $150 as justification, and so they were free to be honest with themselves. They did it for the money. They lied. The task was terrible. 
The other group didn’t have such an easy way out of those bad feelings, so they reframed the experience. I wouldn’t lie for a measly $8, each one thought. It was sort of pleasant really, so actually I didn’t lie after all. Two realities formed in two groups of people, and the only difference was how much compensation they received.

The other experiment was conducted by Mark Lepper, David Greene, and Richard Nisbett. They went to a preschool and observed children playing during free time. They noted which children tended to be most interested in art supplies – drawing, coloring, and painting – and then divided those children into three groups. Group A was told that over the next three days every time they chose an art activity during free time they would receive a heap of praise and a certificate of achievement. Group B wasn’t told this ahead of time, but when they chose to draw, paint, or color, they were surprised with the award and the praise just like Group A. Group C was allowed to just keep playing as usual, no rewards. After the observation the scientists waited three weeks and then returned to measure how often the children in each of the three groups were now choosing art activities on their own. Upon return, they found that Groups B and C were no different than before, but children in Group A were now significantly less likely to play with the art supplies than they were before the experiment began. The researchers explained that even though the activities were exactly the same for all three groups both before and after, only Group A had reframed the experience to now be about rewards. They saw themselves as painting for praise, drawing for the sake of a payment. It was work. Even with those incentives no longer in place, the story some preschoolers told themselves had been tainted while the story for the children in the other groups had not.

Psychologists call these two phenomena insufficient justification and overjustification, two extremes on the spectrum of internal storytelling. In one scenario a lack of an extrinsic reward, cash for lying, led to the invention of an intrinsic one that rewrote the entire experience. In the other scenario, a new way of looking at a beloved activity robbed children of an intrinsic reward, the joy of creation, and replaced it with an extrinsic one, pay for play. In both experiments, the brains of the people involved adopted new behaviors and perspectives without them knowing it.

That’s why I’m not sure you know why you do the work that you do. Rewards, both intrinsic and extrinsic, can scramble our narratives and justifications, and so the stories we tell ourselves can become weird fictions that keep us going, not that this is a bad thing. It’s just that we tend to believe we have access to the motivations behind our actions, and we tend to believe we know the source of our emotions and drives, but the truth is that we often do not have access to this information despite how easy it seems to come up with rational explanations as if we did.

This presents a problem for employers who want to build better workplaces and employees who want to enjoy their 11 to 15 years of life working in those workplaces. If people don’t know what drives them, and employers don’t know how to incentivize people to be more engaged, and overall we have a terrible grasp of how to be fulfilled and happy in our work, yet everyone kind of thinks they know what they are doing even though they don’t, then what should we be doing instead? Well, the good news is that this whole system of rewards, incentives, motivations, and related phenomena has been studied for long enough that psychology and neuroscience have some practical, actionable advice for workplaces and individuals when it comes to harnessing our motivations and drives.

Our guest in this episode of the You Are Not So Smart podcast is Daniel Pink, author of the book “Drive” and the host of the new National Geographic show “Crowd Control.” In Drive, Pink writes about how many businesses and institutions depend on folklore instead of science to encourage people to come to work and be creative. He explains that the greatest incentives, once people are paid a decent wage, are autonomy, mastery, and purpose – intrinsic rewards that workplaces can easily offer if they choose to change the way they incentivize employees. In Crowd Control, Pink explores how, by paying attention to what science tells us truly motivates people, we can change the way we do things from giving out speeding tickets to managing baggage claims at airports so that we alter people’s behavior for the benefit of everyone. In the interview Pink details what he’s learned from both projects when it comes to what truly motivates us.

After the interview, I discuss a news story about how workers may delay acting out in response to changes in the workplace.

In every episode, before I read a bit of self delusion news, I taste a cookie baked from a recipe sent in by a listener/reader. That listener/reader wins a signed copy of my new book, “You Are Now Less Dumb,” and I post the recipe on the YANSS Pinterest page. This episode’s winner is Marshall Schott who submitted a recipe for pumpkin pie snickerdoodles. Send your own recipes to david {at} youarenotsosmart.com.

Links and Sources

Download – iTunes – Stitcher – RSS – Soundcloud

Previous Episodes

Boing Boing Podcasts

Cookie Recipes

Daniel Pink

Crowd Control

Office stress? Workers may wait before acting out

Tony Robbins

Les Brown

Nick Vujicic

Susie Wolff

Zig Ziglar

Matt Foley


YANSS Podcast 036 – Why We Are Unaware that We Lack the Skill to Tell How Unskilled and Unaware We Are

The Topic: The Dunning-Kruger Effect

The Guest: David Dunning

The Episode: Download – iTunes – Stitcher – RSS – Soundcloud

A scene from NBC's "The Office"


This episode is brought to you by Stamps.com – where is the fun in living in the future if you still have to go to the post office? Click on the microphone and enter “smart” for a $110 special offer.

This episode is also brought to you by Lynda, an easy and affordable way to help individuals and organizations learn. Try Lynda free for 7 days.

Here’s a fun word to add to your vocabulary: nescience. I ran across it a few months back and kind of fell in love with it.

It’s related to the word prescience, which is a kind of knowing. Prescience is a state of mind, an awareness, that grants you knowledge of the future – about something that has yet to happen or is not yet in existence. It’s a strange idea isn’t it, that knowledge is a thing, a possession, that it stands alone and in proxy for something else out there in reality that has yet to actually…be? Then, the time comes, and the knowledge is no longer alone. Foreknowledge becomes knowledge and now corresponds to a real thing that is true. It is no longer pre-science but just science.

I first learned the word nescience from the book Ignorance and Surprise by Matthias Gross. That book revealed to me that, philosophically speaking, ignorance is a complicated matter. You can describe it in many ways. In that book Gross talks about the difficulties of translating a sociologist named Georg Simmel who often used the word “Nichtwissen” in his writing. Gross says that some translations changed that word to nescience and some just replaced it with “not knowing.” It’s a difficult term to translate, he explains, because it can mean a few different things. If you stick to the Latin ins and outs of the word, nescience means non-knowledge, or what we would probably just call ignorance. But Gross writes that in some circles it has a special meaning. He says it can mean something you can’t know in advance, or an unknown unknown, or something that no human being can ever hope to know, something a theologian might express as a thought in the mind of God. For some people, as Gross points out, everything is in the mind of God, so therefore nothing is actually knowable. To those people nescience is the natural state of all creatures and nothing can ever truly be known, not for sure. Like I said, ignorance is a complex concept.

It’s that last meaning of nescience that I think is most fun. Take away the religious aspect and nescience is prescience in negative. It is the state of not knowing, but stronger than that. It’s not knowing something that can’t be known. It’s not even knowing that you can’t know it. For instance, your cat can never read or understand the latest terms and conditions for iTunes, thus if she clicked on “I Agree,” we wouldn’t consider that binding. There are vast expanses of ignorance that your cat can’t even imagine, much less gain the knowledge about those things required to rid herself of that ignorance. That’s the definition of nescience I prefer.

I love this word, because once you accept this definition you start to wonder about a few things. Are there some things that, just like my cat, I can never know that I can never know? Are there things that maybe no one can ever know that no one can ever know? It’s a fun, frustrating, dorm-room-bong-hit-whoa-dude loop of weirdness that real philosophers and sociologists seriously ponder and continue to write about in books you can buy on Amazon.

I think I like this idea because I often look back at my former self and imagine what sort of advice I would offer that person. It seems like I’m always in a position to do that, no matter how old I am or how old the former me is in my imagination. I was always more ignorant than I am now, even though I didn’t feel all that ignorant then. That means that it’s probably also true that right now I’m sitting here in a state of total ignorance concerning things that my future self wishes he could shout back at me through time. Yet here I sit, unaware. Nescient.

The evidence gathered so far by psychologists and neuroscientists seems to suggest that each one of us has a relationship with our own ignorance, a dishonest, complicated relationship, and that dishonesty keeps us sane, happy, and willing to get out of bed in the morning. Part of that ignorance is a blind spot we each possess that obscures both our competence and incompetence.

Psychologists David Dunning and Joyce Ehrlinger once conducted an experiment investigating how bad people are at judging their own competence. Specifically, they were interested in people’s self-assessment of a single performance. They wrote in the study that they already knew from previous research that people seemed to be especially prone to making mistakes when they judged the accuracy of their own perceptions if those perceptions were of themselves and not others. To investigate why, they created a ruse.

In the study, Dunning and Ehrlinger describe how they gathered college students together who agreed to take a test. All the participants took the exact same test – same font, same order, same words, everything – but the scientists told one group that it was a test that measured abstract reasoning ability. They told another group it measured computer programming ability. Two groups of people took the same exam, but each batch of subjects believed it was measuring something unique to that group. When asked to evaluate their own performances, the people who believed they had taken a test that measured reasoning skills reported back that they felt they did really well. The other group, however, the ones who believed they had taken a test that measured computer programming prowess, weren’t so sure. They guessed that they had done much worse on the test than the other group – even though they took the same test. The real results showed both groups did about the same. The only difference was how they judged their own performances. The scientists said that it seemed as though the subjects weren’t truly judging how well they had done based on any ease or difficulty they may have experienced during the test itself, but were inferring how well they had performed based on the kind of people they believed themselves to be.

Dunning and Ehrlinger knew that most college students tend to hold very high opinions of themselves when it comes to abstract reasoning. It’s part of what they call a “chronic self view.” You have an idea of who you are in your mind, and it is kind of like a character in a story, the protagonist in the tale of your life. Some aspects of that character are chronic, traits that are always there that you feel are essential and evident, beliefs about your level of skill that are consistent across all situations. For most college students, being great at abstract reasoning is one of those traits, but being great at computer programming is not.

Dunning and Ehrlinger write that the way you view your past performances can greatly affect your future decisions, behaviors, judgments, and choices. They bring up the example of a first date. How you judge your contribution to the experience might motivate you to keep calling someone who doesn’t want to ever see you again, or it might cause you to miss out on something wonderful because you mistakenly think the other person hated every minute of the night. In every aspect of our lives, they write, we are evaluating how well we performed and using that analysis to decide when to continue and when to quit, when to try harder and work longer and when we can sit back and rest because everything is going just fine. Yet, the problem with this is that we are really, really bad at this kind of analysis. We are nescient. The reality of our own abilities, the level of our own skills, both when lacking and when excelling, is often something we don’t know that we don’t know.

Dunning and Ehrlinger put it like this, “In general, the perceptions people hold, of either their overall ability or specific performance, tend to be correlated only modestly with their actual performance.” We must manage our own ignorance when reflecting on any performance – a test, an athletic event, a speech, or even a conversation. Whether modest or confident, you often depend on the image you maintain of yourself as a guide for how well you did more than actual feedback. To make matters worse, you often don’t get any feedback, or you get a bad version of it.

In the case of singing, you might get all the way to an audition on X-Factor on national television before someone finally provides you with an accurate appraisal. Dunning says that the shock that some people feel when Simon Cowell cruelly explains to them that they suck is often the result of living for years in an environment filled with mediocrity enablers. Friends and family, peers and coworkers, they don’t want to be mean or impolite. They encourage you to keep going until you end up in front of millions reeling from your first experience with honest feedback.

When you are unskilled yet unaware, you often experience what is now known in psychology as the Dunning-Kruger effect, a psychological phenomenon that arises sometimes in your life because you are generally very bad at self-assessment. If you have ever been confronted with the fact that you were in over your head, or that you had no idea what you were doing, or that you thought you were more skilled at something than you actually were – then you may have experienced this effect. It is very easy to be both unskilled and unaware of it, and in this episode we explore why that is with professor David Dunning, one of the researchers who coined the term and a scientist who continues to add to our understanding of the phenomenon.

Read more about the Dunning-Kruger effect from David Dunning himself in this article recently published in the Pacific Standard.

After the interview, I discuss a news story about how people overestimate how awesome they look when bragging and underestimate how much people hate hearing you toot your own horn.

In every episode, before I read a bit of self-delusion news, I taste a cookie baked from a recipe sent in by a listener/reader. That listener/reader wins a signed copy of my new book, “You Are Now Less Dumb,” and I post the recipe on the YANSS Pinterest page. This episode’s winner is Janelle Robichaud, who submitted a recipe for sunshine cookies. Send your own recipes to david {at} youarenotsosmart.com.

Sunshine Cookies

Links and Sources

DownloadiTunesStitcherRSSSoundcloud

Previous Episodes

Boing Boing Podcasts

Cookie Recipes

David Dunning

We Are All Confident Idiots

Scientific Evidence That Self-Promoters Underestimate How Annoying They Are

20 Minutes of X-Factor Auditions

Ignorance and Surprise

YANSS Podcast 035 – Sunk Costs and the Pain of Vain

The Topic: The Sunk Cost Fallacy

The Episode: DownloadiTunesStitcherRSSSoundcloud

BD

This episode is brought to you by Lynda, an easy and affordable way to help individuals and organizations learn. Try Lynda free for 7 days.

Every once in a while you will ask yourself, “I wonder if I should quit?”

Should you quit your job? Should you end your relationship? Should you abandon your degree? Should you shut down this project?

These are difficult questions to answer. If you are like me, every time you’ve heard one of those questions emerge in your mind, it lingered. It began to echo right as you woke up and just as you pulled the covers over your shoulders. In the shower, waiting in line, in all your quiet moments – a question like that will appear behind your eyes, pulsating like a giant neon billboard until you can work out your decision.

Oddly enough, as a human being, that decision is often not made any easier when quitting is the most logical course of action. Even if it is obvious that it is no longer worth your time to keep going, your desire to plod on and your reluctance to quit are both muddled by an argumentative loop inside which you and many others easily get stuck.

The same psychological hooks that lead companies to spend millions of dollars producing products obviously destined to fail can also keep troops in harm’s way long past the point when the whole war effort should be brought to an end. It’s a universal human tendency, the same one that influences you to keep watching a bad movie instead of walking out of the theater in time to catch another, or that keeps you planted in your seat at a restaurant after you’ve been waiting thirty minutes for your drinks. If you reach the end of the quest, you think, then you haven’t truly lost anything. That motivation is sometimes so strong it prolongs horrific, bloody wars and enormously expensive projects well past the point when most people involved have felt a strong intuition that, no matter the outcome, total losses will exceed any potential gains.

In this episode of the You Are Not So Smart Podcast, we explore the sunk cost fallacy, a strangely twisted bit of logic that seems to pop into the human mind once a person has experienced the pain of loss or the ickiness of waste on his or her way toward a concrete goal. It’s illogical, irrational, unreasonable – and as a perfectly normal human being, you act under its influence all the time.

LINKS

DownloadiTunesStitcherRSSSoundcloud

Previous Episodes

Boing Boing Podcasts

Cookie Recipes

Liberal or conservative? Brain responses to disgusting images help reveal political leanings

The Genetic Fallacy

More on The Genetic Fallacy

SOURCES

  • Ariely, D. (2009). Predictably irrational, revised and expanded edition: The hidden forces that shape our decisions. Harper. (Amazon link)
  • Arkes, Hal R., and Peter Ayton. “The Sunk Cost and Concorde Effects: Are Humans Less Rational than Lower Animals?” Psychological Bulletin 125.5 (1999): 591-600. Print. (pdf)
  • Burthold, G. R. (2008). Psychology of decision making in legal, health care and science settings. Gardners Books. (Google Books link)
  • Busch, Jack. “Travel Zen: How to Avoid Making Your Vacation Seem Like Work.” Primer Magazine. Primer Magazine, Jan. 2009. Web. Mar. 2011. (link)
  • Gaming Can Make a Better World. By Jane McGonigal. TED Talks. TED Conferences, LLC, Feb. 2010. Web. Mar. 2011. (link)
  • Godin, Seth. “Ignore Sunk Costs.” Seth’s Blog. Typepad, Inc., 12 May 2009. Web. Mar. 2011. (link)
  • Höffler, Felix. “Why Humans Care About Sunk Costs While (Lower) Animals Don’t.” The Max Planck Institute for Research on Collective Goods, 31 Mar. 2008. Web. Mar. 2011. (pdf)
  • Indvik, Lauren. “‘FarmVille’ Interruption Cited in Baby’s Murder.” Mashable. Mashable Inc., 28 Oct. 2010. Web. Mar. 2011. (link)
  • Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux. (Amazon link)
  • Kushner, David. “Games: Why Zynga’s Success Makes Game Designers Gloomy.” Wired. Conde Nast Digital, 27 Sept. 2010. Web. Mar. 2011. (link)
  • Lehrer, Jonah. “Loss Aversion.” ScienceBlogs. ScienceBlogs LLC, 10 Feb. 2010. Web. Mar. 2011. (link)
  • Schwartz, Barry. “The Sunk-Cost Fallacy Bush Falls Victim to a Bad New Argument for the Iraq War.” Slate. The Slate Group, 09 Sept. 2005. Web. Mar. 2011. (link)
  • Shambora, Jessica. “‘FarmVille’ Gamemaker Zynga Sees Dollar Signs.” CNN Money. Cable News Network, 26 Oct. 2009. Web. Mar. 2011. (link)
  • Vidyarthi, Neil. “City Council Member Booted For Playing Farmville.” SocialTimes. Web Media Brands Inc., 30 Mar. 2010. Web. Mar. 2011. (link)
  • Walker, Tim. “Welcome to FarmVille: Population 80 Million.” Independent. Independent Digital News and Media, 22 Feb. 2010. Web. Mar. 2011.
  • “Why Zynga’s Success Makes Game Designers Gloomy | Discussion at Hacker News.” Hacker News. Y Combinator, 7 Oct. 2010. Web. Mar. 2011 (link)
  • Wittmershaus, Eric. “Facebook Game’s Cautionary Tale.” GameWit. Press Democrat Media Co., 04 Aug. 2010. Web. Mar. 2011. (link)
  • Yang, Sizhao Zao. “How Did FarmVille Take over FarmTown, When It Was Just a Exact Duplicate of FarmTown and FarmTown Was Released Much Earlier?” Quora. Quora, Inc., 01 Jan. 2011. Web. Mar. 2011. (link)

YANSS Podcast 034 – After This, Therefore Because of This: Your Weird Relationship with Cause and Effect

The Topic: The Post Hoc Fallacy

The Episode: DownloadiTunesStitcherRSSSoundcloud

Screen Shot 2014-10-14 at 3.02.37 PM

This episode brought to you by Squarespace. For a free trial and 10% off enter offer code LESSDUMB at checkout.

And by Lynda, an easy and affordable way to help individuals and organizations learn. Try Lynda free for 7 days.

When I was a boy, I spent my summers with my grandparents. They, like many Southerners, had a farm populated with animals to eat and animals to help. It was everywhere alive with edible plants – fields of corn and cucumbers and peas and butterbeans and peanuts, and throngs of mysterious life from stumps claimed by beds of ants to mushroom fairy rings, living things tending to business without our influence.

Remembering it now, I can see the symmetry of the rows, and the order of the barns, the arrangement of tools, the stockpiles of feed. I remember the care my grandmother took with tomatoes, nudging them along from the soil to the Ball jars she boiled, sealing up the red, seedy swirls under lids surrounded by brass-colored shrink bands. I remember my grandfather erecting dried and gutted gourds on poles so martins would come and create families above us, and we wouldn’t suffer as many mosquito bites when shelling peas under the giant pecan tree we all used for shade.

For me, the wonder of that life, even then, was in how so much was understood about cause and effect, about what was to come if you prepared, took care, made a particular kind of effort. It was as if they borrowed the momentum of the natural world instead of trying to force it one way or the other, like grabbing a passing trolley and hoisting yourself on the back.

Continue reading

YANSS Podcast 033 – The psychology of forming, keeping, and sometimes changing our beliefs

The Topic: Belief

The Guests: Will Storr, Margaret Maitland, and Jim Alcock

The Episode: DownloadiTunesStitcherRSSSoundcloud

Pizza Hut Pyramids

This episode brought to you by Squarespace. For a free trial and 10% off enter offer code LESSDUMB at checkout.

And by The Great Courses. Order Behavioral Economics and get 80% off the original price.

Put your right hand on your head. Unless you are near a mirror, you can no longer see your hand, but you know where it is, right? You know what position it is in. You know how far away it is from most of the other things around you. I’m using the word “know,” but that’s just for convenience, because you don’t actually know those things. That is, you can’t be 100 percent certain your hand is on your head. You assume it is, and that’s as good as it is going to get – a best guess. We’ll come back to that. You can put your hand down now.

Continue reading

YANSS Podcast 032 – Seeing willpower as powered by a battery that must be recharged

The Topic: Ego Depletion

The Episode: DownloadiTunesStitcherRSSSoundcloud

Stains the dog abstains from cupcakes on “It’s Me or The Dog” on Animal Planet

One of my favorite tropes in fiction is the idea of the perfect thinker – the person who has shed all the baggage of being an emotional human being and could enjoy the freedom and glory of pure logic, if only he or she could feel joy.

Spock, Data, Seven of Nine, Sherlock Holmes, Mordin Solus, Austin James, The T-1000 – there are so many variations of the idea. In each fictional world, these beings accomplish amazing feats thanks to possessing cold reason devoid of all those squishy feelings. Not being very good at telling jokes or hanging out at parties are among their only weaknesses.

It’s a nice fantasy, to imagine that without emotions one could become super-rational and thus achieve things other people could not. It suggests that we often see emotion as a weakness, that many people wish they could be more Spockish. But the work of neuroscientists like Antonio Damasio suggests that such a thing would be a nightmare. In his book “Descartes’ Error,” he describes patients who, because of an accident or a disorder, are no longer able to feel silly or annoyed or hateful or anything else. If they can, those feelings just graze them, never taking hold. Damasio explains that these patients, emotionally barren, are rendered powerless to choose a path in life. They can’t ascribe value to anything. Their world is flat. Despite remaining very intelligent and able to carry on conversations, they no longer make good decisions. Former business owners lose all their money on bad investments. People who used to work from home become lost in endlessly reorganizing their shelves.

Not only are their decisions flawed, but reaching conclusions becomes an excruciating process. When Damasio handed one of these patients two pens, one red and one blue, and asked him to fill out a questionnaire, the man was lost. Choosing red over blue using logic alone took about half an hour. Every pro and con was listed, every branching possibility of future outcomes considered. Damasio wrote that “when emotion is entirely left out of the reasoning picture, as happens in certain neurological conditions, reason turns out to be even more flawed than when emotion plays bad tricks on our decisions.” Judgments and decisions corrupted by bias and passion are the only way we ever get anything done.

Continue reading

YANSS Podcast 031 – Why do you sabotage yourself when trying to break bad habits?

The Topic: Extinction Bursts

The Episode: DownloadiTunesStitcherRSSSoundcloud

Illustration by Corie Howell – Source: http://bit.ly/1C6zTKU

Why do you so often fail at removing bad habits from your life?

You try to diet, to exercise, to stop smoking, to stop staying up until 2 a.m. stuck in a hamster wheel of internet diversions, and right when you seem to be doing well, right when it seems like your bad habit is dead, you lose control. All it takes is one transgression, one tiny cheating bite of pizza or puff of smoke, and then it’s all over. You binge, calm down, and the habit returns, reanimated and stronger than ever.

You ask yourself, how is it possible I can be so good at so many things, so clever in so many ways, and still fail at outsmarting my own vice-ridden brain? The answer has to do with conditioning, classical like Pavlov’s and operant like Skinner’s, and a psychological phenomenon that’s waiting in the future for every person who tries to twist shut the spigot of reward and pleasure – the extinction burst. In this episode we explore how it works, why it happens, and how you can overcome it.

Continue reading

YANSS Podcast 030 – How practice changes the brain and exceptions to the 10,000 hour rule with David Epstein

The Topic: Practice

The Guest: David Epstein

The Episode: DownloadiTunesStitcherRSSSoundcloud

Photo by Glenda S. Lynchard – Source: http://bit.ly/1rmH627

You live in the past.

You don’t know this because your brain lies to you and then covers up the lies, which is a good thing. If your brain didn’t fudge reality, you wouldn’t be able to hit a baseball, drive a car, or even carry on a conversation.

You may have already noticed this through its absence. Sounds that come from very far away don’t get edited. Maybe you’ve been high in the bleachers at a sporting event and seen the crack of a bat or the crunch of a tackle, but the sound seemed to arrive in your head just a tiny bit later than it should have. Sometimes there is a delay, like reality is out of sync. You can see this in videos too. If a camera records a big explosion or a gunshot from far away, the sound will arrive after the camera has already captured the images, so there is a gap between seeing the boom and hearing it.

The reason this occurs, of course, is because sound waves travel much more slowly than light waves. But if that’s true, why isn’t there always a lag between seeing and hearing? How come you can carry on a conversation with someone at the end of a long hallway even though the light that’s allowing you to see her mouth is arriving well before the sound of her voice?
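For a rough sense of scale, here is a back-of-the-envelope sketch in Python. The distances and the framing are my own illustration, not from the episode; the only inputs are the textbook speeds of sound and light.

```python
# How far behind the light does the sound arrive, at a few distances?
# Light is effectively instantaneous at human scales, so almost all of
# the lag comes from the sound's travel time.

SPEED_OF_SOUND = 343.0        # m/s in air at roughly room temperature
SPEED_OF_LIGHT = 299_792_458  # m/s

def audio_lag(distance_m: float) -> float:
    """Seconds by which the sound arrives after the light."""
    return distance_m / SPEED_OF_SOUND - distance_m / SPEED_OF_LIGHT

# A conversation across a room, a long hallway, stadium bleachers:
for d in (10, 30, 150):
    print(f"{d:>4} m -> {audio_lag(d) * 1000:6.1f} ms lag")
```

At hallway distances the lag is still only a few tens of milliseconds, which is why ordinary conversation feels in sync, while from the bleachers it grows to nearly half a second, which is why the crack of the bat visibly trails the swing.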

Continue reading

You Are Now Less Dumb now out in paperback!

Here are just a few of the hundreds of new ideas you’ll stuff in your head while reading You Are Now Less Dumb:

* You’ll finally understand why people wait in line to walk into unlocked rooms and how that reveals a universal behavior that slows progress and social change.

* You’ll discover the connection between salads, football, and consciousness.

* You’ll learn why people who die and come back tend to return with similar stories, and you’ll see how the explanation can help you avoid arguments on the internet.

* You’ll see why Bill Clinton, Gerard Butler, and Robert De Niro all believe in the same magical amulet because they are all equally ignorant in one very silly way that you can easily avoid.

* You’ll learn about a scientist’s bizarre experiment that tested what would happen if multiple messiahs lived together for several years and how you can use what he learned to debunk your own delusions.

* You’ll learn why the same person’s accent can be irritating in some situations and charming in others, and how you can use that knowledge to make better hiring choices and improve education.

LINKS TO BUY

Amazon – IB – B&N – BAM – Powell’s – iTunes – Audible – Google

EXCERPTS

TRAILERS

Screen Shot 2013-07-29 at 9.51.41 PM

THE STORY BEHIND THE GOOSE TREES

Before I explain where the idea came from, I’d like to endorse the people who did the hardest work. If you need a video, please contact Plus3. They made the trailers above, and they are great to work with. You can visit their website at http://www.plus3video.com.

Continue reading