The cyberpunks, the Founding Fathers, the 19th-century philosophers, and the Enlightenment thinkers each envisioned a perfect democracy powered by a constant multimedia psychedelic freakout in which all information was free, decentralized, democratized, and easy to access.

In each era, the dream was the same: a public life for the average citizen no longer limited by any kind of information deficit; a life augmented by instant and full access to all the information anyone could ever want. On top of that, they imagined the end of gatekeepers, the public fully able to choose what went into their minds.

In his book on the history of human progress, Our Kind, anthropologist Marvin Harris asked in the final chapter, “Will nature’s experiment with mind and culture end in nuclear war?”

The book came out in 1989, in the final years of Cold War nuclear paranoia, and his telling of how people developed from hunter-gatherers all the way to McDonald’s franchise owners, he said, couldn’t honestly end with him gazing optimistically toward the horizon, because never had the fate of so many been under the control of so few.

“What alarms me most,” he wrote, “is the acquiescence of ordinary citizens and their elected officials to the idea that our kind has to learn to deal with the threat of mutual annihilation because it is the best way of reducing the danger that one nuclear power will attack another.”

In the final paragraph, Harris wrote that “we must recognize the degree to which we are not yet in control” of our own society. Progress was mostly chance and luck, with human agency steering us away from the rocks when it could, but unless we gained some measure of control over where we were going as a species, he said, we’d be rolled over by our worst tendencies, magnified within institutions too complex for any one person to predict or direct.

If you ask a social scientist familiar with motivated reasoning and the backfire effect whether there is any hope of ever reaching people who refuse to accept facts, whether there is any chance of changing people’s minds with evidence, reason, or scientific consensus, they will usually point you to a 2010 paper titled “The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’?”

Like most of us, political scientists David P. Redlawsk, Andrew J.W. Civettini, and Karen M. Emmerson wondered whether, when confronted with challenges to their erroneous beliefs, the people who resist efforts at correction ever come around, or whether we just cause more harm than good by trusting in facts instead of reaching for some time-tested technique from the emotional-manipulation toolkit.

To test this, Redlawsk and his team created a mock presidential election in which people would gradually learn more and more terrible things about their preferred virtual candidates from a virtual news media. Unbeknownst to the subjects, the news stories they read included a precisely calibrated mix of negative information about their chosen candidates, so that the effect of those messages could be measured as the negativity increased in intensity.

The scientists thought that surely, at some point, after a person had chosen one candidate over another, a constant flow of negative information about that candidate would persuade them to reconsider their choice. They expected to see the backfire effect at first, of course, but they believed that with enough persistence they might also discover its natural limit.

By now you’ve likely heard of confirmation bias. As a citizen of the internet, you feel the influence of this cognitive tendency constantly, and its allure is pervasive.

In short, when you have a hunch that you might already understand something, but don’t know for sure, you tend to go searching for information that will confirm your suspicions.

When you find that inevitable confirmation, satisfied you were correct all along, you stop searching. In some circles, the mental signal to end exploration once you feel your position has sufficient external support goes by the wonderfully wordy name of the “makes-sense stopping rule”: once you believe you’ve made sense of something, you go about your business satisfied that you need not continue your efforts. Just feeling correct is enough to end your pursuit of new knowledge. We basically had to invent science to stop ourselves from trying to solve problems by thinking in this way.

We don’t treat all of our beliefs the same.

If you learn that the Great Wall of China isn’t the only man-made object visible from space, and that in fact the Wall is quite difficult to see compared with other landmarks, you update your model of reality without much fuss. Some misconceptions we give up readily, replacing them with better information when alerted to our ignorance.

For other constructs, though, for your most cherished beliefs about things like climate change or vaccines or Republicans, instead of changing your mind in the face of challenging evidence or compelling counterarguments, you resist. Not only do you fight belief change for some things and not others, but if you successfully deflect such attacks, your challenged beliefs grow stronger.