How to Inoculate the Mind Against Infectious Ideas

In a sense, we’re currently living through two pandemics: one biological, and one psychological. From the rise of the most unqualified, anti-intellectual president ever elected to the proliferation of conspiracy theories, science denial, and political extremism, the effects of this “epidemic of unreason” may prove to be more far-reaching and destructive than COVID-19 itself.

At the root of the problem is not simply that there are false and harmful ideas in circulation—although there are certainly plenty—but rather that we completely lack the epistemological grounding to prevent their continued spread. It’s our susceptibility to becoming infected with false, harmful, or delusional ideas of any variety, in other words, that is the true nature of the problem.

In Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think, philosopher Andy Norman shows us that our collective defenses against the spread of bad ideas—our individual and cultural immune systems—are failing us, allowing a host of destructive ideas to take hold in our minds. And because our immune systems are compromised, we can’t stop these ideas from spreading—at least not without the proper “mind vaccine” to inoculate us against these mental parasites.

The idea of mental immune systems is an intriguing take on the problem, and one that I think is more than a mere metaphor. It seems that we really do have functioning mental immune systems, and when those systems are compromised, ideas can propagate themselves with the same alacrity as the deadliest pathogen. As Norman writes:

“Bad ideas have all the properties of parasites. Minds host them, the way bodies host bacteria. When bad ideas spread, they replicate—copies are created in other minds. An idea can even induce its host to infect other minds, just as the flu virus can induce an infection-spreading sneeze. Finally, bad ideas are harmful almost by definition.”

Thoughtful readers are unlikely to challenge Norman on these points. There is little question that the world is plagued by a host of bad ideas and that these ideas tend to spread by circumventing the mind’s normal apparatus for weeding them out. How else can one explain the widespread belief in a flat earth, or in the “Pizzagate” conspiracy theory, for example? 

At the root of the problem are two groups of ideas that Norman would refer to as “cognitive immune disruptors”: (1) ideas that lead to radical skepticism, and (2) ideas that lead to rigid dogmatism. These two psychological states correspond to a hyperactive mental immune system (auto-immunity) on the one hand and to an underactive or compromised mental immune system (immune deficiency) on the other. Either bad ideas are filtered out along with the good, or bad ideas are allowed in pretty much indiscriminately (which is why it is often the case that the same person will believe in not one but several outlandish ideas). 

The question is, why do the majority of people tend to gravitate toward one of these two extremes? The answer, it turns out, lies in the history of epistemology (the theory of knowledge). Norman notes that all thinkers have been forced to reconcile a deep paradox uncovered by Socrates millennia ago. Socrates observed that every belief is in need of justificatory reasons for its support, but that those very reasons, in turn, require their own justifications. If every reason in support of a belief requires its own justification, you’re faced with the problem of an infinite regress of reasons. And if, as Socrates showed, all reasons can be called into question—if there are, in fact, no objectively certain basic beliefs—then this introduces the possibility that we are not truly justified in adopting any beliefs whatsoever.

This line of thinking leads straight to Pyrrhonian skepticism, which teaches that—in the face of this infinite regress—the total suspension of judgment is the only way to achieve ataraxia, or tranquility of mind.

The problem with this approach, however, is obvious: since we have no choice but to live by and believe in something, radical skepticism leads to the idea that any belief is just as good as any other; in other words, it leads to relativism. Without the burden of having to justify our beliefs—on the grounds that justification is fruitless in the face of an infinite regress—there is no longer any possibility of dialogue or compromise between conflicting belief systems.

You see the echoes of this type of thinking in the relativism of today, or in the general reluctance to challenge a person’s basic beliefs. While it’s true that people do have the right to believe whatever they want, they also have the responsibility to ensure that the direct and indirect effects of those beliefs do not create undue harm for others. When they do, it is perfectly legitimate to challenge those beliefs on logical grounds. 

In response to this radical skepticism, the ancient world became disillusioned with the project of philosophy (a mindset that continues today), which was understandably viewed as an empty academic exercise that led nowhere. This, in part, launched the age of faith, in which thinkers adopted the opposite extreme: unwavering commitment to the certainty of specific basic beliefs that no longer required justification (if all reasons require justification, and justification faces an infinite regress, you have to stop the regress somewhere). The faithful terminate the regress with belief in God and the tenets of their particular religion.

The problem with this, however, is similar to that of radical skepticism and the relativism it ultimately leads to. Whether you are a relativist or a dogmatic fundamentalist, you are equally stripping reason of its regulatory role. Whether you assert that all beliefs are equally valid or that only your specific set of beliefs represents the final truth, you remove the possibility of rational dialogue, compromise, and intellectual growth. If reason loses its role as mediator between conflicting worldviews (as it does with religion), the only available tools of dispute resolution are violence and coercion, not persuasion.

Norman’s prescription for a mind vaccine, then, is the adoption of the middle path between skepticism/relativism and dogmatism/faith. It relies on reason as the only possible (although imperfect) mediator between conflicting beliefs, and on the idea that we should all challenge our own beliefs with the same tenacity as we challenge others, stopping short of total skepticism toward basic beliefs that warrant our collective assent. 

But the real insight of the book is this: don’t expect that the teaching of critical thinking skills alone will produce better thinkers. Most people engage in what psychologists call “motivated reasoning,” whereby they start with a desired conclusion and then proceed to find evidence to fit their pre-existing beliefs. Teach someone prone to motivated reasoning critical thinking skills, and you create a more skilled propagandist, not a fair-minded thinker. In many cases, they will use their newfound skills to be selectively skeptical—i.e., they’ll subject beliefs they don’t like to critical scrutiny while sparing their own favored beliefs from the same examination. 

So, for example, if you teach critical thinking skills to someone who believes in conspiracy theories, you run the risk of simply making them better at rationalizing ludicrous ideas.

The upshot is this: In addition to teaching critical thinking skills, you must also teach fundamental dispositions like curiosity, intellectual humility, openness to experience, and a growth mindset. The fact is, to become an effective and responsible thinker, it’s less important to develop the ability to persuade others than to be persuadable yourself, to change your beliefs in the face of new evidence or better reasons.

For what it’s worth, I think Norman would have been better off spending more time emphasizing these points. Instead, he ends with this rather anticlimactic presentation of the long-awaited “mind vaccine” he has promised readers throughout the book:

“A belief is reasonable if it can withstand the challenges to it that genuinely arise.”

Let’s unpack this a bit. The assertion that a challenge needs to “genuinely arise” is Norman’s attempt to prevent the problem of infinite regress. A skilled questioner, remember, can pose challenges to any belief and its associated reasons ad infinitum, as the ancients maintained. But, as it turns out, this is neither practical nor wise, since it leads one to abandon reason altogether.

Here’s an example: The belief that the sun will rise tomorrow is, by Norman’s criterion, a reasonable belief, because it fits all known past experience and all known laws of nature. A skeptical questioner, however, could point out that there is no reason to suppose that the future must resemble the past (the problem of induction), and that therefore there is no reason to expect that the sun will rise tomorrow. Although this is technically true, the challenge does not render the original belief unreasonable, because virtually everyone can agree that it is almost certainly true (even if it falls short of complete certainty).

While achieving certainty is impossible, knowledge does not require certainty, only that beliefs be reasonably supported. And when a proposition like “the sun will rise tomorrow” enjoys such widespread acceptance and has so much evidence in its favor, it’s no longer reasonable to challenge it (the challenge does not “genuinely arise”). If someone were to ask you, “Why do you think the sun will rise tomorrow?” the appropriate response would be, “Why do you doubt it?”

In this case, it’s perfectly legitimate to shift the burden of disproof onto the skeptic, as certain basic beliefs should go without challenge. Legitimate challenges to a more controversial proposition, of course, must be addressed, and this is what keeps the spirit of critical inquiry alive, preventing lapses into dogmatism. 

The reader may wonder, however, whether Norman’s “mind vaccine” is as profound a concept, or as useful against the spread of bad ideas, as he thinks it is. Consider that two people can agree that “a belief is reasonable if it can withstand the challenges to it that genuinely arise” but then vociferously disagree over what counts as a basic belief immune to challenge in the first place.

And this is precisely the source of the problem: what people take to be basic beliefs. Sure, everyone can agree to accept the conclusion that “the sun will rise tomorrow” in spite of the problem of induction and move on with their lives. But what about more complex and controversial issues where what counts as a basic belief is exactly what is in dispute, such as belief in God?

Since Norman offers no criteria for determining which beliefs should be held as basic and which should not, his mind vaccine will do little to resolve the issue. In fact, one could use Norman’s mind vaccine to proclaim as basic whatever beliefs one wants, thereby rendering them automatically reasonable.

In addition, I think it’s safe to say that many people already have sufficient critical reasoning skills; they simply choose to apply them selectively. Since Norman agrees with this, it’s surprising that his mind vaccine makes no mention of dispositions or intellectual virtues.

Here, then, is my own proposal for a more effective mind vaccine:

“To become a better and more responsible thinker, overcome your tendency to engage in motivated reasoning by challenging your own beliefs with the same tenacity as you challenge others, and be willing to—and in fact take pride in your ability to—update your beliefs in the face of new evidence and better reasons.”

Since critical thinking is not an exact science, and reason does not lead to certainty, we cannot expect to discover any set rules for establishing reasonable beliefs. The best we can hope for is a world where people honestly and consistently challenge their own beliefs and strive, to the best of their ability, to proportion their beliefs to the evidence in rational dialogue with others.


Overall, this is a timely and important book that contains many deep insights that I could never hope to cover in a short post. Norman takes the reader through the history of the theory of knowledge, examines the ethics of belief, outlines his proposal for a new field of science (cognitive immunology), and much more, making this one of the better nonfiction reads of the year. 

Further Reading

In addition to Mental Immunity, check out Think Again: The Power of Knowing What You Don’t Know by psychologist Adam Grant, which covers a lot of the same points from a different perspective. Also check out my review of Think Again here: The Power of Rethinking: How to Beat the Overconfidence Effect in Yourself and Others