
The Power of Rethinking: How to Beat the Overconfidence Effect in Yourself and Others

In 1933, the philosopher Bertrand Russell wrote that “the fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.” While this is just as true today as it was in the early twentieth century, the problem runs deeper: almost everyone recognizes arrogance and overconfidence in others—but never in themselves.

Since Russell’s time, what’s become known as the Dunning-Kruger Effect has been experimentally validated. Research shows—and personal experience confirms—that those who are least knowledgeable about a subject tend to be the ones who overestimate their own knowledge and abilities, while those who are full of doubt know enough about the topic to better gauge the extent of their ignorance.

And so the telltale sign of a lack of knowledge is, paradoxically, arrogance and overconfidence, whereas in those with actual expertise you often see the opposite: humility, doubt, and open-mindedness. 

Far more people fall on the side of overconfidence. This is due, at least in part, to widespread access to the internet, where people can quickly read articles and watch videos (of varying quality and credibility) on any conceivable topic, creating the impression of deep knowledge where only a superficial understanding has been gained.

Overcoming this unfortunate state of affairs is the subject of organizational psychologist Adam Grant’s latest book, Think Again, which seeks to show us how to rein in our own unjustified overconfidence by developing the habits of mind that force us to challenge our beliefs and, when necessary, to change them.

Grant begins by telling us that when we think and talk, we often slip into the mindset of three distinct professions: preachers, prosecutors, and politicians. We become preachers when the unwarranted strength of our convictions compels us to convert others to our way of thinking; prosecutors when our sole aim is to discredit the beliefs of others; and politicians when we seek to win favors from our chosen constituency. 

What all of these mindsets have in common is the assumption that our beliefs are infallible, and that no one could possibly have anything to teach us. Trapped in the prison cell of our own dogma, we don’t set out to learn anything or update our own beliefs; our job is simply to convert others to our way of thinking because, of course, we are right. 

These habits of mental imprisonment can befall anyone at any level of knowledge or experience, and intelligence itself has at times been shown to be a disadvantage, since those with high IQs often have the most difficulty updating their beliefs. As Dunning himself said, “The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club.” You may think all of your beliefs are correct (otherwise you wouldn’t hold them), but there is little doubt that at least some (probably many) of them are false or oversimplified. If your mind remains closed, you’ll never discover which of them require updating.

The key question, then, is this: If most of us are unaware of the extent of our own ignorance, how can we hope to overcome our own resistance to change? 

The first step, as Grant recommends, is to detach your sense of self from any particular beliefs. If you identify with a fixed set of core beliefs, you will be far less likely to change your mind in the face of new evidence or better reasoning.

Grant recommends instead grounding your sense of self in mental flexibility, taking pride in the fact that you’re willing to change your mind and update your beliefs. To achieve this, you must treat all of your beliefs as provisional hypotheses and then seek to disprove them, becoming more knowledgeable by being wrong more often. In adopting this approach, you arrive at the ideal mindset for personal development and learning—not the mindset of a preacher, prosecutor, or politician, but the mindset of a scientist.

The scientist, Grant tells us, has one overarching concern: the truth. The individual that adopts a scientific mindset will be equally motivated to challenge their own beliefs as the beliefs of others, testing hypotheses against the evidence and continually updating their beliefs in the process. 

Of course, as Grant points out, being an actual practicing scientist does not guarantee the adoption of this mindset. There are plenty of dogmatic scientists who don’t abide by the principles of their own training. The scientific mindset, as Grant describes it, is not necessarily the mindset scientists actually adopt, but rather an ideal that follows the principles of science: an open-ended pursuit of knowledge, constantly updated in the face of new evidence.

In one interesting study described by Grant (the book is filled with fascinating examples and studies of a similar sort), two groups of entrepreneurs were provided training. One group was taught the principles of scientific thinking while the control group was not. The researchers found that the scientific-thinking group “brought in revenue twice as fast—and attracted customers sooner, too.” As Grant wrote:

“The entrepreneurs in the control group tended to stay wedded to their original strategies and products. It was too easy to preach the virtues of their past decisions, prosecute the vices of alternative positions, and politick by catering to advisers who favored the existing direction. The entrepreneurs who had been taught to think like scientists, in contrast, pivoted more than twice as often.”

Individuals who enjoy the prospect of being wrong—and so expand their knowledge more often—tend to be more successful and to hold more accurate, nuanced beliefs. It’s not that they lack confidence; it’s that their confidence is of a different nature. Flexible-minded individuals have confidence in their ability to learn and to unlearn beliefs that are outdated or no longer serving them well. Their confidence lies in their ability to change and adapt rather than in the strength of their convictions concerning any single set of beliefs. As Nobel Prize-winning psychologist Daniel Kahneman put it, “Being wrong is the only way I feel sure I’ve learned anything.”

There is definitely a line to walk here, and the reader may wonder just how far to take this advice. To constantly question every one of your beliefs would result in paralyzing doubt, and sometimes it is the strength of our convictions that gives us the energy and perseverance to pursue and accomplish our goals. This is surely a balancing act, and while we all have to find the sweet spot between timidity and arrogance, conviction and doubt, there is little question that too many of us tend toward the extreme of overconfidence.

After showing us how to become better rethinkers ourselves, in the second part of the book we learn how to open other people’s minds. Grant shows us how world-class debaters win debates, how a black musician talked white supremacists out of their bigoted views, and how doctors persuaded anti-vaxxers to get their children immunized. 

In every case, we learn the same lesson in the art of persuasion: to change someone else’s mind, you have to help them find their own internal motivation to change. 

This is not easy. The mindsets we typically slip into tend to have the opposite effect. Act as a preacher, and people will resist being told what to think (even if the facts are on your side). Act as a prosecutor, and people will resent your condescension and will become further entrenched in their original views. Act as a politician, and you’re just saying what you think people want to hear. 

None of these approaches is effective as a tool of persuasion. It turns out that your best bet is to adopt, once again, the mindset of a scientist—and to try to get others to do the same. This transforms disagreements from battles to be won or lost into a collaborative pursuit of the truth.

The most skilled negotiators, debaters, and persuaders all use similar tactics: they first find common ground and points of agreement, ask questions that get the other person thinking more deeply, present a limited number of strong points, and introduce complexity into the topic to move the person’s thinking away from black and white and into shades of gray.

It turns out that complexifying the issue is key. Most people exhibit what psychologists call binary bias, or the “basic human tendency to seek clarity and closure by simplifying a complex continuum into two categories.” If you can show people, through skillful questioning, that the topic they think they understand deeply (the Dunning-Kruger Effect at work) is actually far more complex than they originally thought—with more than two distinct positions—then you can plant the seeds of doubt that eventually lead to real change.

One example Grant uses is climate change. We tend to think that people fall into one of two categories—climate deniers or alarmists—when in fact there are six distinct positions people can take, ranging from dismissive, doubtful, and disengaged to cautious, concerned, and alarmed, with shades of nuance in between. It’s often the recognition of this complexity that gets people talking and engaged in productive debate.

In the final part of the book, Grant shows us how to use the skills of rethinking to engage in more productive political debates, to become better teachers, and to create more innovative cultures at work. Grant provides a host of compelling examples, but my favorite is the middle-school history teacher who gets her students to think like scientists by having them rewrite textbook chapters that failed to cover important historical events in sufficient depth. Her students pick a time period and topic that interests them and then, through independent research, rewrite the textbook chapter, in the process cultivating the habit of questioning what they read. This is a far better approach than simply delivering a lecture and forcing students to regurgitate the information on a test.


Bertrand Russell was once asked in an interview if he was willing to die for any of his beliefs. His response was this: “Of course not. After all, I may be wrong.” 

It’s a shame that most people adopt the opposite attitude, and Grant’s latest book goes a long way toward remedying this. Think Again is a timely exploration of the importance of humility and of the capacity to rethink your own positions while helping others do the same.

But in the spirit of the book—and to “complexify” the topic—it’s worth considering when displays of doubt and humility might actually backfire. Grant wonders this himself, pointing out, for example, that such displays have been shown to have negative effects in the workplace for those who have not already established their competence. They can also be less effective when you’re presenting to an already sympathetic audience. Does Grant downplay how common these situations are?

Another area where excessive doubt and humility might backfire is one Grant barely considers: arguing with bad faith actors. When discussing politics, Grant seems to assume that in most cases both sides are equally motivated by the truth and that each side has simply failed to understand the complexity of the topic or the merits of the other side.

But we know that this is not always the case. In politics, people argue from a host of motives that sometimes have very little to do with the truth: the desire for power, money, or influence, and sometimes simply the desire to offend and get a rise out of people. Grant does not cover how to handle these situations—or how to identify them—and it is highly unlikely that the book’s tactics will work in such cases.

Additionally, the masses often seem to respond better to confidence than to humility when electing political representatives: Trump, after all, was not elected on the basis of his knowledge or competence—and certainly not his humility.

When dealing with bad faith actors, perhaps a good strategy would be to start with a simple question, one Grant mentions in the book: “What evidence would change your mind?” If the answer is “nothing,” then it’s probably best to walk away. Either way, a chapter or section on bad faith actors and the questions you can ask to identify them would have been a welcome addition to the book. 

But of course, this book is not the final word on the topic, and Grant wouldn’t want it to be. As we gain better evidence and more experience, it’s our responsibility to continually rethink and update our beliefs. As Russell said, “If you’re certain of anything, you’re certainly wrong, because nothing deserves absolute certainty.” 


Think Again: The Power of Knowing What You Don’t Know is available on Amazon.com.